ColossalAI/tests
Latest commit: 40c916b192 by Boyuan Yao, 2 years ago:
[autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)

Name                                 Last updated  Last commit
components_to_test                   2 years ago   [testing] add beit model for unit testings (#2196)
test_amp                             2 years ago   [workflow] only report coverage for changed files (#2524)
test_auto_parallel                   2 years ago   [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)
test_autochunk                       2 years ago   [autochunk] support diffusion for autochunk (#2621)
test_comm
test_config
test_context
test_data
test_data_pipeline_tensor_parallel
test_ddp                             2 years ago   [gemini] update ddp strict mode (#2518)
test_device                          2 years ago   [device] find best logical mesh
test_engine
test_fx                              2 years ago   [Pipeline] Add Topo Class (#2059)
test_gemini                          2 years ago   [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671)
test_layers                          2 years ago   improved allgather & reducescatter for 3d
test_moe
test_ops
test_optimizer                       2 years ago   [setup] support pre-build and jit-build of cuda kernels (#2374)
test_pipeline                        2 years ago   [PP Middleware] Add bwd and step for PP middleware (#2111)
test_tensor                          2 years ago   [gemini] update ddp strict mode (#2518)
test_trainer
test_utils                           2 years ago   updated attention kernel (#2133)
test_zero                            2 years ago   [zero] add zero wrappers (#2523)
__init__.py