ColossalAI/tests

Latest commit: Boyuan Yao 40c916b192 — [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674), 2 years ago
components_to_test
test_amp                                [workflow] only report coverage for changed files (#2524)                                                              2 years ago
test_auto_parallel                      [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)                 2 years ago
test_autochunk                          [autochunk] support diffusion for autochunk (#2621)                                                                   2 years ago
test_comm
test_config
test_context
test_data
test_data_pipeline_tensor_parallel
test_ddp                                [gemini] update ddp strict mode (#2518)                                                                               2 years ago
test_device
test_engine
test_fx
test_gemini                             [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671)                                 2 years ago
test_layers
test_moe
test_ops
test_optimizer
test_pipeline
test_tensor                             [gemini] update ddp strict mode (#2518)                                                                               2 years ago
test_trainer
test_utils
test_zero                               [zero] add zero wrappers (#2523)                                                                                      2 years ago
__init__.py