ColossalAI/colossalai
Latest commit: 0dbd61c29b by Super Daniel, "[fx] fix test and algorithm bugs in activation checkpointing. (#1451)", 2 years ago
Name                 Last commit                                                                          Age
amp                  [doc] update rst and docstring (#1351)                                               2 years ago
builder
cli
communication        [communication] add p2p_v2.py to support communication with List[Any] (#1407)       2 years ago
context              [doc] update rst and docstring (#1351)                                               2 years ago
device               [device] add DeviceMesh class to support logical device layout (#1394)               2 years ago
engine               [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408)                  2 years ago
fx                   [fx] fix test and algorithm bugs in activation checkpointing. (#1451)                2 years ago
gemini               [zero] add chunk_managerV2 for all-gather chunk (#1441)                              2 years ago
kernel               [hotfix] fix CPUAdam kernel nullptr (#1410)                                          2 years ago
logging
nn                   fix nvme docstring (#1450)                                                           2 years ago
pipeline
registry
tensor               [tensor] shape consistency generate transform path and communication cost (#1435)   2 years ago
testing
trainer
utils                [test] fixed the activation codegen test (#1447)                                     2 years ago
zero                 [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)                 2 years ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py