ColossalAI/colossalai
Latest commit 5774fe0270 by Boyuan Yao: [fx] Use colossalai checkpoint and add offload recognition in codegen (#1439), 2 years ago
amp                   [doc] update rst and docstring (#1351)                                          2 years ago
builder
cli
communication         [communication] add p2p_v2.py to support communication with List[Any] (#1407)  2 years ago
context               [doc] update rst and docstring (#1351)                                          2 years ago
device                [device] add DeviceMesh class to support logical device layout (#1394)          2 years ago
engine                [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408)             2 years ago
fx                    [fx] Use colossalai checkpoint and add offload recognition in codegen (#1439)  2 years ago
gemini                [zero] add chunk_managerV2 for all-gather chunk (#1441)                         2 years ago
kernel                [hotfix] fix CPUAdam kernel nullptr (#1410)                                     2 years ago
logging
nn                    [tensor] added linear implementation for the new sharding spec (#1416)          2 years ago
pipeline
registry
tensor                [tensor] added linear implementation for the new sharding spec (#1416)          2 years ago
testing
trainer
utils                 [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)            2 years ago
zero                  [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)            2 years ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py