ColossalAI/colossalai

Latest commit: d8a5aded19 by Super Daniel, "[hotfix] change namespace for meta_trace. (#1541)", 2 years ago
amp/                      [doc] update rst and docstring (#1351)                                                                                             2 years ago
auto_parallel/            [autoparellel]add strategies constructor (#1505)                                                                                   2 years ago
builder/
cli/
communication/            [communication] add p2p_v2.py to support communication with List[Any] (#1407)                                                      2 years ago
context/                  [doc] update rst and docstring (#1351)                                                                                             2 years ago
device/                   [tensor]add 1D device mesh (#1492)                                                                                                 2 years ago
engine/                   [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408)                                                                2 years ago
fx/                       [hotfix] change namespace for meta_trace. (#1541)                                                                                  2 years ago
gemini/                   [zero] add chunk_managerV2 for all-gather chunk (#1441)                                                                            2 years ago
kernel/                   [hotfix] fix CPUAdam kernel nullptr (#1410)                                                                                        2 years ago
logging/
nn/                       [embedding] polish parallel embedding tablewise (#1545)                                                                            2 years ago
pipeline/                 [pipeline/pipleline_process_group] finish PipelineProcessGroup to manage local abd global rank in TP,DP and PP (#1508)             2 years ago
registry/
tensor/                   [tensor]add 1D device mesh (#1492)                                                                                                 2 years ago
testing/
trainer/
utils/                    [hotfix] fix init context (#1543)                                                                                                  2 years ago
zero/                     [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)                                                               2 years ago
__init__.py               [fx] support meta tracing for aten level computation graphs like functorch. (#1536)                                                2 years ago
_meta_registrations.py    [fx] support meta tracing for aten level computation graphs like functorch. (#1536)                                                2 years ago
constants.py
core.py
global_variables.py
initialize.py