ColossalAI/colossalai
Latest commit b3b89865e2 by Jiarui Fang: [Gemini] ParamOpHook -> ColoParamOpHook (#2080), 2 years ago
Name                 Last updated  Last commit
_C                   2 years ago   [kernel] move all symlinks of kernel to `colossalai._C` (#1971)
amp                  2 years ago   [kernel] move all symlinks of kernel to `colossalai._C` (#1971)
auto_parallel        2 years ago   [autoparallel] add binary elementwise metainfo for auto parallel (#2058)
builder              2 years ago   [NFC] polish colossalai/builder/__init__.py code style (#1560)
cli                  2 years ago   [cli] updated installation cheheck with more inforamtion (#2050)
communication        2 years ago   [communication] add p2p_v2.py to support communication with List[Any] (#1407)
context              2 years ago   updated tp layers
device               2 years ago   [device] update flatten device mesh usage (#2079)
engine               2 years ago   [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408)
fx                   2 years ago   [Pipeline] Add Topo Class (#2059)
gemini               2 years ago   [Gemini] ParamOpHook -> ColoParamOpHook (#2080)
kernel               2 years ago   [kernel] move all symlinks of kernel to `colossalai._C` (#1971)
logging              2 years ago   fixed logger
nn                   2 years ago   [Gemini] ParamOpHook -> ColoParamOpHook (#2080)
pipeline             2 years ago   [Pipeline] Add Topo Class (#2059)
registry
tensor               2 years ago   [Gemini] ParamOpHook -> ColoParamOpHook (#2080)
testing              2 years ago   [zero] test gradient accumulation (#1964)
trainer              2 years ago   [polish] remove useless file _mem_tracer_hook.py (#1963)
utils                2 years ago   [gemini] fix init bugs for modules (#2047)
zero                 2 years ago   [Gemini] ParamOpHook -> ColoParamOpHook (#2080)
__init__.py          2 years ago   [setup] supported conda-installed torch (#2048)
constants.py         2 years ago   updated tp layers
core.py              2 years ago   [Tensor] distributed view supports inter-process hybrid parallel (#1169)
global_variables.py  2 years ago   updated tp layers
initialize.py        2 years ago   [hotfix] remove potiential circle import (#1307)
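
For orientation, this is the top-level layout of the colossalai Python package at commit b3b89865e2: subpackages such as engine, gemini, and zero implement the parallel runtimes, while initialize.py exposes the launch and setup entry points. Below is a minimal usage sketch, assuming the 2022-era API (colossalai.launch_from_torch, colossalai.initialize, and the Engine object from the engine directory); exact signatures and return values may differ between releases.

```python
# Minimal sketch, assuming the 2022-era ColossalAI API exposed by
# initialize.py; signatures and return values may differ across releases.
import torch
import torch.nn as nn

import colossalai

# launch_from_torch reads rank/world-size settings from the environment
# variables that torchrun sets up, so this must run under a launcher.
colossalai.launch_from_torch(config={})

model = nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# initialize wraps model/optimizer/criterion into an Engine (see the
# `engine` directory above); it also returns dataloaders and a scheduler,
# which this sketch ignores.
engine, *_ = colossalai.initialize(model, optimizer, criterion)

# Training then goes through the engine rather than raw PyTorch calls.
engine.train()
x = torch.randn(8, 16)
y = torch.randint(0, 4, (8,))

engine.zero_grad()
output = engine(x)
loss = engine.criterion(output, y)
engine.backward(loss)
engine.step()
```

A script like this would be started with a distributed launcher so the expected environment variables exist, e.g. `torchrun --nproc_per_node=1 train.py`.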