ColossalAI/colossalai

Latest commit: 4b40fbd743 by Boyuan Yao — [autoparallel] fix forward memory calculation (#2062), 2 years ago
| Name | Last commit | Last update |
| --- | --- | --- |
| _C | [kernel] move all symlinks of kernel to `colossalai._C` (#1971) | 2 years ago |
| amp | [kernel] move all symlinks of kernel to `colossalai._C` (#1971) | 2 years ago |
| auto_parallel | [autoparallel] fix forward memory calculation (#2062) | 2 years ago |
| builder | | |
| cli | [cli] updated installation cheheck with more inforamtion (#2050) | 2 years ago |
| communication | | |
| context | | |
| device | [autoparallel] mix gather (#1977) | 2 years ago |
| engine | | |
| fx | [Pipeline] Add Topo Class (#2059) | 2 years ago |
| gemini | [Gemini] fix grad unreleased issue and param recovery issue (#2052) | 2 years ago |
| kernel | [kernel] move all symlinks of kernel to `colossalai._C` (#1971) | 2 years ago |
| logging | | |
| nn | [gemini] add arguments (#2046) | 2 years ago |
| pipeline | [Pipeline] Add Topo Class (#2059) | 2 years ago |
| registry | | |
| tensor | [autoparallel] add experimental permute handler (#2029) | 2 years ago |
| testing | [zero] test gradient accumulation (#1964) | 2 years ago |
| trainer | | |
| utils | [gemini] fix init bugs for modules (#2047) | 2 years ago |
| zero | [zero] test gradient accumulation (#1964) | 2 years ago |
| __init__.py | [setup] supported conda-installed torch (#2048) | 2 years ago |
| constants.py | | |
| core.py | | |
| global_variables.py | | |
| initialize.py | | |