ColossalAI/colossalai
Latest commit: Jiarui Fang 223332ff7e — [Gemini] rename ParamTracerWrapper -> RuntimeMemTracer (#2073), 2022-12-05 12:45:11 +08:00
_C [kernel] move all symlinks of kernel to `colossalai._C` (#1971) 2022-11-17 13:42:33 +08:00
amp [kernel] move all symlinks of kernel to `colossalai._C` (#1971) 2022-11-17 13:42:33 +08:00
auto_parallel [autoparallel] add binary elementwise metainfo for auto parallel (#2058) 2022-12-04 15:18:51 +08:00
builder
cli [cli] updated installation check with more information (#2050) 2022-11-30 17:53:55 +08:00
communication
context
device [autoparallel] mix gather (#1977) 2022-11-23 21:49:17 +08:00
engine
fx [Pipeline] Add Topo Class (#2059) 2022-12-02 18:13:20 +08:00
gemini [Gemini] rename ParamTracerWrapper -> RuntimeMemTracer (#2073) 2022-12-05 12:45:11 +08:00
kernel [kernel] move all symlinks of kernel to `colossalai._C` (#1971) 2022-11-17 13:42:33 +08:00
logging
nn [gemini] add arguments (#2046) 2022-11-30 16:40:13 +08:00
pipeline [Pipeline] Add Topo Class (#2059) 2022-12-02 18:13:20 +08:00
registry
tensor [autoparallel] add experimental permute handler (#2029) 2022-11-27 20:26:52 +08:00
testing [zero] test gradient accumulation (#1964) 2022-11-29 13:00:30 +08:00
trainer
utils [gemini] fix init bugs for modules (#2047) 2022-11-30 17:06:10 +08:00
zero [zero] test gradient accumulation (#1964) 2022-11-29 13:00:30 +08:00
__init__.py [setup] supported conda-installed torch (#2048) 2022-11-30 16:45:15 +08:00
constants.py
core.py
global_variables.py
initialize.py