ColossalAI/colossalai
Latest commit 70a8556946 by Jiarui Fang: [gemini] get the param visited order during runtime (#2108), 2022-12-09 16:13:03 +08:00
Name                Last commit                                                                           Date
_C                  [kernel] move all symlinks of kernel to `colossalai._C` (#1971)                       2022-11-17 13:42:33 +08:00
amp                 [kernel] move all symlinks of kernel to `colossalai._C` (#1971)                       2022-11-17 13:42:33 +08:00
auto_parallel       [autoparallel] add sum handler (#2101)                                                2022-12-08 17:02:54 +08:00
builder
cli                 [cli] updated installation cheheck with more inforamtion (#2050)                      2022-11-30 17:53:55 +08:00
communication
context             updated tp layers                                                                     2022-11-02 12:19:38 +08:00
device              [device] update flatten device mesh usage (#2079)                                     2022-12-05 16:16:07 +08:00
engine
fx                  [autoparallel] support linear function bias addition (#2104)                          2022-12-09 10:31:36 +08:00
gemini              [gemini] get the param visited order during runtime (#2108)                           2022-12-09 16:13:03 +08:00
kernel              [kernel] move all symlinks of kernel to `colossalai._C` (#1971)                       2022-11-17 13:42:33 +08:00
logging             fixed logger                                                                          2022-11-15 16:00:07 +08:00
nn                  [Gemini] remove static tracer (#2083)                                                 2022-12-06 12:53:58 +08:00
pipeline            [Pipeline Middleware] fix data race in Pipeline Scheduler for DAG (#2087)             2022-12-08 13:32:27 +08:00
registry
tensor              [Gemini] ParamOpHook -> ColoParamOpHook (#2080)                                       2022-12-05 17:11:06 +08:00
testing             [zero] test gradient accumulation (#1964)                                             2022-11-29 13:00:30 +08:00
trainer             [polish] remove useless file _mem_tracer_hook.py (#1963)                              2022-11-16 15:55:10 +08:00
utils               [hotfix] fix a type in ColoInitContext (#2106)                                        2022-12-09 11:44:39 +08:00
zero                [Gemini] use MemStats to store the tracing data. Seperate it from Collector. (#2084)  2022-12-06 16:43:06 +08:00
__init__.py         [setup] supported conda-installed torch (#2048)                                       2022-11-30 16:45:15 +08:00
constants.py        updated tp layers                                                                     2022-11-02 12:19:38 +08:00
core.py
global_variables.py updated tp layers                                                                     2022-11-02 12:19:38 +08:00
initialize.py