ColossalAI/colossalai
Latest commit: [Gemini] mapping of preop timestep and param (#2124) by Jiarui Fang (05bb28aacf), 2022-12-13 12:50:24 +08:00
| Name | Last commit | Date |
| --- | --- | --- |
| _C | [optimizer] add div_scale for optimizers (#2117) | 2022-12-12 17:58:57 +08:00 |
| amp | | |
| auto_parallel | [autoparallel] gpt2lp runtime test (#2113) | 2022-12-12 18:06:40 +08:00 |
| builder | | |
| cli | [cli] updated installation check with more information (#2050) | 2022-11-30 17:53:55 +08:00 |
| communication | | |
| context | | |
| device | [device] update flatten device mesh usage (#2079) | 2022-12-05 16:16:07 +08:00 |
| engine | | |
| fx | [autoparallel] support linear function bias addition (#2104) | 2022-12-09 10:31:36 +08:00 |
| gemini | [Gemini] mapping of preop timestep and param (#2124) | 2022-12-13 12:50:24 +08:00 |
| kernel | [optimizer] add div_scale for optimizers (#2117) | 2022-12-12 17:58:57 +08:00 |
| logging | | |
| nn | [Gemini] chunk init using runtime visited param order (#2115) | 2022-12-12 18:06:16 +08:00 |
| pipeline | [PP Middleware] Add bwd and step for PP middleware (#2111) | 2022-12-12 12:40:03 +08:00 |
| registry | | |
| tensor | [NFC] polish comments for Chunk class (#2116) | 2022-12-12 15:39:31 +08:00 |
| testing | [zero] test gradient accumulation (#1964) | 2022-11-29 13:00:30 +08:00 |
| trainer | | |
| utils | [hotfix] fix a typo in ColoInitContext (#2106) | 2022-12-09 11:44:39 +08:00 |
| zero | [NFC] polish comments for Chunk class (#2116) | 2022-12-12 15:39:31 +08:00 |
| __init__.py | [setup] supported conda-installed torch (#2048) | 2022-11-30 16:45:15 +08:00 |
| constants.py | | |
| core.py | | |
| global_variables.py | | |
| initialize.py | | |