ColossalAI/colossalai

Latest commit: [refactory] refactory the initialize method for new zero design (#431) by Jiarui Fang (640a6cd304), 2022-03-16 19:29:37 +08:00
| Name | Last commit | Date |
|------|-------------|------|
| amp | fixed fp16 optimizer none grad bug (#432) | 2022-03-16 14:35:46 +08:00 |
| builder | add pytorch hooks (#179) | 2022-01-25 22:20:54 +08:00 |
| communication | fix format (#332) | 2022-03-11 15:50:28 +08:00 |
| context | fixed bug in activation checkpointing test (#387) | 2022-03-11 15:50:28 +08:00 |
| engine | use double buffer to handle grad | 2022-03-16 14:24:09 +08:00 |
| kernel | [formart] format fixed for kernel\cuda_native codes (#335) | 2022-03-11 15:50:28 +08:00 |
| logging | [log] better logging display with rich (#426) | 2022-03-16 09:51:15 +08:00 |
| nn | added Multiply Jitter and capacity factor eval for MOE (#434) | 2022-03-16 16:47:44 +08:00 |
| registry | add pytorch hooks (#179) | 2022-01-25 22:20:54 +08:00 |
| testing | added testing module (#435) | 2022-03-16 17:20:05 +08:00 |
| trainer | Added profiler communication operations | 2022-03-11 15:50:28 +08:00 |
| utils | fixed mem monitor device (#433) | 2022-03-16 15:25:02 +08:00 |
| zero | [refactory] refactory the initialize method for new zero design (#431) | 2022-03-16 19:29:37 +08:00 |
| __init__.py | Develop/experiments (#59) | 2021-12-09 15:08:29 +08:00 |
| constants.py | fix format constants.py (#358) | 2022-03-11 15:50:28 +08:00 |
| core.py | Develop/experiments (#59) | 2021-12-09 15:08:29 +08:00 |
| global_variables.py | Optimized MoE layer and fixed some bugs; | 2022-03-11 15:50:28 +08:00 |
| initialize.py | [refactory] refactory the initialize method for new zero design (#431) | 2022-03-16 19:29:37 +08:00 |