ColossalAI/colossalai
Jiarui Fang 6b6002962a [zero] zero init context collect numel of model (#375) 2022-03-11 15:50:28 +08:00
amp fix format (#362) 2022-03-11 15:50:28 +08:00
builder add pytorch hooks (#179) 2022-01-25 22:20:54 +08:00
communication fix format (#332) 2022-03-11 15:50:28 +08:00
context flake8 style change (#363) 2022-03-11 15:50:28 +08:00
engine Fix/format colossalai/engine/paramhooks/ (#350) 2022-03-11 15:50:28 +08:00
kernel [format] format fixed for kernel/cuda_native codes (#335) 2022-03-11 15:50:28 +08:00
logging fixed mkdir conflict and aligned yapf config with flake8 (#220) 2022-02-15 11:31:13 +08:00
nn fix format (#362) 2022-03-11 15:50:28 +08:00
registry add pytorch hooks (#179) 2022-01-25 22:20:54 +08:00
trainer Added profiler for communication operations 2022-03-11 15:50:28 +08:00
utils Added PCIe profiler to detect data transmission (#373) 2022-03-11 15:50:28 +08:00
zero [zero] zero init context collect numel of model (#375) 2022-03-11 15:50:28 +08:00
__init__.py Develop/experiments (#59) 2021-12-09 15:08:29 +08:00
constants.py fix format constants.py (#358) 2022-03-11 15:50:28 +08:00
core.py Develop/experiments (#59) 2021-12-09 15:08:29 +08:00
global_variables.py Optimized MoE layer and fixed some bugs 2022-03-11 15:50:28 +08:00
initialize.py set criterion as optional in colossalai initialize (#336) 2022-03-11 15:50:28 +08:00