ColossalAI/colossalai
Latest commit: 36b8477228 by HELSON, "Fixed parameter initialization in FFNExpert (#251)", 3 years ago
Name                  Last commit                                                      Age
amp                   fixed apex import (#227)                                         3 years ago
builder               add pytorch hooks (#179)                                         3 years ago
communication         moved env variables to global variables; (#215)                  3 years ago
context               moved env variables to global variables; (#215)                  3 years ago
engine                moved env variables to global variables; (#215)                  3 years ago
kernel                Optimized MoE layer and fixed some bugs;                         3 years ago
logging               fixed mkdir conflict and align yapf config with flake (#220)     3 years ago
nn                    Fixed parameter initialization in FFNExpert (#251)               3 years ago
registry              add pytorch hooks (#179)                                         3 years ago
trainer               moved env variables to global variables; (#215)                  3 years ago
utils                 fixed mkdir conflict and align yapf config with flake (#220)     3 years ago
zero                  Fixed docstring in colossalai (#171)                             3 years ago
__init__.py           Develop/experiments (#59)                                        3 years ago
constants.py          moved env variables to global variables; (#215)                  3 years ago
core.py               Develop/experiments (#59)                                        3 years ago
global_variables.py   Optimized MoE layer and fixed some bugs;                         3 years ago
initialize.py         fixed ddp bug on torch 1.8 (#194)                                3 years ago