ColossalAI/colossalai
Latest commit: [zero] bucketized tensor cpu gpu copy (#368) by Jiarui Fang, 3 years ago
Name                 Last commit                                                        Age
amp                  refactored grad scaler (#338)                                      3 years ago
builder              add pytorch hooks (#179)                                           3 years ago
communication        Added profiler communication operations                            3 years ago
context              moved env variables to global variables; (#215)                    3 years ago
engine               [zero] able to place params on cpu after zero init context (#365)  3 years ago
kernel               [zero] cpu adam kernel (#288)                                      3 years ago
logging              fixed mkdir conflict and align yapf config with flake (#220)       3 years ago
nn                   [zero] cpu adam kernel (#288)                                      3 years ago
registry             add pytorch hooks (#179)                                           3 years ago
trainer              Added profiler communication operations                            3 years ago
utils                [zero] bucketized tensor cpu gpu copy (#368)                       3 years ago
zero                 [zero] bucketized tensor cpu gpu copy (#368)                       3 years ago
__init__.py          Develop/experiments (#59)                                          3 years ago
constants.py         moved env variables to global variables; (#215)                    3 years ago
core.py              Develop/experiments (#59)                                          3 years ago
global_variables.py  Optimized MoE layer and fixed some bugs;                           3 years ago
initialize.py        set criterion as optional in colossalai initialize (#336)          3 years ago