ColossalAI/colossalai

Latest commit: 9ee197d0e9 by アマデウス, "moved env variables to global variables; (#215)", 3 years ago
| Name | Last commit | Age |
|------|-------------|-----|
| amp | moved env variables to global variables; (#215) | 3 years ago |
| builder | add pytorch hooks (#179) | 3 years ago |
| communication | moved env variables to global variables; (#215) | 3 years ago |
| context | moved env variables to global variables; (#215) | 3 years ago |
| engine | moved env variables to global variables; (#215) | 3 years ago |
| kernel | moved env variables to global variables; (#215) | 3 years ago |
| logging | Fixed docstring in colossalai (#171) | 3 years ago |
| nn | moved env variables to global variables; (#215) | 3 years ago |
| registry | add pytorch hooks (#179) | 3 years ago |
| trainer | moved env variables to global variables; (#215) | 3 years ago |
| utils | moved env variables to global variables; (#215) | 3 years ago |
| zero | Fixed docstring in colossalai (#171) | 3 years ago |
| __init__.py | Develop/experiments (#59) | 3 years ago |
| constants.py | moved env variables to global variables; (#215) | 3 years ago |
| core.py | Develop/experiments (#59) | 3 years ago |
| global_variables.py | moved env variables to global variables; (#215) | 3 years ago |
| initialize.py | fixed ddp bug on torch 1.8 (#194) | 3 years ago |