ColossalAI/colossalai
Latest commit ab8c6b4a0e by ver217: [zero] refactor memstats collector (#706), 3 years ago
amp fix format (#570) 3 years ago
builder [NFC] polish colossalai/builder/builder.py code style (#662) 3 years ago
communication [NFC] polish colossalai/communication/utils.py code style (#656) 3 years ago
context [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py, colossalai/context/process_group_initializer/initializer_tensor.py code style (#639) 3 years ago
engine [zero] adapt zero hooks for unsharded module (#699) 3 years ago
kernel [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667) 3 years ago
logging Refactored docstring to google style 3 years ago
nn Corrected 3d vocab parallel embedding (#707) 3 years ago
registry Refactored docstring to google style 3 years ago
testing [test] fixed rerun_on_exception and adapted test cases (#487) 3 years ago
trainer [pipeline] refactor pipeline (#679) 3 years ago
utils [zero] refactor memstats collector (#706) 3 years ago
zero [zero] refactor memstats collector (#706) 3 years ago
__init__.py Develop/experiments (#59) 3 years ago
constants.py fix format constants.py (#358) 3 years ago
core.py [polish] polish singleton and global context (#500) 3 years ago
global_variables.py [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) 3 years ago
initialize.py [pipeline] refactor pipeline (#679) 3 years ago