ColossalAI/colossalai

Latest commit: 193dc8dacb by Jiarui Fang, "[refactor] refactor the memory utils (#715)", 3 years ago
| Name | Last commit message | Age |
| --- | --- | --- |
| amp | fix format (#570) | 3 years ago |
| builder | [NFC] polish colossalai/builder/builder.py code style (#662) | 3 years ago |
| communication | [NFC] polish colossalai/communication/utils.py code style (#656) | 3 years ago |
| context | [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py colossalai/context/process_group_initializer/initializer_tensor.py code style (#639) | 3 years ago |
| engine | [refactor] refactor the memory utils (#715) | 3 years ago |
| kernel | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667) | 3 years ago |
| logging | Refactored docstring to google style | 3 years ago |
| nn | [compatibility] fixed tensor parallel compatibility with torch 1.9 (#700) | 3 years ago |
| registry | Refactored docstring to google style | 3 years ago |
| testing | [test] fixed rerun_on_exception and adapted test cases (#487) | 3 years ago |
| trainer | [pipeline] refactor pipeline (#679) | 3 years ago |
| utils | [refactor] refactor the memory utils (#715) | 3 years ago |
| zero | [refactor] refactor the memory utils (#715) | 3 years ago |
| __init__.py | | |
| constants.py | fix format constants.py (#358) | 3 years ago |
| core.py | [polish] polish singleton and global context (#500) | 3 years ago |
| global_variables.py | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| initialize.py | [pipeline] refactor pipeline (#679) | 3 years ago |