Making large AI models cheaper, faster and more accessible

Latest commit: fix dist spec mgr (#1045) by ver217 (7faef93326), 3 years ago

Contents of the colossalai/ package directory, with each entry's latest commit and age where available:
amp/
builder/              [NFC] polish colossalai/builder/pipeline.py code style (#951)  (3 years ago)
cli/                  [hotfix] fix some bugs caused by size mismatch. (#1011)  (3 years ago)
communication/        [p2p]add object list send/recv (#1024)  (3 years ago)
context/
engine/               [engine] fixed bug in gradient accumulation dataloader to keep the last step (#1030)  (3 years ago)
gemini/               [gemini] accelerate adjust_layout() (#878)  (3 years ago)
kernel/               [NFC] polish colossalai/kernel/cuda_native/csrc/colossal_C_frontend.cpp code style  (3 years ago)
logging/
nn/                   [tensor] ColoTensor supports ZeRo (#1015)  (3 years ago)
registry/
tensor/               fix dist spec mgr (#1045)  (3 years ago)
testing/
trainer/
utils/                [Tensor] add Parameter inheritance for ColoParameter (#1041)  (3 years ago)
zero/                 [tensor] ColoTensor supports ZeRo (#1015)  (3 years ago)
__init__.py           [NFC] polish __init__.py code style (#965)  (3 years ago)
constants.py          fix typo in constants (#1027)  (3 years ago)
core.py
global_variables.py
initialize.py
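
The listed modules outline the library's top-level workflow: initialize.py provides the entry points that wrap a model and optimizer into the runtime engine defined under engine/, while memory optimizations such as ZeRO live under zero/ and gemini/. The snippet below is a minimal, hedged sketch of that flow, assuming the Colossal-AI 0.1.x-era API (colossalai.launch_from_torch and colossalai.initialize); exact signatures and return values may differ across versions, so treat it as an illustration rather than a reference.

```python
# Minimal sketch of the entry points listed above (initialize.py, engine/).
# Assumes the Colossal-AI 0.1.x-era API; run under a launcher such as torchrun.
import torch
import torch.nn as nn

import colossalai

# Read rank/world-size from the environment set up by torchrun and
# initialize the distributed context (see context/ and initialize.py).
colossalai.launch_from_torch(config={})

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# colossalai.initialize wraps model/optimizer/criterion into an Engine
# (engine/) that owns the forward, backward, and optimizer stepping.
engine, *_ = colossalai.initialize(model, optimizer, criterion)

engine.train()
data = torch.randn(8, 16)
labels = torch.randint(0, 2, (8,))

output = engine(data)                    # forward through the wrapped model
loss = engine.criterion(output, labels)  # criterion held by the engine
engine.backward(loss)                    # engine-managed backward pass
engine.step()                            # engine-managed optimizer step
```

In that same era, ZeRO and Gemini were typically enabled through the configuration passed at launch, leaving this engine loop unchanged; consult the documentation matching your installed version for the exact options.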