ColossalAI/colossalai

Latest commit: dfca9678fa by FoolPlayer — integrate with dist layer (#4011), 1 year ago
| Directory / file    | Last commit                                                        | Age         |
|---------------------|--------------------------------------------------------------------|-------------|
| _C                  |                                                                    |             |
| _analyzer           | [example] add train resnet/vit with booster example (#3694)        | 2 years ago |
| amp                 | [bf16] add bf16 support (#3882)                                    | 1 year ago  |
| auto_parallel       | [nfc] fix typo colossalai/ applications/ (#3831)                   | 2 years ago |
| autochunk           | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2 years ago |
| booster             | [gemini] fix argument naming during chunk configuration searching  | 1 year ago  |
| builder             |                                                                    |             |
| checkpoint_io       | [hotfix] fix import bug in checkpoint_io (#4142)                   | 1 year ago  |
| cli                 | Modify torch version requirement to adapt torch 2.0 (#3896)        | 1 year ago  |
| cluster             | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2 years ago |
| communication       | [CI] fix some spelling errors (#3707)                              | 2 years ago |
| context             | [CI] fix some spelling errors (#3707)                              | 2 years ago |
| device              | [device] support init device mesh from process group (#3990)       | 1 year ago  |
| engine              | [nfc] fix ColossalaiOptimizer is not defined (#4122)               | 1 year ago  |
| fx                  | [nfc] fix typo colossalai/cli fx kernel (#3847)                    | 2 years ago |
| interface           |                                                                    |             |
| kernel              | fix Tensor is not defined (#4129)                                  | 1 year ago  |
| lazy                | Revert "[sync] sync feature/shardformer with develop"              | 1 year ago  |
| logging             |                                                                    |             |
| nn                  | [shardformer] integrated linear 1D with dtensor (#3996)            | 1 year ago  |
| pipeline            | [nfc] fix typo colossalai/pipeline tensor nn (#3899)               | 1 year ago  |
| registry            |                                                                    |             |
| shardformer         | integrate with dist layer (#4011)                                  | 1 year ago  |
| tensor              | [shardformer] integrated linear 1D with dtensor (#3996)            | 1 year ago  |
| testing             | [testing] move pytest to be inside the function (#4087)            | 1 year ago  |
| trainer             | fix typo with colossalai/trainer utils zero (#3908)                | 1 year ago  |
| utils               | fix typo with colossalai/trainer utils zero (#3908)                | 1 year ago  |
| zero                | [gemini] fix argument naming during chunk configuration searching  | 1 year ago  |
| __init__.py         |                                                                    |             |
| constants.py        |                                                                    |             |
| core.py             |                                                                    |             |
| global_variables.py |                                                                    |             |
| initialize.py       | [nfc] fix typo colossalai/zero (#3923)                             | 1 year ago  |