ColossalAI/colossalai
Latest commit: 8065cc5fba by Liu Ziming, "Modify torch version requirement to adapt torch 2.0 (#3896)", 1 year ago
Name                  Last commit                                                          Age
_C
_analyzer             [example] add train resnet/vit with booster example (#3694)          2 years ago
amp                   [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756)        2 years ago
auto_parallel         [nfc] fix typo colossalai/ applications/ (#3831)                     2 years ago
autochunk             fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)   2 years ago
booster               [booster] add warning for torch fsdp plugin doc (#3833)              2 years ago
builder
checkpoint_io         [booster] torch fsdp fix ckpt (#3788)                                2 years ago
cli                   Modify torch version requirement to adapt torch 2.0 (#3896)          1 year ago
cluster               fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)   2 years ago
communication         [CI] fix some spelling errors (#3707)                                2 years ago
context               [CI] fix some spelling errors (#3707)                                2 years ago
device                [nfc] fix typo colossalai/cli fx kernel (#3847)                      2 years ago
engine                fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)   2 years ago
fx                    [nfc] fix typo colossalai/cli fx kernel (#3847)                      2 years ago
interface             [booster] implemented the torch ddd + resnet example (#3232)         2 years ago
kernel                [nfc] fix typo colossalai/cli fx kernel (#3847)                      2 years ago
lazy                  [lazy] refactor lazy init (#3891)                                    1 year ago
logging
nn                    [NFC] fix typo colossalai/auto_parallel nn utils etc. (#3779)        2 years ago
pipeline
registry
tensor                [tensor] Refactor handle_trans_spec in DistSpecManager               2 years ago
testing               [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742)    2 years ago
trainer
utils                 [lazy] refactor lazy init (#3891)                                    1 year ago
zero                  [lazy] refactor lazy init (#3891)                                    1 year ago
__init__.py
constants.py
core.py
global_variables.py   [NFC] polish colossalai/global_variables.py code style (#3259)       2 years ago
initialize.py         [zero] reorganize zero/gemini folder structure (#3424)               2 years ago