Colossal-AI: Making large AI models cheaper, faster, and more accessible
The colossalai/ package directory contains the following modules; where known, each entry lists its most recent commit and how long ago it was made:

_C
_analyzer
amp                   [bf16] add bf16 support (#3882)  (1 year ago)
auto_parallel         [nfc] fix typo colossalai/ applications/ (#3831)  (2 years ago)
autochunk             fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)  (2 years ago)
booster               [doc] fix docs about booster api usage (#3898)  (1 year ago)
builder
checkpoint_io         [booster] torch fsdp fix ckpt (#3788)  (2 years ago)
cli                   Modify torch version requirement to adapt torch 2.0 (#3896)  (1 year ago)
cluster               fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)  (2 years ago)
communication         [CI] fix some spelling errors (#3707)  (2 years ago)
context               [CI] fix some spelling errors (#3707)  (2 years ago)
device                [dtensor] updated api and doc (#3845)  (1 year ago)
engine                fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)  (2 years ago)
fx                    [nfc] fix typo colossalai/cli fx kernel (#3847)  (1 year ago)
interface
kernel                [bf16] add bf16 support (#3882)  (1 year ago)
lazy                  [dtensor] updated api and doc (#3845)  (1 year ago)
logging
nn                    [shardformer]: Feature/shardformer, add some docstring and readme (#3816)  (1 year ago)
pipeline              [nfc]fix typo colossalai/pipeline tensor nn (#3899)  (1 year ago)
registry
shardformer           [shardformer] update readme with modules implement doc (#3834)  (1 year ago)
tensor                [dtensor] updated api and doc (#3845)  (1 year ago)
testing               [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742)  (2 years ago)
trainer
utils                 [lazy] refactor lazy init (#3891)  (1 year ago)
zero                  [bf16] add bf16 support (#3882)  (1 year ago)
__init__.py
constants.py
core.py
global_variables.py
initialize.py
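
These subpackages are typically reached through the top-level API rather than imported directly. As a rough, hedged sketch of how the booster module (together with initialize.py's launch helpers) wires a model into a parallelism plugin: the model, optimizer, dataset, and training loop below are hypothetical placeholders, and exact signatures can differ between Colossal-AI versions.

```python
# Minimal sketch combining colossalai.launch_from_torch with the Booster API.
# Assumes the script is started with torchrun; model/optimizer/dataloader are
# placeholders for illustration only, and details may vary by version.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# Initialize the distributed environment (rank/world size come from torchrun).
colossalai.launch_from_torch(config={})

# Placeholder model, loss, and data.
model = nn.Linear(32, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(128, 32), torch.randint(0, 2, (128,)))
dataloader = DataLoader(dataset, batch_size=16)

# The Booster wraps the training objects with the selected plugin
# (here plain PyTorch DDP; other plugins cover ZeRO, Gemini, FSDP, etc.).
booster = Booster(plugin=TorchDDPPlugin())
model, optimizer, criterion, dataloader, _ = booster.boost(
    model, optimizer, criterion=criterion, dataloader=dataloader
)

for inputs, labels in dataloader:
    optimizer.zero_grad()
    loss = criterion(model(inputs), labels)
    booster.backward(loss, optimizer)  # backward routed through the booster
    optimizer.step()
```

Such a script would usually be launched with something like `torchrun --nproc_per_node=<num_gpus> train.py`, at which point the plugin decides how the listed subsystems (amp, zero, checkpoint_io, cluster, and so on) participate in training.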