Making large AI models cheaper, faster and more accessible
Directory listing for the colossalai/ package. Latest commit: 32f81f14d4 by digger yu, "[NFC] fix typo colossalai/amp auto_parallel autochunk (#3756)", 2 years ago.
| Name | Last commit | Last updated |
| --- | --- | --- |
| _C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2 years ago |
| _analyzer | [example] add train resnet/vit with booster example (#3694) | 2 years ago |
| amp | [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) | 2 years ago |
| auto_parallel | [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) | 2 years ago |
| autochunk | [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) | 2 years ago |
| booster | [plugin] torch ddp plugin supports sharded model checkpoint (#3775) | 2 years ago |
| builder | | |
| checkpoint_io | [plugin] torch ddp plugin supports sharded model checkpoint (#3775) | 2 years ago |
| cli | [NFC] fix typo applications/ and colossalai/ (#3735) | 2 years ago |
| cluster | [booster] implemented the torch ddd + resnet example (#3232) | 2 years ago |
| communication | [CI] fix some spelling errors (#3707) | 2 years ago |
| context | [CI] fix some spelling errors (#3707) | 2 years ago |
| device | [hotfix] add copyright for solver and device mesh (#2803) | 2 years ago |
| engine | [format] Run lint on colossalai.engine (#3367) | 2 years ago |
| fx | [doc] Fix typo under colossalai and doc(#3618) | 2 years ago |
| interface | [booster] implemented the torch ddd + resnet example (#3232) | 2 years ago |
| kernel | [doc] Fix typo under colossalai and doc(#3618) | 2 years ago |
| logging | [logger] hotfix, missing _FORMAT (#2231) | 2 years ago |
| nn | [doc] Fix typo under colossalai and doc(#3618) | 2 years ago |
| pipeline | [pipeline] Add Simplified Alpa DP Partition (#2507) | 2 years ago |
| registry | | |
| tensor | [tensor] Refactor handle_trans_spec in DistSpecManager | 2 years ago |
| testing | [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742) | 2 years ago |
| trainer | [polish] remove useless file _mem_tracer_hook.py (#1963) | 2 years ago |
| utils | [doc] Fix typo under colossalai and doc(#3618) | 2 years ago |
| zero | [booster] gemini plugin support shard checkpoint (#3610) | 2 years ago |
| __init__.py | [setup] supported conda-installed torch (#2048) | 2 years ago |
| constants.py | updated tp layers | 2 years ago |
| core.py | | |
| global_variables.py | [NFC] polish colossalai/global_variables.py code style (#3259) | 2 years ago |
| initialize.py | [zero] reorganize zero/gemini folder structure (#3424) | 2 years ago |
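The listing above is only an index, but several of these directories (booster, amp, zero) together with initialize.py form the training entry point. Below is a minimal, illustrative sketch of how they are typically combined, assuming a v0.3-era API (where launch_from_torch still takes a config dict), a CUDA-capable machine, and a torchrun launch; the model, shapes, and hyperparameters are placeholders, not code from this repository.

```python
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# Initialize the distributed environment from torchrun-provided
# environment variables (RANK, WORLD_SIZE, MASTER_ADDR, ...).
# Passing an empty config dict is an assumption for v0.3-era releases.
colossalai.launch_from_torch(config={})

# A toy model, optimizer, and loss; all sizes here are illustrative.
model = torch.nn.Linear(16, 8)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.MSELoss()

# Wrap everything with a Booster plugin. TorchDDPPlugin is used here;
# GeminiPlugin is the ZeRO/Gemini counterpart backed by the zero/ directory.
booster = Booster(plugin=TorchDDPPlugin())
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# One illustrative training step on the local GPU.
inputs = torch.randn(4, 16).cuda()
targets = torch.randn(4, 8).cuda()
loss = criterion(model(inputs), targets)
booster.backward(loss, optimizer)
optimizer.step()
optimizer.zero_grad()
```

A script like this (the filename train.py is hypothetical) would be launched with `torchrun --nproc_per_node=<num_gpus> train.py`; swapping the plugin is how the same loop switches between DDP and Gemini/ZeRO execution.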