ColossalAI/colossalai
Baizhou Zhang 38ccb8b1a3
[shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575)
1 year ago
_C [setup] support pre-build and jit-build of cuda kernels (#2374) 2 years ago
_analyzer [example] add train resnet/vit with booster example (#3694) 2 years ago
amp [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) 1 year ago
auto_parallel [NFC] polish runtime_preparation_pass style (#4266) 1 year ago
autochunk fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) 2 years ago
booster [shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575) 1 year ago
builder
checkpoint_io [shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575) 1 year ago
cli fix localhost measurement (#4320) 1 year ago
cluster [shardformer] support interleaved pipeline (#4448) 1 year ago
communication [NFC] fix: format (#4270) 1 year ago
context [CI] fix some spelling errors (#3707) 2 years ago
device [format] applied code formatting on changed files in pull request 4152 (#4157) 1 year ago
engine [nfc]fix ColossalaiOptimizer is not defined (#4122) 1 year ago
fx [nfc] fix typo colossalai/cli fx kernel (#3847) 2 years ago
interface [pipeline] refactor 1f1b schedule (#4115) 1 year ago
kernel [shardformer] update shardformer to use flash attention 2 (#4392) 1 year ago
lazy [shardformer] support lazy init (#4202) 1 year ago
logging
nn [doc] add Series A Funding and NeurIPS news (#4377) 1 year ago
pipeline [shardformer] fix emerged bugs after updating transformers (#4526) 1 year ago
registry
shardformer [shardformer] fix submodule replacement bug when enabling pp (#4544) 1 year ago
tensor [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) 1 year ago
testing [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141) 1 year ago
trainer fix typo with colossalai/trainer utils zero (#3908) 1 year ago
utils [test] remove useless tests (#4359) 1 year ago
zero [shardformer] support sharded optimizer checkpointIO of HybridParallelPlugin (#4540) 1 year ago
__init__.py
constants.py
core.py
global_variables.py [NFC] polish colossalai/global_variables.py code style (#3259) 2 years ago
initialize.py [nfc] fix typo colossalai/zero (#3923) 1 year ago