ColossalAI/colossalai
Latest commit a39a5c66fe by Hongxin Liu: Merge branch 'main' into feature/shardformer (2023-09-04 23:43:13 +08:00)
Name                  Last commit                 Message
_C/
_analyzer/
amp/                  2023-08-15 23:25:14 +08:00  [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354)
auto_parallel/        2023-08-30 17:29:38 +08:00  fix runtime prepare pass (#4502)
autochunk/
booster/              2023-09-04 23:43:13 +08:00  Merge branch 'main' into feature/shardformer
builder/
checkpoint_io/        2023-09-04 23:43:13 +08:00  Merge branch 'main' into feature/shardformer
cli/                  2023-08-28 17:59:11 +08:00  [example] add llama2 example (#4527)
cluster/              2023-08-16 19:29:03 +08:00  [shardformer] support interleaved pipeline (#4448)
communication/        2023-07-26 14:12:57 +08:00  [NFC] fix: format (#4270)
context/
device/               2023-07-04 16:07:47 +08:00  [format] applied code formatting on changed files in pull request 4152 (#4157)
engine/
fx/
interface/            2023-08-15 23:25:14 +08:00  [pipeline] refactor 1f1b schedule (#4115)
kernel/               2023-08-28 17:59:11 +08:00  [example] add llama2 example (#4527)
lazy/                 2023-08-15 23:25:14 +08:00  [shardformer] support lazy init (#4202)
logging/
nn/                   2023-08-04 17:42:07 +08:00  [doc] add Series A Funding and NeurIPS news (#4377)
pipeline/             2023-09-04 21:46:29 +08:00  [shardformer] update bert finetune example with HybridParallelPlugin (#4584)
registry/
shardformer/          2023-09-04 17:52:23 +08:00  [shardformer] Pytree fix (#4533)
tensor/               2023-08-24 09:29:25 +08:00  [gemini] improve compatibility and add static placement policy (#4479)
testing/              2023-07-07 16:33:06 +08:00  [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141)
trainer/
utils/                2023-08-01 18:52:14 +08:00  [test] remove useless tests (#4359)
zero/                 2023-09-04 23:43:13 +08:00  Merge branch 'main' into feature/shardformer
__init__.py
constants.py
core.py
global_variables.py
initialize.py