Commit Graph

1779 Commits (68ec99e946129298b2e6d8e6463886fe6b22a5df)

Author | SHA1 | Message | Date
flybird11111 | 8b7d535977 | fix gptj (#5652) | 7 months ago
Hongxin Liu | 1b387ca9fe | [shardformer] refactor pipeline grad ckpt config (#5646) | 7 months ago
Season | 7ef91606e1 | [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) | 7 months ago
Hongxin Liu | bbb2c21f16 | [shardformer] fix chatglm implementation (#5644) | 7 months ago
flybird11111 | 5d88ef1aaf | [shardformer] remove useless code (#5645) | 7 months ago
flybird11111 | 148506c828 | [coloattention]modify coloattention (#5627) | 7 months ago
Edenzzzz | 7ee569b05f | [hotfix] Fixed fused layernorm bug without apex (#5609) | 7 months ago
Wang Binluo | 0d0a582033 | [shardformer] update transformers (#5583) | 7 months ago
Hongxin Liu | 4de4e31818 | [exampe] update llama example (#5626) | 7 months ago
Hongxin Liu | e094933da1 | [shardformer] fix pipeline grad ckpt (#5620) | 7 months ago
Edenzzzz | d83c633ca6 | [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) | 7 months ago
flybird11111 | a0ad587c24 | [shardformer] refactor embedding resize (#5603) | 7 months ago
Hongxin Liu | 3788fefc7a | [zero] support multiple (partial) backward passes (#5596) | 8 months ago
Hongxin Liu | 641b1ee71a | [devops] remove post commit ci (#5566) | 8 months ago
Edenzzzz | 15055f9a36 | [hotfix] quick fixes to make legacy tutorials runnable (#5559) | 8 months ago
Zhongkai Zhao | 8e412a548e | [shardformer] Sequence Parallelism Optimization (#5533) | 8 months ago
Edenzzzz | 7e0ec5a85c | fix incorrect sharding without zero (#5545) | 8 months ago
Wenhao Chen | e614aa34f3 | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) | 8 months ago
Insu Jang | 00525f7772 | [shardformer] fix pipeline forward error if custom layer distribution is used (#5189) | 8 months ago
github-actions[bot] | e6707a6e8d | [format] applied code formatting on changed files in pull request 5510 (#5517) | 8 months ago
Hongxin Liu | 19e1a5cf16 | [shardformer] update colo attention to support custom mask (#5510) | 8 months ago
Edenzzzz | 9a3321e9f4 | Merge pull request #5515 from Edenzzzz/fix_layout_convert | 8 months ago
Edenzzzz | 61da3fbc52 | fixed layout converter caching and updated tester | 8 months ago
Rocky Duan | cbe34c557c | Fix ColoTensorSpec for py11 (#5440) | 8 months ago
flybird11111 | 0688d92e2d | [shardformer]Fix lm parallel. (#5480) | 8 months ago
Wenhao Chen | bb0a668fee | [hotfix] set return_outputs=False in examples and polish code (#5404) | 8 months ago
flybird11111 | 5e16bf7980 | [shardformer] fix gathering output when using tensor parallelism (#5431) | 9 months ago
Hongxin Liu | f2e8b9ef9f | [devops] fix compatibility (#5444) | 9 months ago
digger yu | 385e85afd4 | [hotfix] fix typo s/keywrods/keywords etc. (#5429) | 9 months ago
digger yu | 5e1c93d732 | [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) | 9 months ago
digger yu | 049121d19d | [hotfix] fix typo change enabel to enable under colossalai/shardformer/ (#5317) | 9 months ago
digger yu | 16c96d4d8c | [hotfix] fix typo change _descrption to _description (#5331) | 9 months ago
Hongxin Liu | 070df689e6 | [devops] fix extention building (#5427) | 9 months ago
flybird11111 | 29695cf70c | [example]add gpt2 benchmark example script. (#5295) | 9 months ago
flybird11111 | 0a25e16e46 | [shardformer]gather llama logits (#5398) | 9 months ago
QinLuo | bf34c6fef6 | [fsdp] impl save/load shard model/optimizer (#5357) | 9 months ago
Stephan Kölker | 5d380a1a21 | [hotfix] Fix wrong import in meta_registry (#5392) | 9 months ago
Hongxin Liu | 7303801854 | [llama] fix training and inference scripts (#5384) | 9 months ago
Frank Lee | efef43b53c | Merge pull request #5372 from hpcaitech/exp/mixtral | 10 months ago
Frank Lee | 4c03347fc7 | Merge pull request #5377 from hpcaitech/example/llama-npu | 10 months ago
ver217 | 06db94fbc9 | [moe] fix tests | 10 months ago
Hongxin Liu | da39d21b71 | [moe] support mixtral (#5309) | 10 months ago
Hongxin Liu | c904d2ae99 | [moe] update capacity computing (#5253) | 10 months ago
Xuanlei Zhao | 7d8e0338a4 | [moe] init mixtral impl | 10 months ago
Hongxin Liu | c53ddda88f | [lr-scheduler] fix load state dict and add test (#5369) | 10 months ago
Hongxin Liu | eb4f2d90f9 | [llama] polish training script and fix optim ckpt (#5368) | 10 months ago
Hongxin Liu | 6c0fa7b9a8 | [llama] fix dataloader for hybrid parallel (#5358) | 10 months ago
Hongxin Liu | 2dd01e3a14 | [gemini] fix param op hook when output is tuple (#5355) | 10 months ago
Wenhao Chen | 1c790c0877 | [fix] remove unnecessary dp_size assert (#5351) | 10 months ago
Hongxin Liu | ffffc32dc7 | [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347) | 10 months ago