Commit Graph

3098 Commits (b2e97458883c64e4f357059f585ff2585fa12edd)

Author              SHA1        Message  Date
hxwang              b2e9745888  [chore] sync  6 months ago
hxwang              6e38eafebe  [gemini] prefetch chunks  6 months ago
Edenzzzz            785cd9a9c9  [misc] Update PyTorch version in docs (#5711)  7 months ago
Wang Binluo         537f6a3855  [Shardformer]fix the num_heads assert for llama model and qwen model (#5704)  7 months ago
Wang Binluo         a3cc68ca93  [Shardformer] Support the Qwen2 model (#5699)  7 months ago
flybird11111        d4c5ef441e  [gemini]remove registered gradients hooks (#5696)  7 months ago
Wang Binluo         22297789ab  Merge pull request #5684 from wangbluo/parallel_output  7 months ago
wangbluo            4e50cce26b  fix the mistral model  7 months ago
wangbluo            a8408b4d31  remove comment code  7 months ago
pre-commit-ci[bot]  ca56b93d83  [pre-commit.ci] auto fixes from pre-commit.com hooks  7 months ago
wangbluo            108ddfb795  add parallel_output for the opt model  7 months ago
pre-commit-ci[bot]  88f057ce7c  [pre-commit.ci] auto fixes from pre-commit.com hooks  7 months ago
Edenzzzz            58954b2986  [misc] Add an existing issue checkbox in bug report (#5691)  7 months ago
flybird11111        77ec773388  [zero]remove registered gradients hooks (#5687)  7 months ago
Edenzzzz            c25f83c85f  fix missing pad token (#5690)  7 months ago
wangbluo            2632916329  remove useless code  7 months ago
wangbluo            9efc79ef24  add parallel output for mistral model  7 months ago
Wang Binluo         d3f34ee8cc  [Shardformer] add assert for num of attention heads divisible by tp_size (#5670)  7 months ago
flybird11111        6af6d6fc9f  [shardformer] support bias_gelu_jit_fused for models (#5647)  7 months ago
Hongxin Liu         7f8b16635b  [misc] refactor launch API and tensor constructor (#5666)  7 months ago
linsj20             91fa553775  [Feature] qlora support (#5586)  7 months ago
flybird11111        8954a0c2e2  [LowLevelZero] low level zero support lora (#5153)  7 months ago
Baizhou Zhang       14b0d4c7e5  [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  7 months ago
Hongxin Liu         c1594e4bad  [devops] fix release docker ci (#5665)  7 months ago
Hongxin Liu         4cfbf30a5e  [release] update version (#5654)  7 months ago
Tong Li             68ec99e946  [hotfix] add soft link to support required files (#5661)  7 months ago
binmakeswell        b8a711aa2d  [news] llama3 and open-sora v1.1 (#5655)  7 months ago
Hongxin Liu         2082852f3f  [lazyinit] skip whisper test (#5653)  7 months ago
flybird11111        8b7d535977  fix gptj (#5652)  7 months ago
Hongxin Liu         1b387ca9fe  [shardformer] refactor pipeline grad ckpt config (#5646)  7 months ago
Season              7ef91606e1  [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625)  7 months ago
Hongxin Liu         bbb2c21f16  [shardformer] fix chatglm implementation (#5644)  7 months ago
flybird11111        5d88ef1aaf  [shardformer] remove useless code (#5645)  7 months ago
flybird11111        148506c828  [coloattention]modify coloattention (#5627)  7 months ago
Edenzzzz            7ee569b05f  [hotfix] Fixed fused layernorm bug without apex (#5609)  7 months ago
Wang Binluo         0d0a582033  [shardformer] update transformers (#5583)  7 months ago
binmakeswell        f4c5aafe29  [example] llama3 (#5631)  7 months ago
Hongxin Liu         4de4e31818  [exampe] update llama example (#5626)  7 months ago
Tong Li             862fbaaa62  [Feature] Support LLaMA-3 CPT and ST (#5619)  7 months ago
Hongxin Liu         e094933da1  [shardformer] fix pipeline grad ckpt (#5620)  7 months ago
Edenzzzz            d83c633ca6  [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606)  7 months ago
flybird11111        a0ad587c24  [shardformer] refactor embedding resize (#5603)  7 months ago
Hongxin Liu         3788fefc7a  [zero] support multiple (partial) backward passes (#5596)  7 months ago
Camille Zhong       89049b0d89  [doc] fix ColossalMoE readme (#5599)  7 months ago
Hongxin Liu         641b1ee71a  [devops] remove post commit ci (#5566)  8 months ago
digger yu           341263df48  [hotfix] fix typo s/get_defualt_parser /get_default_parser (#5548)  8 months ago
digger yu           a799ca343b  [fix] fix typo s/muiti-node /multi-node etc. (#5448)  8 months ago
Edenzzzz            15055f9a36  [hotfix] quick fixes to make legacy tutorials runnable (#5559)  8 months ago
Zhongkai Zhao       8e412a548e  [shardformer] Sequence Parallelism Optimization (#5533)  8 months ago
Edenzzzz            7e0ec5a85c  fix incorrect sharding without zero (#5545)  8 months ago