2161 Commits (162251ab7844e4116a36d6e0fec2ac7ccd03f74d)

Author SHA1 Message Date
botbw 162251ab78 [ckpt] add safetensors util 1 month ago
Tong Li 4c8e85ee0d [Coati] Train DPO using PP (#6054) 1 month ago
Hongxin Liu dc2cdaf3e8 [shardformer] optimize seq parallelism (#6086) 1 month ago
Hongxin Liu 646b3c5a90 [shardformer] fix linear 1d row and support uneven splits for fused qkv linear (#6084) 1 month ago
botbw 4fa6b9509c [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) 2 months ago
wangbluo 10e4f7da72 fix 2 months ago
wangbluo 827ef3ee9a fix 2 months ago
Guangyao Zhang bdb125f83f [doc] FP8 training and communication document (#6050) 2 months ago
Guangyao Zhang f20b066c59 [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) 2 months ago
wangbluo b582319273 fix 2 months ago
wangbluo 0ad3129cb9 fix 2 months ago
wangbluo 0b14a5512e fix 2 months ago
botbw 696fced0d7 [fp8] fix missing fp8_comm flag in mixtral (#6057) 2 months ago
wangbluo dc032172c3 fix 2 months ago
wangbluo f393867cff fix 2 months ago
wangbluo 6eb8832366 fix 2 months ago
wangbluo 683179cefd fix 2 months ago
wangbluo 0a01e2a453 fix the attn 2 months ago
pre-commit-ci[bot] 216d54e374 [pre-commit.ci] auto fixes from pre-commit.com hooks 2 months ago
wangbluo fdd84b9087 fix the sp 2 months ago
Hongxin Liu 13946c4448 [fp8] hotfix backward hook (#6053) 2 months ago
botbw c54c4fcd15 [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) 2 months ago
Wenxuan Tan 8fd25d6e09 [Feature] Split cross-entropy computation in SP (#5959) 2 months ago
Hanks 5ce6dd75bf [fp8] disable all_to_all_fp8 in intranode (#6045) 2 months ago
Hongxin Liu 26e553937b [fp8] fix linear hook (#6046) 3 months ago
Hongxin Liu c3b5caff0e [fp8] optimize all-gather (#6043) 3 months ago
Gao, Ruiyuan e9032fb0b2 [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) 3 months ago
Guangyao Zhang e96a0761ea [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) 3 months ago
Hongxin Liu cc1b0efc17 [plugin] hotfix zero plugin (#6036) 3 months ago
pre-commit-ci[bot] 80d24ae519 [pre-commit.ci] auto fixes from pre-commit.com hooks 3 months ago
wangbluo dae39999d7 fix 3 months ago
Wenxuan Tan 7cf9df07bc [Hotfix] Fix llama fwd replacement bug (#6031) 3 months ago
flybird11111 9e767643dd Update low_level_zero_plugin.py 3 months ago
pre-commit-ci[bot] 3b0df30362 [pre-commit.ci] auto fixes from pre-commit.com hooks 3 months ago
pre-commit-ci[bot] a292554179 [pre-commit.ci] auto fixes from pre-commit.com hooks 3 months ago
wangbluo 971b16a74f fix 3 months ago
Wang Binluo eea37da6fa [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
wangbluo 698c8b9804 fix 3 months ago
wangbluo 6aface9316 fix 3 months ago
wangbluo 193030f696 fix 3 months ago
wangbluo eb5ba40def fix the merge 3 months ago
Hongxin Liu 0d3b0bd864 [plugin] add cast inputs option for zero (#6003) (#6022) 3 months ago
wangbluo 2ee6235cfa fix 3 months ago
wangbluo f7acfa1bd5 fix 3 months ago
wangbluo 53823118f2 fix 3 months ago
Edenzzzz dcc44aab8d [misc] Use dist logger in plugins (#6011) 3 months ago
wangbluo 1f703e0ef4 fix 3 months ago
wangbluo 88b3f0698c fix the merge 3 months ago
wangbluo 2eb36839c6 fix 3 months ago
wangbluo 12b44012d9 fix 3 months ago