13 Commits (279300dc5f34db219c90a297c0996d00221eae96)

Author SHA1 Message Date
Hongxin Liu da39d21b71 [moe] support mixtral (#5309) 10 months ago
Baizhou Zhang c0a033700c [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) 1 year ago
Hongxin Liu 079bf3cb26 [misc] update pre-commit and run all files (#4752) 1 year ago
Baizhou Zhang 58913441a1 [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141) 1 year ago
Baizhou Zhang 822c3d4d66 [checkpointio] sharded optimizer checkpoint for DDP plugin (#4002) 1 year ago
Baizhou Zhang c9cff7e7fa [checkpointio] General Checkpointing of Sharded Optimizers (#3984) 1 year ago
Hongxin Liu 5452df63c5 [plugin] torch ddp plugin supports sharded model checkpoint (#3775) 2 years ago
jiangmingyan 307894f74d [booster] gemini plugin support shard checkpoint (#3610) 2 years ago
digger-yu b9a8dff7e5 [doc] Fix typo under colossalai and doc(#3618) 2 years ago
jiangmingyan 366a035552 [checkpoint] Shard saved checkpoint need to be compatible with the naming format of hf checkpoint files (#3479) 2 years ago
Frank Lee 1beb85cc25 [checkpoint] refactored the API and added safetensors support (#3427) 2 years ago
Frank Lee 73d3e4d309 [booster] implemented the torch ddd + resnet example (#3232) 2 years ago
Frank Lee cd142fbefa [api] implemented the checkpoint io module (#3205) 2 years ago