9 Commits (cf519dac6a5799b8f314aac6f510e2a98d3af9c6)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago |
| Wang Binluo | 0d0a582033 | [shardformer] update transformers (#5583) | 7 months ago |
| Zhongkai Zhao | 8e412a548e | [shardformer] Sequence Parallelism Optimization (#5533) | 8 months ago |
| Wenhao Chen | e614aa34f3 | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) | 8 months ago |
| Jianghai | cf579ff46d | [Inference] Dynamic Batching Inference, online and offline (#4953) | 1 year ago |
| Hongxin Liu | b8e770c832 | [test] merge old components to test to model zoo (#4945) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| flybird11111 | 7486ed7d3a | [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) | 1 year ago |
| Frank Lee | 58df720570 | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 1 year ago |