Commit Graph

72 Commits (58ad76d4665032bbe548d066116d1c572ce98979)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Edenzzzz | 43995ee436 | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 6 months ago |
| Wang Binluo | a3cc68ca93 | [Shardformer] Support the Qwen2 model (#5699) | 7 months ago |
| Hongxin Liu | bbb2c21f16 | [shardformer] fix chatglm implementation (#5644) | 7 months ago |
| Wang Binluo | 0d0a582033 | [shardformer] update transformers (#5583) | 7 months ago |
| Hongxin Liu | 641b1ee71a | [devops] remove post commit ci (#5566) | 8 months ago |
| Zhongkai Zhao | 8e412a548e | [shardformer] Sequence Parallelism Optimization (#5533) | 8 months ago |
| Wenhao Chen | e614aa34f3 | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) | 8 months ago |
| flybird11111 | 29695cf70c | [example]add gpt2 benchmark example script. (#5295) | 9 months ago |
| Frank Lee | d69cd2eb89 | [workflow] fixed oom tests (#5275) | 10 months ago |
| Frank Lee | 2b83418719 | [ci] fixed ddp test (#5254) | 11 months ago |
| Frank Lee | d5eeeb1416 | [ci] fixed booster test (#5251) | 11 months ago |
| Frank Lee | edf94a35c3 | [workflow] fixed build CI (#5240) | 11 months ago |
| flybird11111 | 21aa5de00b | [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150) | 12 months ago |
| Wenhao Chen | 7172459e74 | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| Jianghai | cf579ff46d | [Inference] Dynamic Batching Inference, online and offline (#4953) | 1 year ago |
| Hongxin Liu | b8e770c832 | [test] merge old components to test to model zoo (#4945) | 1 year ago |
| Zhongkai Zhao | db40e086c8 | [test] modify model supporting part of low_level_zero plugin (including correspoding docs) | 1 year ago |
| Jianghai | ce7ade3882 | [inference] chatglm2 infer demo (#4724) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| digger yu | 9c2feb2f0b | fix some typo with colossalai/device colossalai/tensor/ etc. (#4171) | 1 year ago |
| flybird11111 | eedaa3e1ef | [shardformer]fix gpt2 double head (#4663) | 1 year ago |
| flybird11111 | 7486ed7d3a | [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) | 1 year ago |
| Hongxin Liu | a39a5c66fe | Merge branch 'main' into feature/shardformer | 1 year ago |
| Hongxin Liu | 27061426f7 | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| flybird11111 | 59e252ecdb | [shardformer] chatglm support sequence parallel (#4482) | 1 year ago |
| Jianghai | 5545114fd8 | rename chatglm to chatglm2 (#4484) | 1 year ago |
| flybird11111 | 108e54a0b4 | [shardformer]update t5 tests for using all optimizations. (#4407) | 1 year ago |
| flybird11111 | 1edc9b5fb3 | [shardformer] update tests for all optimization (#4413) | 1 year ago |
| Baizhou Zhang | 7711bd524a | [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) | 1 year ago |
| Jianghai | 7596e9ae08 | [pipeline] rewrite bert tests and fix some bugs (#4409) | 1 year ago |
| flybird1111 | 906426cb44 | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 1 year ago |
| Jianghai | a88e92251d | [pipeline] add chatglm (#4363) | 1 year ago |
| Baizhou Zhang | b1feeced8e | [shardformer] add util functions for shardformer tests/fix sync_shared_param (#4366) | 1 year ago |
| Bin Jia | 5c6f183192 | [test] Hotfix/fix some model test and refactor check util api (#4369) | 1 year ago |
| FoolPlayer | 879301d0da | [shardformer] support Blip2 (#4243) | 1 year ago |
| klhhhhh | 8120eca0c0 | [shardformer] support ChatGLMForConditionalGeneration & add fusedlayernorm for vit | 1 year ago |
| klhhhhh | 4da05052f4 | [shardformer] pre-commit check files | 1 year ago |
| klhhhhh | f155ae89c4 | [shardformer] ChatGLM support layernorm sharding | 1 year ago |
| klhhhhh | 00f6ef159d | [shardformer] delete some file | 1 year ago |
| klhhhhh | dad00c42aa | [shardformer] support chatglm without layernorm | 1 year ago |
| klhhhhh | cbb54d3202 | [shardformer] polish code | 1 year ago |
| klhhhhh | 6ee4c9ee21 | [shardformer] add test kit in model zoo for chatglm | 1 year ago |
| klhhhhh | 7377be7a53 | import chatglm | 1 year ago |
| Kun Lin | ed34bb1310 | Feature/chatglm (#4240) | 1 year ago |
| FoolPlayer | 9ee4ebea83 | [shardformer] support whisper (#4212) | 1 year ago |
| FoolPlayer | dd2bf02679 | [shardformer] support SAM (#4231) | 1 year ago |
| Baizhou Zhang | 0ceec8f9a9 | [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) | 1 year ago |
| FoolPlayer | b3f5d7a3ba | [shardformer] support pipeline base vit model (#4284) | 1 year ago |
| Baizhou Zhang | 36e546b2cc | [pipeline] add pipeline support for T5Stack/T5EncoderModel (#4300) | 1 year ago |
| Baizhou Zhang | 2a2eacfaf1 | [pipeline] support shardformer for GPT2ForQuestionAnswering & complete pipeline support for GPT2 (#4245) | 1 year ago |