Commit Graph

2196 Commits (feature/async-io)

Author  SHA1  Message  Date
Wang Binluo  75c963686f  [lora] lora support hybrid parallel plugin (#5956)  4 months ago
botbw  62cdac6b7b  [chore] remove redundant test case, print string & reduce test tokens  4 months ago
botbw  d1d1ab871e  [moe] solve dp axis issue  4 months ago
botbw  65daa87627  [doc] add MoeHybridParallelPlugin docstring  4 months ago
hxwang  7bedd03739  [moe] remove force_overlap_comm flag and add warning instead  4 months ago
hxwang  f7c5485ed6  [chore] docstring  4 months ago
haze188  7e737df5ad  [misc] remove useless condition  4 months ago
haze188  70793ce9ed  [misc] fix ci failure: change default value to false in moe plugin  4 months ago
hxwang  606b0891ed  [chore] change moe_pg_mesh to private  4 months ago
hxwang  5b4c12381b  Revert "[moe] implement submesh initialization"  4 months ago
hxwang  cb01c0d5ce  [moe] refactor mesh assignment  4 months ago
haze188  034020bd04  [misc] remove debug/print code  4 months ago
hxwang  c3dc9b4dba  [deepseek] replace attn (a workaround for bug in transformers)  4 months ago
hxwang  6c39f0b144  [test] add check  4 months ago
haze188  b2952a5982  [moe] deepseek moe sp support  4 months ago
botbw  96d0fbc531  [bug] fix: somehow logger hangs the program  4 months ago
hxwang  067e18f7e9  [test] fix test: test_zero1_2  4 months ago
hxwang  74b03de3f9  [moe] remove ops  4 months ago
hxwang  70c9924d0d  [chore] solve moe ckpt test failure and some other arg pass failure  4 months ago
hxwang  46037c2ccd  [chore] minor fix after rebase  4 months ago
hxwang  803878b2fd  [moe] full test for deepseek and mixtral (pp + sp to fix)  4 months ago
hxwang  7077d38d5a  [moe] finalize test (no pp)  4 months ago
haze188  2cddeac717  moe sp + ep bug fix  4 months ago
hxwang  877d94bb8c  [moe] init moe plugin comm setting with sp  4 months ago
hxwang  09d6280d3e  [chore] minor fix  4 months ago
Haze188  404b16faf3  [Feature] MoE Ulysses Support (#5918)  4 months ago
hxwang  3e2b6132b7  [moe] clean legacy code  4 months ago
hxwang  74eccac0db  [moe] test deepseek  4 months ago
botbw  dc583aa576  [moe] implement tp  4 months ago
hxwang  102b784a10  [chore] arg pass & remove drop token  4 months ago
botbw  8dbb86899d  [chore] trivial fix  4 months ago
botbw  014faf6c5a  [chore] manually revert unintended commit  4 months ago
botbw  9b9b76bdcd  [moe] add mixtral dp grad scaling when not all experts are activated  4 months ago
botbw  e28e05345b  [moe] implement submesh initialization  4 months ago
haze188  5ed5e8cfba  solve hang when parallel mode = pp + dp  4 months ago
botbw  13b48ac0aa  [zero] solve hang  4 months ago
botbw  b5bfeb2efd  [moe] implement transit between non moe tp and ep  4 months ago
botbw  37443cc7e4  [test] pass mixtral shardformer test  4 months ago
hxwang  46c069b0db  [zero] solve hang  4 months ago
hxwang  0fad23c691  [chore] handle non member group  4 months ago
hxwang  a249e71946  [test] mixtra pp shard test  4 months ago
hxwang  8ae8525bdf  [moe] fix plugin  4 months ago
hxwang  0b76b57cd6  [test] add mixtral transformer test  4 months ago
hxwang  f9b6fcf81f  [test] add mixtral for sequence classification  4 months ago
Hongxin Liu  060892162a  [zero] hotfix update master params (#5951)  4 months ago
Runyu Lu  bcf0181ecd  [Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)  4 months ago
Hongxin Liu  7b38964e3a  [shardformer] hotfix attn mask (#5947)  4 months ago
Hongxin Liu  9664b1bc19  [shardformer] hotfix attn mask (#5945)  4 months ago
Edenzzzz  2069472e96  [Hotfix] Fix ZeRO typo #5936  4 months ago
Hongxin Liu  5fd0592767  [fp8] support all-gather flat tensor (#5932)  4 months ago
Gao, Ruiyuan  5fb958cc83  [FIX BUG] convert env param to int in (#5934)  4 months ago
Insu Jang  a521ffc9f8  Add n_fused as an input from native_module (#5894)  4 months ago
Hongxin Liu  e86127925a  [plugin] support all-gather overlap for hybrid parallel (#5919)  4 months ago
GuangyaoZhang  5b969fd831  fix shardformer fp8 communication training degradation  4 months ago
GuangyaoZhang  6a20f07b80  remove all to all  4 months ago
GuangyaoZhang  5a310b9ee1  fix rebase  4 months ago
GuangyaoZhang  457a0de79f  shardformer fp8  4 months ago
アマデウス  530283dba0  fix object_to_tensor usage when torch>=2.3.0 (#5820)  4 months ago
Guangyao Zhang  2e28c793ce  [compatibility] support torch 2.2 (#5875)  4 months ago
Guangyao Zhang  1c961b20f3  [ShardFormer] fix qwen2 sp (#5903)  4 months ago
Stephan Kö  45c49dde96  [Auto Parallel]: Speed up intra-op plan generation by 44% (#5446)  4 months ago
pre-commit-ci[bot]  51f916b11d  [pre-commit.ci] auto fixes from pre-commit.com hooks  5 months ago
BurkeHulk  1f1b856354  Merge remote-tracking branch 'origin/feature/fp8_comm' into feature/fp8_comm  5 months ago
BurkeHulk  e88190184a  support fp8 communication in pipeline parallelism  5 months ago
BurkeHulk  1e1959467e  fix scaling algorithm in FP8 casting  5 months ago
Hongxin Liu  c068ef0fa0  [zero] support all-gather overlap (#5898)  5 months ago
GuangyaoZhang  dbfa7d39fc  fix typo  5 months ago
Guangyao Zhang  669849d74b  [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897)  5 months ago
Edenzzzz  fbf33ecd01  [Feature] Enable PP + SP for llama (#5868)  5 months ago
Runyu Lu  66abf1c6e8  [HotFix] CI,import,requirements-test for #5838 (#5892)  5 months ago
Runyu Lu  cba20525a8  [Feat] Diffusion Model(PixArtAlpha/StableDiffusion3) Support (#5838)  5 months ago
Edenzzzz  8ec24b6a4d  [Hoxfix] Fix CUDA_DEVICE_MAX_CONNECTIONS for comm overlap  5 months ago
Haze188  3420921101  [shardformer] DeepseekMoE support (#5871)  5 months ago
pre-commit-ci[bot]  e17f835df7  [pre-commit.ci] auto fixes from pre-commit.com hooks  5 months ago
Hanks  6991819a97  Merge branch 'hpcaitech:main' into feature/fp8_comm  5 months ago
Hongxin Liu  7afbc81d62  [quant] fix bitsandbytes version check (#5882)  5 months ago
Wang Binluo  6cd4c32be4  [shardformer] fix the moe (#5883)  5 months ago
Edenzzzz  eb24fcd914  [Hotfix] Fix OPT gradient checkpointing forward  5 months ago
Haze188  ea94c07b95  [hotfix] fix the bug that large tensor exceed the maximum capacity of TensorBucket (#5879)  5 months ago
pre-commit-ci[bot]  7c2f79fa98  [pre-commit.ci] pre-commit autoupdate (#5572)  5 months ago
Jianghai  8ab46b4000  [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874)  5 months ago
HangXu  f5a52e1600  fp8 operators for compressed communication  5 months ago
Haze188  416580b314  [MoE/ZeRO] Moe refactor with zero refactor (#5821)  5 months ago
flybird11111  773d9f964a  [shardformer]delete xformers (#5859)  5 months ago
Runyu Lu  3c7cda0c9a  [Inference]Lazy Init Support (#5785)  5 months ago
Guangyao Zhang  d9d5e7ea1f  [shardformer] Support the T5ForTokenClassification model (#5816)  5 months ago
Hongxin Liu  5dfbcd7746  [zero] use bucket during allgather (#5860)  5 months ago
botbw  8e718a1421  [gemini] fixes for benchmarking (#5847)  5 months ago
Edenzzzz  2a25a2aff7  [Feature] optimize PP overlap (#5735)  5 months ago
botbw  8a5c86439a  [gemini] fix missing return (#5845)  5 months ago
Yuanheng Zhao  7b249c76e5  [Fix] Fix spec-dec Glide LlamaModel for compatibility with transformers (#5837)  5 months ago
Kai Lv  0adca5b688  [launch] Support IPv4 host initialization in launch (#5822)  5 months ago
GuangyaoZhang  d84d68601a  change 'xxx if xxx else None' to 'xxx or None'  5 months ago
GuangyaoZhang  a83a2336e8  rebase master llama change  5 months ago
GuangyaoZhang  363cde6957  merge model and attention forward  5 months ago
GuangyaoZhang  7a2b08646f  Remove CohereLayerNorm and use existing layernorm  5 months ago
GuangyaoZhang  fe2e74c03a  fix precommit  5 months ago
GuangyaoZhang  f656d61778  change command  5 months ago
GuangyaoZhang  0b81163bc0  Copy llama to command  5 months ago
Edenzzzz  8795bb2e80  Support 4d parallel + flash attention (#5789)  5 months ago
flybird11111  2ddf624a86  [shardformer] upgrade transformers to 4.39.3 (#5815)  5 months ago
botbw  3bcbba9262  [gemini] quick fix on possible async operation (#5803)  6 months ago
Haze188  d9dddf574f  [Gemini] Use async stream to prefetch and h2d data moving (#5781)  6 months ago
Li Xingjian  8554585a5f  [Inference] Fix flash-attn import and add model test (#5794)  6 months ago
Hongxin Liu  aa125bcc91  [shardformer] fix modeling of bloom and falcon (#5796)  6 months ago
Runyu Lu  c0948aff97  [Inference]refactor baichuan (#5791)  6 months ago
char-1ee  f5981e808e  Remove flash attention backend  6 months ago
char-1ee  ceba662d22  Clean up  6 months ago
char-1ee  5f398fc000  Pass inference model shard configs for module init  6 months ago
char-1ee  eec77e5702  Fix tests and naming  6 months ago
char-1ee  04386d9eff  Refactor modeling by adding attention backend  6 months ago
Hongxin Liu  73e88a5553  [shardformer] fix import (#5788)  6 months ago
Hongxin Liu  b9d646fe9e  [misc] fix dist logger (#5782)  6 months ago
botbw  3f7e3131d9  [gemini] optimize reduce scatter d2h copy (#5760)  6 months ago
Edenzzzz  79f7a7b211  [misc] Accelerate CI for zero and dist optim (#5758)  6 months ago
flybird11111  50b4c8e8cf  [hotfix] fix llama flash attention forward (#5777)  6 months ago
yuehuayingxueluo  b45000f839  [Inference]Add Streaming LLM (#5745)  6 months ago
Yuanheng Zhao  406443200f  [Hotfix] Add missing init file in inference.executor (#5774)  6 months ago
duanjunwen  1b76564e16  [test] Fix/fix testcase (#5770)  6 months ago
flybird11111  3f2be80530  fix (#5765)  6 months ago
botbw  023ea13cb5  Merge pull request #5749 from hpcaitech/prefetch  6 months ago
hxwang  8547562884  [chore] remove unnecessary assert since compute list might not be recorded  6 months ago
hxwang  e5e3320948  [bug] continue fix  6 months ago
hxwang  936dd96dbb  [bug] workaround for idx fix  6 months ago
Edenzzzz  5f8c0a0ac3  [Feature] auto-cast optimizers to distributed version (#5746)  6 months ago
hxwang  ff507b755e  Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch  6 months ago
botbw  2fc85abf43  [gemini] async grad chunk reduce (all-reduce&reduce-scatter) (#5713)  6 months ago
Jianghai  85946d4236  [Inference]Fix readme and example for API server (#5742)  6 months ago
hxwang  15d21a077a  Merge remote-tracking branch 'origin/main' into prefetch  6 months ago
binmakeswell  4647ec28c8  [inference] release (#5747)  6 months ago
Yuanheng Zhao  df6747603f  [Colossal-Inference] (v0.1.0) Merge pull request #5739 from hpcaitech/feature/colossal-infer  6 months ago
Yuanheng Zhao  bd38fe6b91  [NFC] Fix code factors on inference triton kernels (#5743)  6 months ago
botbw  13c06d36a3  [bug] fix early return (#5740)  6 months ago
Haze188  22ce873c3f  [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)  6 months ago
pre-commit-ci[bot]  b3c0e6d871  [pre-commit.ci] auto fixes from pre-commit.com hooks  6 months ago
hxwang  137a7c341b  [chore] fix init error  6 months ago
Yuanheng Zhao  8633c15da9  [sync] Sync feature/colossal-infer with main  6 months ago
Yuanheng Zhao  d8b1ea4ac9  [doc] Update Inference Readme (#5736)  6 months ago
Yuanheng Zhao  bdf9a001d6  [Fix/Inference] Add unsupported auto-policy error message (#5730)  6 months ago
genghaozhe  90d8d0183c  remove personal comments  6 months ago
genghaozhe  bfcb2d1ff8  refactor the code structure to solve the circular import  6 months ago
genghaozhe  1ec92d29af  remove perf log, unrelated file and so on  6 months ago
genghaozhe  5c6c5d6be3  remove comments  6 months ago
genghaozhe  7416e4943b  fix conflicts to beautify the code  6 months ago
genghaozhe  d22bf30ca6  implement auto policy prefetch and modify a little origin code.  6 months ago
pre-commit-ci[bot]  f1918e18a5  [pre-commit.ci] auto fixes from pre-commit.com hooks  6 months ago
hxwang  a55a9e298b  [gemini] init auto policy prefetch  6 months ago
Yuanheng Zhao  283c407a19  [Inference] Fix Inference Generation Config and Sampling (#5710)  6 months ago
genghaozhe  06a3a100b3  remove unrelated code  6 months ago
genghaozhe  3d625ca836  add some todo Message  6 months ago