2001 Commits (7a2b08646f79d06797114b1987685ed1ad600c16)

Author SHA1 Message Date
GuangyaoZhang 7a2b08646f Remove CohereLayerNorm and use existing layernorm 5 months ago
GuangyaoZhang fe2e74c03a fix precommit 5 months ago
GuangyaoZhang f656d61778 change command 5 months ago
GuangyaoZhang 0b81163bc0 Copy llama to command 5 months ago
Edenzzzz 8795bb2e80 Support 4d parallel + flash attention (#5789) 5 months ago
flybird11111 2ddf624a86 [shardformer] upgrade transformers to 4.39.3 (#5815) 5 months ago
botbw 3bcbba9262 [gemini] quick fix on possible async operation (#5803) 5 months ago
Haze188 d9dddf574f [Gemini] Use async stream to prefetch and h2d data moving (#5781) 5 months ago
Li Xingjian 8554585a5f [Inference] Fix flash-attn import and add model test (#5794) 5 months ago
Hongxin Liu aa125bcc91 [shardformer] fix modeling of bloom and falcon (#5796) 5 months ago
Runyu Lu c0948aff97 [Inference]refactor baichuan (#5791) 6 months ago
char-1ee f5981e808e Remove flash attention backend 6 months ago
char-1ee ceba662d22 Clean up 6 months ago
char-1ee 5f398fc000 Pass inference model shard configs for module init 6 months ago
char-1ee eec77e5702 Fix tests and naming 6 months ago
char-1ee 04386d9eff Refactor modeling by adding attention backend 6 months ago
Hongxin Liu 73e88a5553 [shardformer] fix import (#5788) 6 months ago
Hongxin Liu b9d646fe9e [misc] fix dist logger (#5782) 6 months ago
botbw 3f7e3131d9 [gemini] optimize reduce scatter d2h copy (#5760) 6 months ago
Edenzzzz 79f7a7b211 [misc] Accelerate CI for zero and dist optim (#5758) 6 months ago
flybird11111 50b4c8e8cf [hotfix] fix llama flash attention forward (#5777) 6 months ago
yuehuayingxueluo b45000f839 [Inference]Add Streaming LLM (#5745) 6 months ago
Yuanheng Zhao 406443200f [Hotfix] Add missing init file in inference.executor (#5774) 6 months ago
duanjunwen 1b76564e16 [test] Fix/fix testcase (#5770) 6 months ago
flybird11111 3f2be80530 fix (#5765) 6 months ago
hxwang 8547562884 [chore] remove unnecessary assert since compute list might not be recorded 6 months ago
hxwang e5e3320948 [bug] continue fix 6 months ago
hxwang 936dd96dbb [bug] workaround for idx fix 6 months ago
Edenzzzz 5f8c0a0ac3 [Feature] auto-cast optimizers to distributed version (#5746) 6 months ago
botbw 2fc85abf43 [gemini] async grad chunk reduce (all-reduce&reduce-scatter) (#5713) 6 months ago
Jianghai 85946d4236 [Inference]Fix readme and example for API server (#5742) 6 months ago
binmakeswell 4647ec28c8 [inference] release (#5747) 6 months ago
Yuanheng Zhao bd38fe6b91 [NFC] Fix code factors on inference triton kernels (#5743) 6 months ago
botbw 13c06d36a3 [bug] fix early return (#5740) 6 months ago
Haze188 22ce873c3f [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702) 6 months ago
pre-commit-ci[bot] b3c0e6d871 [pre-commit.ci] auto fixes from pre-commit.com hooks 6 months ago
hxwang 137a7c341b [chore] fix init error 6 months ago
Yuanheng Zhao d8b1ea4ac9 [doc] Update Inference Readme (#5736) 6 months ago
Yuanheng Zhao bdf9a001d6 [Fix/Inference] Add unsupported auto-policy error message (#5730) 6 months ago
genghaozhe 90d8d0183c remove personal comments 6 months ago
genghaozhe bfcb2d1ff8 refactor the code structure to solve the circular import 6 months ago
genghaozhe 1ec92d29af remove perf log, unrelated file and so on 6 months ago
genghaozhe 5c6c5d6be3 remove comments 6 months ago
genghaozhe df63db7e63 remote comments 6 months ago
genghaozhe d22bf30ca6 implement auto policy prefetch and modify a little origin code. 6 months ago
pre-commit-ci[bot] f1918e18a5 [pre-commit.ci] auto fixes from pre-commit.com hooks 6 months ago
hxwang a55a9e298b [gemini] init auto policy prefetch 6 months ago
Yuanheng Zhao 283c407a19 [Inference] Fix Inference Generation Config and Sampling (#5710) 6 months ago
genghaozhe 06a3a100b3 remove unrelated code 6 months ago
genghaozhe 3d625ca836 add some todo Message 6 months ago