3017 Commits (16c96d4d8cbe26b5ee32a35fd5ee809e035c9e96)

Author SHA1 Message Date
digger yu 16c96d4d8c [hotfix] fix typo change _descrption to _description (#5331) 9 months ago
digger yu 70cce5cbed [doc] update some translations with README-zh-Hans.md (#5382) 9 months ago
Luo Yihang e239cf9060 [hotfix] fix typo of openmoe model source (#5403) 9 months ago
MickeyCHAN e304e4db35 [hotfix] fix sd vit import error (#5420) 9 months ago
Hongxin Liu 070df689e6 [devops] fix extention building (#5427) 9 months ago
binmakeswell 822241a99c [doc] sora release (#5425) 9 months ago
flybird11111 29695cf70c [example]add gpt2 benchmark example script. (#5295) 9 months ago
Camille Zhong 4b8312c08e fix sft single turn inference example (#5416) 9 months ago
binmakeswell a1c6cdb189 [doc] fix blog link 9 months ago
binmakeswell 5de940de32 [doc] fix blog link 9 months ago
Frank Lee 2461f37886 [workflow] added pypi channel (#5412) 9 months ago
Tong Li a28c971516 update requirements (#5407) 9 months ago
flybird11111 0a25e16e46 [shardformer]gather llama logits (#5398) 9 months ago
Frank Lee dcdd8a5ef7 [setup] fixed nightly release (#5388) 9 months ago
QinLuo bf34c6fef6 [fsdp] impl save/load shard model/optimizer (#5357) 9 months ago
Hongxin Liu d882d18c65 [example] reuse flash attn patch (#5400) 9 months ago
Hongxin Liu 95c21e3950 [extension] hotfix jit extension setup (#5402) 9 months ago
Stephan Kölker 5d380a1a21 [hotfix] Fix wrong import in meta_registry (#5392) 9 months ago
CZYCW b833153fd5 [hotfix] fix variable type for top_p (#5313) 9 months ago
Frank Lee 705a62a565 [doc] updated installation command (#5389) 9 months ago
yixiaoer 69e3ad01ed [doc] Fix typo (#5361) 9 months ago
Hongxin Liu 7303801854 [llama] fix training and inference scripts (#5384) 9 months ago
Hongxin Liu adae123df3 [release] update version (#5380) 10 months ago
Frank Lee efef43b53c Merge pull request #5372 from hpcaitech/exp/mixtral 10 months ago
Frank Lee 4c03347fc7 Merge pull request #5377 from hpcaitech/example/llama-npu 10 months ago
ver217 06db94fbc9 [moe] fix tests 10 months ago
Hongxin Liu 65e5d6baa5 [moe] fix mixtral optim checkpoint (#5344) 10 months ago
Hongxin Liu 956b561b54 [moe] fix mixtral forward default value (#5329) 10 months ago
Hongxin Liu b60be18dcc [moe] fix mixtral checkpoint io (#5314) 10 months ago
Hongxin Liu da39d21b71 [moe] support mixtral (#5309) 10 months ago
Hongxin Liu c904d2ae99 [moe] update capacity computing (#5253) 10 months ago
Xuanlei Zhao 7d8e0338a4 [moe] init mixtral impl 10 months ago
Hongxin Liu 084c91246c [llama] fix memory issue (#5371) 10 months ago
Hongxin Liu c53ddda88f [lr-scheduler] fix load state dict and add test (#5369) 10 months ago
Hongxin Liu eb4f2d90f9 [llama] polish training script and fix optim ckpt (#5368) 10 months ago
Camille Zhong a5756a8720 [eval] update llama npu eval (#5366) 10 months ago
Camille Zhong 44ca61a22b [llama] fix neftune & pbar with start_step (#5364) 10 months ago
Hongxin Liu a4cec1715b [llama] add flash attn patch for npu (#5362) 10 months ago
Hongxin Liu 73f9f23fc6 [llama] update training script (#5360) 10 months ago
Hongxin Liu 6c0fa7b9a8 [llama] fix dataloader for hybrid parallel (#5358) 10 months ago
Hongxin Liu 2dd01e3a14 [gemini] fix param op hook when output is tuple (#5355) 10 months ago
Wenhao Chen 1c790c0877 [fix] remove unnecessary dp_size assert (#5351) 10 months ago
Hongxin Liu ffffc32dc7 [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347) 10 months ago
YeAnbang c5239840e6 [Chat] fix sft loss nan (#5345) 10 months ago
Frank Lee abd8e77ad8 [extension] fixed exception catch (#5342) 10 months ago
digger yu 71321a07cf fix typo change dosen't to doesn't (#5308) 10 months ago
digger yu 6a3086a505 fix typo under extensions/ (#5330) 10 months ago
Frank Lee febed23288 [doc] added docs for extensions (#5324) 10 months ago
flybird11111 388179f966 [tests] fix t5 test. (#5322) 10 months ago
Frank Lee a6709afe66 Merge pull request #5321 from FrankLeeeee/hotfix/accelerator-api 10 months ago