Commit Graph

2907 Commits (f6731db67c84a5b8421e68143047e63eeea26656)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Tong Li | ed06731e00 | update Colossal (#4832) | 1 year ago |
| Xu Kai | c3bef20478 | add autotune (#4822) | 1 year ago |
| binmakeswell | 822051d888 | [doc] update slack link (#4823) | 1 year ago |
| Yuanchen | 1fa8c5e09f | Update Qwen-7B results (#4821) | 1 year ago |
| flybird11111 | be400a0936 | [chat] fix gemini strategy (#4698) | 1 year ago |
| Tong Li | bbbcac26e8 | fix format (#4815) | 1 year ago |
| github-actions[bot] | fb46d05cdf | [format] applied code formatting on changed files in pull request 4595 (#4602) | 1 year ago |
| littsk | 11f1e426fe | [hotfix] Correct several erroneous code comments (#4794) | 1 year ago |
| littsk | 54b3ad8924 | [hotfix] fix norm type error in zero optimizer (#4795) | 1 year ago |
| Hongxin Liu | da15fdb9ca | [doc] add lazy init docs (#4808) | 1 year ago |
| Yan haixu | a22706337a | [misc] add last_epoch in CosineAnnealingWarmupLR (#4778) | 1 year ago |
| Chandler-Bing | b6cf0aca55 | [hotfix] change llama2 Colossal-LLaMA-2 script filename (#4800) | 1 year ago |
| Desperado-Jia | 62b6af1025 | Merge pull request #4805 from TongLi3701/docs/fix | 1 year ago |
| Tong Li | 8cbce6184d | update | 1 year ago |
| Hongxin Liu | 4965c0dabd | [lazy] support from_pretrained (#4801) | 1 year ago |
| Tong Li | bd014673b0 | update readme | 1 year ago |
| Baizhou Zhang | 64a08b2dc3 | [checkpointio] support unsharded checkpointIO for hybrid parallel (#4774) | 1 year ago |
| Baizhou Zhang | a2db75546d | [doc] polish shardformer doc (#4779) | 1 year ago |
| flybird11111 | 26cd6d850c | [fix] fix weekly runing example (#4787) | 1 year ago |
| binmakeswell | d512a4d38d | [doc] add llama2 domain-specific solution news (#4789) | 1 year ago |
| Yuanchen | ce777853ae | [feature] ColossalEval: Evaluation Pipeline for LLMs (#4786) | 1 year ago |
| Tong Li | 74aa7d964a | initial commit: add colossal llama 2 (#4784) | 1 year ago |
| Hongxin Liu | 4146f1c0ce | [release] update version (#4775) | 1 year ago |
| Jianghai | ce7ade3882 | [inference] chatglm2 infer demo (#4724) | 1 year ago |
| Xu Kai | 946ab56c48 | [feature] add gptq for inference (#4754) | 1 year ago |
| littsk | 1e0e080837 | [bug] Fix the version check bug in colossalai run when generating the cmd. (#4713) | 1 year ago |
| Hongxin Liu | 3e05c07bb8 | [lazy] support torch 2.0 (#4763) | 1 year ago |
| Wenhao Chen | 901ab1eedd | [chat]: add lora merge weights config (#4766) | 1 year ago |
| Baizhou Zhang | 493a5efeab | [doc] add shardformer doc to sidebar (#4768) | 1 year ago |
| Hongxin Liu | 66f3926019 | [doc] clean up outdated docs (#4765) | 1 year ago |
| Baizhou Zhang | df66741f77 | [bug] fix get_default_parser in examples (#4764) | 1 year ago |
| Baizhou Zhang | c0a033700c | [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) | 1 year ago |
| Wenhao Chen | 7b9b86441f | [chat]: update rm, add wandb and fix bugs (#4471) | 1 year ago |
| ppt0011 | 07c2e3d09c | Merge pull request #4757 from ppt0011/main | 1 year ago |
| Pengtai Xu | 4d7537ba25 | [doc] put native colossalai plugins first in description section | 1 year ago |
| Pengtai Xu | e10d9f087e | [doc] add model examples for each plugin | 1 year ago |
| Pengtai Xu | a04337bfc3 | [doc] put individual plugin explanation in front | 1 year ago |
| Pengtai Xu | 10513f203c | [doc] explain suitable use case for each plugin | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| github-actions[bot] | 3c6b831c26 | [format] applied code formatting on changed files in pull request 4743 (#4750) | 1 year ago |
| Hongxin Liu | b5f9e37c70 | [legacy] clean up legacy code (#4743) | 1 year ago |
| Xuanlei Zhao | 32e7f99416 | [kernel] update triton init #4740 (#4740) | 1 year ago |
| Baizhou Zhang | d151dcab74 | [doc] explaination of loading large pretrained models (#4741) | 1 year ago |
| flybird11111 | 4c4482f3ad | [example] llama2 add fine-tune example (#4673) | 1 year ago |
| Xuanlei Zhao | ac2797996b | [shardformer] add custom policy in hybrid parallel plugin (#4718) | 1 year ago |
| Baizhou Zhang | 451c3465fb | [doc] polish shardformer doc (#4735) | 1 year ago |
| ppt0011 | 73eb3e8862 | Merge pull request #4738 from ppt0011/main | 1 year ago |
| Bin Jia | 608cffaed3 | [example] add gpt2 HybridParallelPlugin example (#4653) | 1 year ago |
| Bin Jia | 6a03c933a0 | [shardformer] update seq parallel document (#4730) | 1 year ago |
| Pengtai Xu | cd4e61d149 | [legacy] remove deterministic data loader test | 1 year ago |