Commit Graph

2282 Commits (179558a87ab37096450223d8ee4c2b1a06a334a4)

Author SHA1 Message Date
Hongxin Liu 179558a87a
[devops] fix chat ci (#3628)
2 years ago
digger-yu d7bf284706
[chat] polish code note typo (#3612)
2 years ago
Yuanchen c4709d34cf
Chat evaluate (#3608)
2 years ago
digger-yu 633bac2f58
[doc] .github/workflows/README.md (#3605)
2 years ago
digger-yu becd3b0f54
[doc] fix setup.py typo (#3603)
2 years ago
digger-yu 7570d9ae3d
[doc] fix op_builder/README.md (#3597)
2 years ago
Hongxin Liu 12eff9eb4c
[gemini] state dict supports fp16 (#3590)
2 years ago
github-actions[bot] d544ed4345
[bot] Automated submodule synchronization (#3596)
2 years ago
digger-yu d96567bb5d
[misc] op_builder/builder.py (#3593)
2 years ago
binmakeswell 5a79cffdfd
[coati] fix install cmd (#3592)
2 years ago
Yuanchen 1ec0d386a9
reconstruct chat trainer and fix training script (#3588)
2 years ago
Hongxin Liu dac127d0ee
[fx] fix meta tensor registration (#3589)
2 years ago
Camille Zhong 36a519b49f
Update test_ci.sh
2 years ago
digger-yu d0fbd4b86f
[example] fix community doc (#3586)
2 years ago
Hongxin Liu f313babd11
[gemini] support save state dict in shards (#3581)
2 years ago
tingfeng cao 7788e0b0a5
fix: fix sft (#3568)
2 years ago
digger-yu 6e7e43c6fe
[doc] Update .github/workflows/README.md (#3577)
2 years ago
Fazzie-Maqianli 6b1a39b17b
[coati] add costom model suppor tguide (#3579)
2 years ago
binmakeswell cc1eec2f53
[chat] update reward model sh (#3578)
2 years ago
csric e355144375
[chatgpt] Detached PPO Training (#3195)
2 years ago
YH d329c294ec
Add docstr for zero3 chunk search utils (#3572)
2 years ago
digger-yu 9edeadfb24
[doc] Update 1D_tensor_parallel.md (#3573)
2 years ago
Hongxin Liu 173dad0562
[misc] add verbose arg for zero and op builder (#3552)
2 years ago
Hongxin Liu 4341f5e8e6
[lazyinit] fix clone and deepcopy (#3553)
2 years ago
digger-yu 1c7734bc94
[doc] Update 1D_tensor_parallel.md (#3563)
2 years ago
binmakeswell f1b3d60cae
[example] reorganize for community examples (#3557)
2 years ago
MisterLin1995 1a809eddaa
[chat] ChatGPT train prompts on ray example (#3309)
2 years ago
binmakeswell 535b896435
[chat] polish tutorial doc (#3551)
2 years ago
digger-yu 77efdfe1dd
[doc] Update README.md (#3549)
2 years ago
digger-yu 3f760da9f0
Update README.md (#3548)
2 years ago
digger-yu a3ac48ef3d
[doc] Update README-zh-Hans.md (#3541)
2 years ago
natalie_cao de84c0311a
Polish Code
2 years ago
Hongxin Liu 152239bbfa
[gemini] gemini supports lazy init (#3379)
2 years ago
jiangmingyan 366a035552
[checkpoint] Shard saved checkpoint need to be compatible with the naming format of hf checkpoint files (#3479)
2 years ago
Yuanchen 7182ac2a04
[chat]add examples of training with limited resources in chat readme (#3536)
2 years ago
zhang-yi-chi e6a132a449
[chat]: add vf_coef argument for PPOTrainer (#3318)
2 years ago
ver217 89fd10a1c9
[chat] add zero2 cpu strategy for sft training (#3520)
2 years ago
binmakeswell 990d4c3e4e
[doc] hide diffusion in application path (#3519)
2 years ago
binmakeswell 0c0455700f
[doc] add requirement and highlight application (#3516)
2 years ago
NatalieC323 635d0a1baf
[Chat Community] Update README.md (fixed#3487) (#3506)
2 years ago
YH bcf0cbcbe7
[doc] Add docs for clip args in zero optim (#3504)
2 years ago
gongenlei a7ca297281
[coati] Fix LlamaCritic (#3475)
2 years ago
mandoxzhang 8f2c55f9c9
[example] remove redundant texts & update roberta (#3493)
2 years ago
mandoxzhang ab5fd127e3
[example] update roberta with newer ColossalAI (#3472)
2 years ago
NatalieC323 fb8fae6f29
Revert "[dreambooth] fixing the incompatibity in requirements.txt (#3190) (#3378)" (#3481)
2 years ago
binmakeswell 891b8e7fac
[chat] fix stage3 PPO sample sh command (#3477)
2 years ago
NatalieC323 c701b77b11
[dreambooth] fixing the incompatibity in requirements.txt (#3190) (#3378)
2 years ago
Frank Lee 4e9989344d
[doc] updated contributor list (#3474)
2 years ago
jiangmingyan 52a933e175
[checkpoint] support huggingface style sharded checkpoint (#3461)
2 years ago
Fazzie-Maqianli 6afeb1202a
add community example dictionary (#3465)
2 years ago