flybird11111
be400a0936
[chat] fix gemini strategy (#4698)
* [chat] fix gemini strategy
* [chat] fix gemini strategy
* [chat] fix gemini strategy
* [chat] fix gemini strategy
* [chat] fix gemini strategy
* fix
* [chat] fix gemini strategy
update llama2 example
[chat] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* [fix] fix gemini strategy
* fix
* fix
* fix
* fix
* fix
* Update train_prompts.py
2023-09-27 13:15:32 +08:00
Wenhao Chen
7b9b86441f
[chat]: update rm, add wandb and fix bugs (#4471)
* feat: modify forward fn of critic and reward model
* feat: modify calc_action_log_probs
* to: add wandb in sft and rm trainer
* feat: update train_sft
* feat: update train_rm
* style: modify type annotation and add warning
* feat: pass tokenizer to ppo trainer
* to: modify trainer base and maker base
* feat: add wandb in ppo trainer
* feat: pass tokenizer to generate
* test: update generate fn tests
* test: update train tests
* fix: remove action_mask
* feat: remove unused code
* fix: fix wrong ignore_index
* fix: fix mock tokenizer
* chore: update requirements
* revert: modify make_experience
* fix: fix inference
* fix: add padding side
* style: modify _on_learn_batch_end
* test: use mock tokenizer
* fix: use bf16 to avoid overflow
* fix: fix workflow
* [chat] fix gemini strategy
* [chat] fix
* sync: update colossalai strategy
* fix: fix args and model dtype
* fix: fix checkpoint test
* fix: fix requirements
* fix: fix missing import and wrong arg
* fix: temporarily skip gemini test in stage 3
* style: apply pre-commit
* fix: temporarily skip gemini test in stage 1&2
---------
Co-authored-by: Mingyan Jiang <1829166702@qq.com>
2023-09-20 15:53:58 +08:00
Hongxin Liu
079bf3cb26
[misc] update pre-commit and run all files (#4752)
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format
2023-09-19 14:20:26 +08:00
yingliu-hpc
1467e3b41b
[coati] add chatglm model (#4539)
* update configuration of chatglm and add support in coati
* add unit test & update chatglm default config & fix bos index issue
* remove chatglm due to oom
* add dataset pkg in requirement-text
* fix parameter issue in test_models
* add ref in tokenize & remove unnecessary parts
* separate source & target tokenization in chatglm
* add unit test to chatglm
* fix test dataset issue
* update truncation of chatglm
* fix ColossalAI version
* fix ColossalAI version in test
2023-08-29 17:58:51 +08:00
Fazzie-Maqianli
b0ce5a1032
[Coati] first commit (#3283)
2023-03-28 20:25:36 +08:00