ColossalAI/applications/Chat/coati
Latest commit: edd75a59ea by Wenhao Chen, [chat] remove naive strategy and split colossalai strategy (#4094), 1 year ago
Name              Last commit                                                          Age
dataset           [chat] refactor actor class (#3968)                                  1 year ago
experience_maker  [chat] refactor actor class (#3968)                                  1 year ago
kernels           [CI] fix some spelling errors (#3707)                                2 years ago
models            [chat] refactor actor class (#3968)                                  1 year ago
quant             [chat] add distributed PPO trainer (#3740)                           1 year ago
ray               [chat] remove naive strategy and split colossalai strategy (#4094)   1 year ago
replay_buffer     [chat] polish code note typo (#3612)                                 2 years ago
trainer           [chat] remove naive strategy and split colossalai strategy (#4094)   1 year ago
utils             [chat] remove lm model class (#3653)                                 2 years ago
__init__.py       [Coati] first commit (#3283)                                         2 years ago
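For orientation, a minimal sketch of how these directories map to importable subpackages of coati. Only the subpackage names come from the listing above; the per-module descriptions in the comments are inferred from the directory names and commit messages, not from documentation.

    # Sketch (assumption): each directory above is an importable subpackage of coati.
    import coati.dataset           # dataset utilities for chat / RLHF training
    import coati.experience_maker  # experience (rollout) generation, per the directory name
    import coati.models            # actor / critic / reward model definitions
    import coati.replay_buffer     # buffer for storing generated experience
    import coati.trainer           # training loops, e.g. the distributed PPO trainer noted above
    import coati.utils             # shared helpers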