ColossalAI/examples/language/openmoe/requirements.txt
(mirror of https://github.com/hpcaitech/ColossalAI)
colossalai >= 0.3.3
torch >= 1.8.1
transformers >= 4.20.0, <= 4.34.0
sentencepiece
datasets
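
Below is a minimal sketch, not part of the repository, for checking an installed environment against the pins listed above. It assumes the packages were installed with pip install -r requirements.txt and a Python 3.8+ interpreter; it only prints the versions found so they can be compared manually against the constraints.

# Sketch: report installed versions of the packages pinned above.
# Assumes Python 3.8+; not part of the OpenMoE example itself.
from importlib.metadata import PackageNotFoundError, version

PINNED = {
    "colossalai": ">= 0.3.3",
    "torch": ">= 1.8.1",
    "transformers": ">= 4.20.0, <= 4.34.0",
    "sentencepiece": "",  # no version constraint in requirements.txt
    "datasets": "",       # no version constraint in requirements.txt
}

for package, constraint in PINNED.items():
    try:
        print(f"{package} {version(package)}   (required: {constraint or 'any'})")
    except PackageNotFoundError:
        print(f"{package} MISSING   (required: {constraint or 'any'})")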