github / ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI)
ColossalAI / requirements / requirements.txt
Commit ce7b2c9ae3 (9 lines, 79 B, Plaintext)
Migrated project (2021-10-28 16:21:23 +00:00)
torch>=1.8
torchvision>=0.9
numpy
tqdm
psutil

Support TP-compatible Torch AMP and Update trainer API (#27) (2021-11-18 11:45:06 +00:00)
* Add gradient accumulation, fix lr scheduler
* Fix FP16 optimizer and adapt torch amp to tensor parallel (#18):
  fixed bugs in the compatibility between torch amp and tensor parallel, plus some minor fixes
* Fixed trainer
* Revert "fixed trainer" (reverts commit 2e0b0b76990e8d4e337add483d878c0f61cf5097)
* Improve consistency between trainer, engine and schedule (#23)
Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
tensorboard

Update GitHub action and pre-commit settings (#196) (2022-01-28 08:59:53 +00:00)
* Update GitHub action and pre-commit settings
* Update GitHub action and pre-commit settings (#198)
packaging
pre-commit
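
The two version floors above (torch>=1.8, torchvision>=0.9) are the only constraints in the file; every other dependency installs at its latest compatible release. After running "pip install -r requirements/requirements.txt", the sketch below verifies those floors at runtime using the packaging library that the file itself pulls in. It is a hypothetical helper, not part of the repository; the script name and the CONSTRAINTS mapping are assumptions for illustration.

# check_requirements.py -- hypothetical helper, not part of the ColossalAI repo.
# Verifies the two version-constrained entries from requirements.txt
# (torch>=1.8, torchvision>=0.9) against what is actually installed.
from importlib.metadata import PackageNotFoundError, version

from packaging.version import Version

# Minimum versions copied from requirements/requirements.txt.
CONSTRAINTS = {"torch": "1.8", "torchvision": "0.9"}


def check() -> bool:
    ok = True
    for name, minimum in CONSTRAINTS.items():
        try:
            installed = Version(version(name))
        except PackageNotFoundError:
            print(f"{name}: not installed")
            ok = False
            continue
        if installed < Version(minimum):
            print(f"{name}: {installed} is older than required {minimum}")
            ok = False
        else:
            print(f"{name}: {installed} satisfies >={minimum}")
    return ok


if __name__ == "__main__":
    raise SystemExit(0 if check() else 1)

Run it with "python check_requirements.py"; a non-zero exit code means at least one version floor is unmet.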