mirror of https://github.com/hpcaitech/ColossalAI
Topics: ai, big-model, data-parallelism, deep-learning, distributed-computing, foundation-models, heterogeneous-training, hpc, inference, large-scale, model-parallelism, pipeline-parallelism
diffusers
fbgemm-gpu==0.2.0
pytest
pytest-cov
torchvision
transformers
timm
titans
torchaudio
torchrec==0.2.0
contexttimer
einops
triton==2.0.0.dev20221202
git+https://github.com/HazyResearch/flash-attention.git@c422fee3776eb3ea24e011ef641fd5fbeb212623#egg=flash_attn
requests==2.27.1 # downgrade to avoid huggingface error https://github.com/huggingface/transformers/issues/17611
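A minimal sketch of installing these pinned test dependencies with pip, assuming the list above is saved as requirements-test.txt (the filename here is an assumption, not confirmed by the listing):

    # hypothetical filename; adjust to wherever the file lives in the repo
    pip install -r requirements-test.txt

Note that the flash_attn entry is pinned to a specific commit and builds CUDA extensions from source, so a working PyTorch installation with a matching CUDA toolkit is generally required before running the install.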