mirror of https://github.com/hpcaitech/ColossalAI
16 lines | 381 B | Plaintext
diffusers
fbgemm-gpu==0.2.0
pytest
pytest-cov
torchvision
transformers
timm
titans
torchaudio
torchrec==0.2.0
contexttimer
einops
triton==2.0.0.dev20221202
git+https://github.com/HazyResearch/flash-attention.git@c422fee3776eb3ea24e011ef641fd5fbeb212623#egg=flash_attn
requests==2.27.1 # downgrade to avoid huggingface error https://github.com/huggingface/transformers/issues/17611
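
To install these pinned dependencies, one would typically save the listing locally and pass it to pip; a minimal sketch, assuming the file is saved as requirements.txt (that filename is an assumption, not taken from the mirror):

    pip install -r requirements.txt  # installs the exact versions pinned above, including the flash-attention git revision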