Making large AI models cheaper, faster and more accessible
fbgemm-gpu==0.2.0
pytest
pytest-cov
torchvision
transformers
timm
titans
torchaudio
torchrec==0.2.0
contexttimer
einops
triton==2.0.0.dev20221011
git+https://github.com/HazyResearch/flash-attention.git@c422fee3776eb3ea24e011ef641fd5fbeb212623#egg=flash_attn