# ColossalAI/requirements/requirements.txt
numpy
tqdm
psutil
packaging
pre-commit
rich
click
fabric
contexttimer
ninja
torch>=2.1.0,<2.3.0
safetensors
einops
pydantic
ray
sentencepiece
google
protobuf
transformers>=4.36.2,<4.40.0
peft>=0.7.1
bitsandbytes>=0.39.0
rpyc==6.0.0
fastapi
uvicorn==0.29.0
galore_torch
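
# The version specifiers above (e.g. torch>=2.1.0,<2.3.0) follow PEP 508.
# As a minimal sketch of how pip interprets such a pin, using the `packaging`
# library that is itself listed in this file:
#
#     from packaging.requirements import Requirement
#
#     req = Requirement("torch>=2.1.0,<2.3.0")
#     print(req.name)                          # torch
#     print(req.specifier.contains("2.2.0"))   # True  (inside the range)
#     print(req.specifier.contains("2.3.0"))   # False (upper bound excluded)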