github/ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI)
ColossalAI / requirements / requirements-infer.txt (3 lines, 33 B, at commit f8598e3ec5)
[Inference] Add the logic of the inference engine (#5173)
* add infer_struct and infer_config
* change InferConfig
* add hf_model_config to the engine; remove _get_hf_model_config
* make adjustments according to reviewer feedback
* add CI tests for config and struct
* add the logic of the inference engine; update engine and test
* recover cache_manager.py; add logger
* update model and tokenizer; fix the shardformer integration logic
* change kvcache_manager docstring; add policy
* fix CI bug in test_kvcache_manager.py
* remove code related to the tokenizer and move model_policy
* fix code style; delete extra empty lines
* add ordered_set to requirements-infer.txt and requirements-test.txt
11 months ago
ordered_set
[Fix/Infer] Remove unused deps and revise requirements (#5341)
* remove flash-attn dep
* rm padding llama
* revise infer requirements
* move requirements out of module
10 months ago
transformers==4.36.2
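The pinned `transformers==4.36.2` line uses pip's exact-version (`==`) specifier, while `ordered_set` is left unpinned. A minimal sketch of how such lines split into a package name and version follows; `parse_requirement` is a hypothetical stdlib-only helper for illustration, not part of ColossalAI or pip:

```python
def parse_requirement(line: str):
    """Split a requirements.txt line into (name, version).

    Handles only the exact-pin '==' form used in this file;
    unpinned lines return (name, None).
    """
    line = line.strip()
    if "==" in line:
        name, version = line.split("==", 1)
        return name.strip(), version.strip()
    return line, None

print(parse_requirement("transformers==4.36.2"))  # ('transformers', '4.36.2')
print(parse_requirement("ordered_set"))           # ('ordered_set', None)
```

Real resolvers such as pip accept the full PEP 508 grammar (extras, markers, version ranges); this sketch covers only the two forms that appear in requirements-infer.txt.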