ColossalAI/tests/test_infer
Latest commit: 2ddf624a86 by flybird11111, "[shardformer] upgrade transformers to 4.39.3 (#5815)", 5 months ago
Name | Last commit | Age
test_async_engine/ | [Inference] Fix bugs and docs for feat/online-server (#5598) | 7 months ago
test_kernels/ | [shardformer] upgrade transformers to 4.39.3 (#5815) | 5 months ago
test_models/ | [Inference] Fix flash-attn import and add model test (#5794) | 6 months ago
__init__.py | [Fix] Fix Inference Example, Tests, and Requirements (#5688) | 7 months ago
_utils.py | [Inference] Add the logic of the inference engine (#5173) | 11 months ago
test_batch_bucket.py | [Fix/Inference] Fix format of input prompts and input model in inference engine (#5395) | 9 months ago
test_config_and_struct.py | [Fix] Fix Inference Example, Tests, and Requirements (#5688) | 7 months ago
test_continuous_batching.py | [inference] Fix running time of test_continuous_batching (#5750) | 6 months ago
test_cuda_graph.py | [Fix] Fix Inference Example, Tests, and Requirements (#5688) | 7 months ago
test_drafter.py | [Fix] Fix Inference Example, Tests, and Requirements (#5688) | 7 months ago
test_inference_engine.py | [Inference] Fix bugs and docs for feat/online-server (#5598) | 7 months ago
test_kvcache_manager.py | [Fix] Fix & Update Inference Tests (compatibility w/ main) | 7 months ago
test_request_handler.py | [Fix] Fix & Update Inference Tests (compatibility w/ main) | 7 months ago
test_rpc_engine.py | [release] update version (#5752) | 6 months ago
test_streamingllm.py | [Inference]Add Streaming LLM (#5745) | 6 months ago
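A minimal sketch of invoking one of the listed test files locally with pytest, assuming the repository root as the working directory and that pytest plus ColossalAI's test dependencies (and a CUDA device for the inference tests) are available; the listing itself does not prescribe any particular invocation, and test_inference_engine.py is used here only as an example path from the table above.

import sys

import pytest

if __name__ == "__main__":
    # Run a single test module from this directory in verbose mode;
    # pytest.main returns an exit code suitable for passing to sys.exit.
    exit_code = pytest.main(["-v", "tests/test_infer/test_inference_engine.py"])
    sys.exit(exit_code)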