# Colossal-Infer

## Introduction

Colossal-Infer is a library for the inference of LLMs and MLMs. It is built on top of Colossal-AI.
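
For a sense of the intended workflow, here is a minimal usage sketch. It assumes an `InferenceEngine` and `InferenceConfig` matching the files in this directory (`core/`, `config.py`), but the import paths, config fields, and method signatures are all assumptions about an API that is still being rebuilt, not documentation of it.

```python
# Minimal usage sketch. The import paths, config fields, and method
# signatures below are assumptions about the in-progress API.
from transformers import LlamaForCausalLM, LlamaTokenizer

from colossalai.inference.config import InferenceConfig        # assumed path
from colossalai.inference.core.engine import InferenceEngine   # assumed path

model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
tokenizer = LlamaTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# Assumed config fields controlling batching and sequence lengths.
inference_config = InferenceConfig(
    max_batch_size=8,
    max_input_len=256,
    max_output_len=128,
)

engine = InferenceEngine(model, tokenizer, inference_config)
outputs = engine.generate(prompts=["Introduce Colossal-AI in one sentence."])
print(outputs)
```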

## Structures

### Overview

The main design will be released later on.

## Roadmap

- [ ] Design of structures
- [ ] Core components (see the sketch after this list)
  - [ ] engine
  - [ ] request handler
  - [ ] kv cache manager
  - [ ] modeling
  - [ ] custom layers
  - [ ] online server
- [ ] Supported models
  - [ ] llama2
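
The roadmap lists the core components but not how they interact. The sketch below is one plausible reading of that architecture, with every class and method name invented for illustration: the engine owns a request handler, which batches waiting requests and reserves KV cache space through a cache manager before each model step.

```python
# Illustration only: all names here are hypothetical, not this repository's
# API. One plausible wiring of the roadmap's engine / request handler /
# kv cache manager components.
from dataclasses import dataclass, field


@dataclass
class Request:
    request_id: int
    input_ids: list[int]
    output_ids: list[int] = field(default_factory=list)


class KVCacheManager:
    """Reserves and releases attention key/value cache space per request."""

    def allocate(self, req: Request) -> None:
        pass  # e.g. hand out cache blocks sized to the sequence length

    def free(self, req: Request) -> None:
        pass  # return the request's blocks to the pool


class RequestHandler:
    """Queues incoming requests and picks a batch for each engine step."""

    def __init__(self, cache: KVCacheManager, max_batch_size: int = 8):
        self.cache = cache
        self.max_batch_size = max_batch_size
        self.waiting: list[Request] = []

    def add(self, req: Request) -> None:
        self.waiting.append(req)

    def schedule(self) -> list[Request]:
        # Take up to max_batch_size requests and give them cache space.
        batch = self.waiting[: self.max_batch_size]
        self.waiting = self.waiting[self.max_batch_size :]
        for req in batch:
            self.cache.allocate(req)
        return batch


class Engine:
    """Top-level loop: schedule a batch, run the model, append new tokens."""

    def __init__(self, model):
        self.model = model  # callable: token ids -> next token id
        self.handler = RequestHandler(KVCacheManager())

    def step(self) -> None:
        for req in self.handler.schedule():
            next_token = self.model(req.input_ids + req.output_ids)
            req.output_ids.append(next_token)
```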