ColossalAI/colossalai/inference

readme.md

# Colossal-Infer

## Introduction

Colossal-Infer is a library for the inference of LLMs and MLMs, built on top of Colossal-AI.

## Structures

### Overview

The main design will be released later on.

Roadmap

- [ ] Design of structures
- [ ] Core components
  - [ ] Engine
  - [ ] Request handler
  - [ ] KV cache manager
  - [ ] Modeling
  - [ ] Custom layers
  - [ ] Online server
- [ ] Supported models
  - [ ] Llama 2
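The roadmap above names an engine, a request handler, and a KV cache manager as core components. As a rough sketch of how such components typically fit together in an inference server (all class and method names below are hypothetical illustrations, not the actual Colossal-Infer API):

```python
# Hypothetical sketch of an inference-serving loop: a request handler
# batches incoming sequences, a KV cache manager reserves cache blocks,
# and an engine drives decoding steps. Names are illustrative only.

class KVCacheManager:
    """Tracks key/value cache blocks allocated to each active sequence."""

    def __init__(self, num_blocks: int):
        self.free_blocks = list(range(num_blocks))
        self.allocated: dict[int, list[int]] = {}

    def allocate(self, seq_id: int) -> int:
        # Hand one free block to the given sequence.
        block = self.free_blocks.pop()
        self.allocated.setdefault(seq_id, []).append(block)
        return block

    def release(self, seq_id: int) -> None:
        # Return all of a finished sequence's blocks to the free pool.
        self.free_blocks.extend(self.allocated.pop(seq_id, []))


class RequestHandler:
    """Queues incoming requests and schedules them in batches."""

    def __init__(self):
        self.waiting: list[int] = []

    def add(self, seq_id: int) -> None:
        self.waiting.append(seq_id)

    def schedule(self, max_batch: int) -> list[int]:
        batch, self.waiting = self.waiting[:max_batch], self.waiting[max_batch:]
        return batch


class InferenceEngine:
    """Runs one decoding step over a scheduled batch of sequences."""

    def __init__(self, cache: KVCacheManager, handler: RequestHandler):
        self.cache = cache
        self.handler = handler

    def step(self, max_batch: int) -> list[int]:
        batch = self.handler.schedule(max_batch)
        for seq_id in batch:
            # Reserve cache space before the model forward pass would run.
            self.cache.allocate(seq_id)
        return batch
```

The design choice this illustrates is the separation of scheduling (request handler) from memory management (KV cache manager), so the engine's decoding loop stays independent of both batching policy and cache layout.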