ColossalAI/colossalai
Latest commit: 63ee6fffe6 by ver217, "Merge branch 'main' into exp/mixtral", 11 months ago
| Name | Last commit | Age |
| --- | --- | --- |
| _C | | |
| _analyzer | | |
| amp | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| auto_parallel | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| autochunk | | |
| booster | Merge branch 'main' into exp/mixtral | 11 months ago |
| checkpoint_io | [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) | 1 year ago |
| cli | | |
| cluster | fix-test (#5210) | 11 months ago |
| context | [moe] merge moe into main (#4978) | 1 year ago |
| device | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| fx | | |
| inference | [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) | 1 year ago |
| interface | | |
| kernel | fix thrust-transform-reduce error (#5078) | 1 year ago |
| lazy | [doc] add lazy init docs (#4808) | 1 year ago |
| legacy | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| logging | | |
| moe | update | 11 months ago |
| nn | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| pipeline | [pipeline] A more general _communicate in p2p (#5062) | 11 months ago |
| shardformer | [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) | 11 months ago |
| tensor | fix (#5158) | 12 months ago |
| testing | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| utils | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| zero | update optim | 11 months ago |
| __init__.py | | |
| initialize.py | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
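For orientation, below is a minimal sketch of how some of the subpackages listed above (`booster`, `booster.plugin`, `nn.optimizer`, and the Gemini/ZeRO machinery under `zero`) are typically combined. It assumes a `torchrun` launch, available CUDA devices, and the 0.3.x-era Booster API that matches this snapshot; the toy model and tensor sizes are illustrative only, not taken from the repository.

```python
# Minimal sketch, assuming torchrun + CUDA and the 0.3.x-era ColossalAI API.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam

colossalai.launch_from_torch(config={})  # read rank/world size from torchrun env vars

# Toy model and optimizer; HybridAdam comes from the `nn` subpackage above.
model = torch.nn.Linear(1024, 1024).cuda()
optimizer = HybridAdam(model.parameters(), lr=1e-3)

# GeminiPlugin applies the heterogeneous ZeRO memory management from `zero`.
booster = Booster(plugin=GeminiPlugin())
model, optimizer, *_ = booster.boost(model, optimizer)

x = torch.randn(8, 1024).cuda()
loss = model(x).sum()
booster.backward(loss, optimizer)  # plugin-aware replacement for loss.backward()
optimizer.step()
```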