ColossalAI/colossalai
Name              Last commit (age)
_C/
_analyzer/
amp/              [npu] add npu support for gemini and zero (#5067)  (1 year ago)
auto_parallel/    [npu] add npu support for gemini and zero (#5067)  (1 year ago)
autochunk/
booster/          update  (11 months ago)
checkpoint_io/
cli/
cluster/
context/
device/           [npu] add npu support for hybrid plugin and llama (#5090)  (1 year ago)
fx/
inference/        [Hotfix] Fix model policy matching strategy in ShardFormer (#5064)  (1 year ago)
interface/
kernel/           fix thrust-transform-reduce error (#5078)  (1 year ago)
lazy/
legacy/           [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  (1 year ago)
logging/
moe/              update  (12 months ago)
nn/               [npu] add npu support for gemini and zero (#5067)  (1 year ago)
pipeline/         [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  (1 year ago)
shardformer/      [shardformer] llama support DistCrossEntropy (#5176)  (12 months ago)
tensor/           fix (#5158)  (12 months ago)
testing/          [npu] add npu support for hybrid plugin and llama (#5090)  (1 year ago)
utils/            [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  (1 year ago)
zero/             [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  (1 year ago)
__init__.py
initialize.py     [npu] add npu support for gemini and zero (#5067)  (1 year ago)
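For orientation, a few of these directories map directly onto the user-facing API: initialize.py provides the colossalai.launch* entry points, booster/ holds the Booster wrapper and its plugins, and zero/ and shardformer/ back the memory-optimization and model-sharding features. The following is a minimal, hedged sketch of how they fit together, assuming a 0.3.x-era release launched via torchrun; the tiny model and optimizer are placeholders, not part of the repository.

```python
# Illustrative sketch only: assumes colossalai ~0.3.x, CUDA available, and launch via
#   torchrun --nproc_per_node=1 train_sketch.py
import torch
import colossalai                                      # initialize.py: launch utilities
from colossalai.booster import Booster                 # booster/: unified training wrapper
from colossalai.booster.plugin import TorchDDPPlugin   # one of the bundled plugins

colossalai.launch_from_torch(config={})                # read rank/world size from torchrun env

model = torch.nn.Linear(16, 4).cuda()                  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

booster = Booster(plugin=TorchDDPPlugin())             # swap in e.g. GeminiPlugin for ZeRO/Gemini
model, optimizer, *_ = booster.boost(model, optimizer) # wrap model/optimizer for the chosen plugin

out = model(torch.randn(8, 16, device="cuda"))
loss = out.mean()
booster.backward(loss, optimizer)                      # booster-managed backward pass
optimizer.step()
```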