ColossalAI/colossalai

Latest commit: 75af66cd81 by Zhongkai Zhao, "[Hotfix] Fix model policy matching strategy in ShardFormer (#5064)", 1 year ago.
| Name | Last commit | Updated |
| --- | --- | --- |
| _C | | |
| _analyzer | | |
| amp | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| auto_parallel | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| autochunk | | |
| booster | [format] applied code formatting on changed files in pull request 5067 (#5072) | 1 year ago |
| checkpoint_io | [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) | 1 year ago |
| cli | | |
| cluster | [gemini] gemini support tensor parallelism. (#4942) | 1 year ago |
| context | | |
| device | | |
| fx | | |
| inference | [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) | 1 year ago |
| interface | | |
| kernel | fix thrust-transform-reduce error (#5078) | 1 year ago |
| lazy | | |
| legacy | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| logging | | |
| moe | [hotfix]: modify create_ep_hierarchical_group and add test (#5032) | 1 year ago |
| nn | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| pipeline | [inference] refactor examples and fix schedule (#5077) | 1 year ago |
| shardformer | [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) | 1 year ago |
| tensor | [hotfix]: modify create_ep_hierarchical_group and add test (#5032) | 1 year ago |
| testing | | |
| utils | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| zero | [gemini]fix gemini optimzer, saving Shardformer in Gemini got list assignment index out of range (#5085) | 1 year ago |
| __init__.py | | |
| initialize.py | [npu] add npu support for gemini and zero (#5067) | 1 year ago |