ColossalAI/colossalai/shardformer/shard
Latest commit: 2b415e5999 by Hongxin Liu, 2025-02-11 16:10:25 +08:00

[shardformer] support ep for deepseek v3 (#6185)

* [feature] support ep for deepseek v3
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* fix test
* [shardformer] fix deepseek v3 init
* [lazy] fit lora for lazy init
* [example] support npu for deepseek v3

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
File | Last commit | Date
__init__.py | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) | 2024-04-01 11:34:58 +08:00
grad_ckpt_config.py | [shardformer] refactor pipeline grad ckpt config (#5646) | 2024-04-25 15:19:30 +08:00
shard_config.py | [shardformer] support ep for deepseek v3 (#6185) | 2025-02-11 16:10:25 +08:00
sharder.py | [nfc] fix typo colossalai/shardformer/ (#5133) | 2024-01-04 16:21:55 +08:00
shardformer.py | [FP8] rebase main (#5963) | 2024-08-06 16:29:37 +08:00
utils.py | [pipeline] update shardformer policy | 2023-08-15 23:25:14 +08:00
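For orientation, `shard_config.py` collects the parallelism options that `shardformer.py` and `sharder.py` consume when rewriting a model. Below is a minimal sketch of that flow, assuming the `ShardConfig`/`ShardFormer` names exported by recent ColossalAI releases; the launch helper, the exact config fields, the `optimize()` return value, and the toy Llama model are assumptions for illustration, not a definitive usage recipe.

```python
# Minimal sketch (assumed API, may differ between ColossalAI releases).
# Run under torchrun so the process group can be initialised.
import colossalai
import torch.distributed as dist
from transformers import LlamaConfig, LlamaForCausalLM

from colossalai.shardformer import ShardConfig, ShardFormer

# Assumed: current launch helper; older releases required a config dict.
colossalai.launch_from_torch()

# shard_config.py: gathers the parallelism switches and process groups.
# Field names here are assumed from recent releases.
shard_config = ShardConfig(
    tensor_parallel_process_group=dist.group.WORLD,
    enable_tensor_parallelism=True,
    enable_fused_normalization=False,
)

# A tiny Llama model so the sketch runs without downloading weights.
model = LlamaForCausalLM(
    LlamaConfig(num_hidden_layers=2, hidden_size=256,
                intermediate_size=512, num_attention_heads=4)
)

# shardformer.py drives sharder.py, which rewrites the model according to
# the policy matched to its architecture. optimize() is assumed to return
# (sharded_model, shared_params) as in recent releases.
shard_former = ShardFormer(shard_config=shard_config)
sharded_model, shared_params = shard_former.optimize(model)
```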