ColossalAI/examples/language/openmoe/model
Latest commit: 641b1ee71a — [devops] remove post commit ci (#5566) by Hongxin Liu, 2024-04-08 15:09:40 +08:00
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
__init__.py               [moe] merge moe into main (#4978)                                                                    2023-11-02 02:21:24 +00:00
convert_openmoe_ckpt.py   [devops] remove post commit ci (#5566)                                                               2024-04-08 15:09:40 +08:00
convert_openmoe_ckpt.sh   [moe] merge moe into main (#4978)                                                                    2023-11-02 02:21:24 +00:00
modeling_openmoe.py       [fix] fix typo s/muiti-node /multi-node etc. (#5448)                                                 2024-04-07 18:42:15 +08:00
openmoe_8b_config.json    [moe] merge moe into main (#4978)                                                                    2023-11-02 02:21:24 +00:00
openmoe_base_config.json  [moe] merge moe into main (#4978)                                                                    2023-11-02 02:21:24 +00:00
openmoe_policy.py         [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)  2024-04-01 11:34:58 +08:00