Directory listing: ColossalAI/colossalai/shardformer/shard
Latest commit: 7ee569b05f by Edenzzzz, 2024-04-24 23:04:06 +08:00

[hotfix] Fixed fused layernorm bug without apex (#5609)

* Fixed the fused layernorm bug that occurred when apex is not installed
* [pre-commit.ci] auto fixes from pre-commit.com hooks (see https://pre-commit.ci)
* Applied the same fix for flash attn
* Removed the flash attn check

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
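Judging from the commit message, the bug was that the fused layernorm path (and, per the follow-up commits, the flash-attention path) was selected even when apex is not installed. Below is a minimal sketch of the import-guard fallback pattern such a fix typically uses; it is an illustration under that assumption, not the actual ColossalAI implementation, and `build_layernorm` / `HAS_APEX` are hypothetical names:

```python
import torch.nn as nn

# Probe for apex once at import time instead of assuming it is installed.
try:
    from apex.normalization import FusedLayerNorm

    HAS_APEX = True
except ImportError:
    HAS_APEX = False


def build_layernorm(hidden_size: int, eps: float = 1e-5) -> nn.Module:
    """Return apex's fused kernel when available, else plain torch.nn.LayerNorm."""
    if HAS_APEX:
        return FusedLayerNorm(hidden_size, eps=eps)
    # Fallback path: numerically equivalent, just without the fused CUDA kernel.
    return nn.LayerNorm(hidden_size, eps=eps)
```

The same guard would apply to an optional flash-attention kernel: probe the import once and fall back to the standard PyTorch implementation instead of raising when the extension is missing.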
__init__.py          [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)  2024-04-01 11:34:58 +08:00
grad_ckpt_config.py  [shardformer] fix pipeline grad ckpt (#5620)                                                              2024-04-22 11:25:39 +08:00
shard_config.py      [hotfix] Fixed fused layernorm bug without apex (#5609)                                                   2024-04-24 23:04:06 +08:00
sharder.py
shardformer.py
utils.py