mirror of https://github.com/hpcaitech/ColossalAI
7ee569b05f
Latest commit:
* fixed fused layernorm bug without apex
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* same for flash attn
* remove flash attn check

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Files:
__init__.py
grad_ckpt_config.py
shard_config.py
sharder.py
shardformer.py
utils.py