ColossalAI/colossalai
Latest commit 7ee569b05f by Edenzzzz: [hotfix] Fixed fused layernorm bug without apex (#5609)
* fixed fused layernorm bug without apex
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* same for flash attn
* remove flash attn check

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Committed 2024-04-24 23:04:06 +08:00
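The commit title points at ColossalAI's fused layer norm path, which uses NVIDIA apex's fused kernel when apex is installed and must still work when it is not. Below is a minimal sketch of the guarded-import fallback such a fix implies; the `build_layernorm` helper and its layout are assumptions for illustration, not the actual patch in #5609.

```python
# Illustrative sketch only: the helper name and structure are assumed,
# not the real ColossalAI fix for #5609.
import torch.nn as nn

try:
    # apex ships a fused CUDA layer norm kernel when it is installed
    from apex.normalization import FusedLayerNorm as ApexFusedLayerNorm

    HAS_APEX = True
except ImportError:
    HAS_APEX = False


def build_layernorm(hidden_size: int, eps: float = 1e-5) -> nn.Module:
    """Prefer apex's fused layer norm, but stay usable without apex."""
    if HAS_APEX:
        return ApexFusedLayerNorm(hidden_size, eps=eps)
    # Fallback: plain PyTorch layer norm keeps the model buildable when apex is absent.
    return nn.LayerNorm(hidden_size, eps=eps)
```

The "same for flash attn" and "remove flash attn check" bullets suggest the same kind of optional-dependency guard applies to the flash-attention path.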
Name    Last commit    Date
_C [setup] support pre-build and jit-build of cuda kernels (#2374) 2023-01-06 20:50:26 +08:00
_analyzer [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 2024-04-18 18:15:50 +08:00
accelerator [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
amp [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
auto_parallel [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 2024-04-18 18:15:50 +08:00
autochunk [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 2024-04-18 18:15:50 +08:00
booster [exampe] update llama example (#5626) 2024-04-23 14:12:20 +08:00
checkpoint_io [shardformer] refactor embedding resize (#5603) 2024-04-18 16:10:18 +08:00
cli [devops] fix extention building (#5427) 2024-03-05 15:35:54 +08:00
cluster [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
context [moe] merge moe into main (#4978) 2023-11-02 02:21:24 +00:00
device [npu] add npu support for hybrid plugin and llama (#5090) 2023-11-22 19:23:21 +08:00
fx [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 2024-04-18 18:15:50 +08:00
inference [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
interface [lazy] support from_pretrained (#4801) 2023-09-26 11:04:11 +08:00
kernel [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
lazy [doc] add lazy init docs (#4808) 2023-09-27 10:24:04 +08:00
legacy [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
logging [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
moe [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
nn [hotfix] quick fixes to make legacy tutorials runnable (#5559) 2024-04-07 12:06:27 +08:00
pipeline [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) 2024-04-01 11:34:58 +08:00
shardformer [hotfix] Fixed fused layernorm bug without apex (#5609) 2024-04-24 23:04:06 +08:00
tensor [shardformer] refactor embedding resize (#5603) 2024-04-18 16:10:18 +08:00
testing [shardformer] refactor embedding resize (#5603) 2024-04-18 16:10:18 +08:00
utils Merge pull request #5310 from hpcaitech/feature/npu 2024-01-29 13:49:39 +08:00
zero [shardformer] update transformers (#5583) 2024-04-24 22:51:50 +08:00
__init__.py [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
initialize.py [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00