ColossalAI/colossalai
Hongxin Liu da39d21b71 [moe] support mixtral (#5309)
* [moe] add mixtral block for single expert
* [moe] mixtral block fwd support uneven ep
* [moe] mixtral block bwd support uneven ep
* [moe] add mixtral moe layer
* [moe] simplify replace
* [moe] support save sharded mixtral
* [moe] support load sharded mixtral
* [moe] support save sharded optim
* [moe] integrate moe manager into plugin
* [moe] fix optimizer load
* [moe] fix mixtral layer
2024-02-07 19:21:02 +08:00
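The commit messages above describe Mixtral MoE support: an expert-parallel Mixtral block (with uneven expert placement in forward and backward), sharded model and optimizer checkpointing, and integration of the MoE manager into the booster plugin. Below is a minimal, hypothetical usage sketch; the plugin name `MoeHybridParallelPlugin`, its `tp_size`/`pp_size`/`ep_size` arguments, and the tiny model config are assumptions inferred from these messages, not a confirmed API of this exact revision.

```python
# Hypothetical sketch of boosting a Mixtral MoE model with expert parallelism.
# MoeHybridParallelPlugin and its keyword arguments are assumed from the commit
# messages above; check the booster/plugin sources at this revision for the real API.
import torch
from transformers import MixtralConfig, MixtralForCausalLM

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import MoeHybridParallelPlugin  # assumed import path

# Initialize the distributed environment (signature varies across ColossalAI versions).
colossalai.launch_from_torch(config={})

# A deliberately tiny Mixtral config so the sketch runs on modest hardware.
config = MixtralConfig(
    num_hidden_layers=2,
    hidden_size=128,
    intermediate_size=256,
    num_attention_heads=4,
    num_key_value_heads=2,
    num_local_experts=4,
)
model = MixtralForCausalLM(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# ep_size controls how many ranks the experts are sharded across (assumed kwarg).
plugin = MoeHybridParallelPlugin(tp_size=1, pp_size=1, ep_size=2)
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)

# Sharded model/optimizer checkpointing, as referenced by the commit messages.
booster.save_model(model, "./mixtral_ckpt", shard=True)
booster.save_optimizer(optimizer, "./mixtral_optim", shard=True)
```

Launched under `torchrun` with at least `ep_size` processes, each rank would hold a subset of the experts while non-expert parameters stay replicated; the sharded `save_model`/`save_optimizer` calls would then write per-rank checkpoint shards.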
_C [setup] support pre-build and jit-build of cuda kernels (#2374) 2023-01-06 20:50:26 +08:00
_analyzer [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
accelerator [accelerator] fixed npu api 2024-01-29 14:27:52 +08:00
amp [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
auto_parallel [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
autochunk [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
booster [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
checkpoint_io [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
cli [bug] Fix the version check bug in colossalai run when generating the cmd. (#4713) 2023-09-22 10:50:47 +08:00
cluster fix-test (#5210) 2024-01-03 14:26:13 +08:00
context [moe] merge moe into main (#4978) 2023-11-02 02:21:24 +00:00
device [npu] add npu support for hybrid plugin and llama (#5090) 2023-11-22 19:23:21 +08:00
fx [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
inference [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) 2023-11-22 11:19:39 +08:00
interface [lazy] support from_pretrained (#4801) 2023-09-26 11:04:11 +08:00
kernel [feat] refactored extension module (#5298) 2024-01-25 17:01:48 +08:00
lazy [doc] add lazy init docs (#4808) 2023-09-27 10:24:04 +08:00
legacy [feat] refactored extension module (#5298) 2024-01-25 17:01:48 +08:00
logging [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
moe [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
nn [lr-scheduler] fix load state dict and add test (#5369) 2024-02-06 14:23:32 +08:00
pipeline [feat] refactored extension module (#5298) 2024-01-25 17:01:48 +08:00
shardformer fix typo change dosen't to doesn't (#5308) 2024-01-30 09:57:38 +08:00
tensor [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
testing [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
utils Merge pull request #5310 from hpcaitech/feature/npu 2024-01-29 13:49:39 +08:00
zero [moe] support mixtral (#5309) 2024-02-07 19:21:02 +08:00
__init__.py [accelerator] init the accelerator module (#5129) 2023-11-30 13:25:17 +08:00
initialize.py [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00