ColossalAI/colossalai
Insu Jang 00525f7772
[shardformer] fix pipeline forward error if custom layer distribution is used (#5189)
* Use self.distribute_layers / self.get_stage_index so that a custom layer distribution is honored (see the sketch after this commit summary)

* Change static methods for t5 layer distribution to member functions

* Change static methods for whisper layer distribution to member functions

* Replace direct whisper policy method calls with calls through self

* Fix test case to use non-static layer distribution methods

* fix: fix typo

---------

Co-authored-by: Wenhao Chen <cwher@outlook.com>
2024-03-27 13:57:00 +08:00
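The gist of the fix above, as a minimal self-contained sketch: calling the layer-distribution helpers through `self` (rather than as static methods on the base policy class) means a user-supplied policy subclass that overrides them is actually used when pipeline stages are built. All names below (BasePolicy, FrontHeavyPolicy, build_stage) are hypothetical simplifications for illustration, not the actual ColossalAI shardformer API.

```python
# Hypothetical sketch of the "static method -> member function" pattern from the
# commit above; class and method names are illustrative, not ColossalAI's API.
from typing import List, Tuple


class BasePolicy:
    def distribute_layers(self, num_layers: int, num_stages: int) -> List[int]:
        # Default: split layers as evenly as possible across pipeline stages.
        base, rem = divmod(num_layers, num_stages)
        return [base + (1 if i < rem else 0) for i in range(num_stages)]

    def get_stage_index(self, layers_per_stage: List[int], stage: int) -> Tuple[int, int]:
        # [start, end) range of layers owned by `stage`.
        start = sum(layers_per_stage[:stage])
        return start, start + layers_per_stage[stage]

    def build_stage(self, num_layers: int, num_stages: int, stage: int) -> Tuple[int, int]:
        # Before the fix, call sites effectively invoked the base class's static
        # methods, ignoring overrides; dispatching through `self` respects a
        # custom policy subclass.
        layers_per_stage = self.distribute_layers(num_layers, num_stages)
        return self.get_stage_index(layers_per_stage, stage)


class FrontHeavyPolicy(BasePolicy):
    def distribute_layers(self, num_layers: int, num_stages: int) -> List[int]:
        # Custom distribution: move one layer from the last stage to the first.
        layers = super().distribute_layers(num_layers, num_stages)
        if num_stages > 1 and layers[-1] > 1:
            layers[0] += 1
            layers[-1] -= 1
        return layers


if __name__ == "__main__":
    print(BasePolicy().build_stage(num_layers=24, num_stages=4, stage=0))        # (0, 6)
    print(FrontHeavyPolicy().build_stage(num_layers=24, num_stages=4, stage=0))  # (0, 7)
```

Because build_stage dispatches through self, the subclass's custom split (7 layers on stage 0 instead of 6) is picked up without any change to the base class, which is the behavior the commit restores for custom layer distributions in pipeline forward.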
_C [setup] support pre-build and jit-build of cuda kernels (#2374) 2023-01-06 20:50:26 +08:00
_analyzer [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
accelerator [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
amp [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
auto_parallel [hotfix] Fix wrong import in meta_registry (#5392) 2024-02-20 19:24:43 +08:00
autochunk [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
booster [hotfix] set return_outputs=False in examples and polish code (#5404) 2024-03-25 12:31:09 +08:00
checkpoint_io [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
cli [devops] fix extention building (#5427) 2024-03-05 15:35:54 +08:00
cluster fix-test (#5210) 2024-01-03 14:26:13 +08:00
context [moe] merge moe into main (#4978) 2023-11-02 02:21:24 +00:00
device [npu] add npu support for hybrid plugin and llama (#5090) 2023-11-22 19:23:21 +08:00
fx [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
inference [hotfix] fix typo s/keywrods/keywords etc. (#5429) 2024-03-12 11:25:16 +08:00
interface [lazy] support from_pretrained (#4801) 2023-09-26 11:04:11 +08:00
kernel [shardformer] update colo attention to support custom mask (#5510) 2024-03-27 11:19:32 +08:00
lazy [doc] add lazy init docs (#4808) 2023-09-27 10:24:04 +08:00
legacy Fix ColoTensorSpec for py11 (#5440) 2024-03-26 15:56:49 +08:00
logging [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
moe [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
nn [shardformer] update colo attention to support custom mask (#5510) 2024-03-27 11:19:32 +08:00
pipeline [hotfix] set return_outputs=False in examples and polish code (#5404) 2024-03-25 12:31:09 +08:00
shardformer [shardformer] fix pipeline forward error if custom layer distribution is used (#5189) 2024-03-27 13:57:00 +08:00
tensor fixed layout converter caching and updated tester 2024-03-26 17:22:27 +08:00
testing [shardformer] update colo attention to support custom mask (#5510) 2024-03-27 11:19:32 +08:00
utils Merge pull request #5310 from hpcaitech/feature/npu 2024-01-29 13:49:39 +08:00
zero [llama] fix training and inference scripts (#5384) 2024-02-19 16:41:04 +08:00
__init__.py [accelerator] init the accelerator module (#5129) 2023-11-30 13:25:17 +08:00
initialize.py [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00