InternLM/internlm/core/scheduler
Latest commit 5ee651c2f1 by ytxiong: feat(*): support not-flash-attn for pp and no-pp (#145)
* support non-flash-attention for the no-pp case
* support the pipeline (pp) case
* modify the config (see the config sketch below)
* refactor the code
* refactor the code
* remove some unnecessary code
Committed 2023-07-28 16:13:04 +08:00
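Because one of the listed commits modifies the config, the flash-attention switch presumably lives in the training configuration alongside the parallelism settings. The sketch below shows how such a config might look; the `use_flash_attn` field name and the surrounding layout are assumptions for illustration, not the verified InternLM config schema.

```python
# Hypothetical excerpt of an InternLM-style training config (Python dict style).
# `use_flash_attn` and the surrounding fields are assumptions for illustration.
model = dict(
    num_layers=32,
    hidden_size=4096,
    num_attention_heads=32,
    use_flash_attn=False,  # fall back to the non-flash-attention code path
)

parallel = dict(
    pipeline=1,  # 1 = no pipeline parallelism ("no-pp"); >1 enables pp
)
```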
Name                       Last commit                                                Last updated
__init__.py                feat(core/scheduler): support pipeline parallel (#98)      2023-07-24 20:52:09 +08:00
base_scheduler.py          feat(core/scheduler): support pipeline parallel (#98)      2023-07-24 20:52:09 +08:00
no_pipeline_scheduler.py   feat(*): support not-flash-attn for pp and no-pp (#145)    2023-07-28 16:13:04 +08:00
pipeline_scheduler.py      feat(*): support not-flash-attn for pp and no-pp (#145)    2023-07-28 16:13:04 +08:00
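The listing suggests the module's layering: base_scheduler.py defines the shared scheduler interface, no_pipeline_scheduler.py drives single-stage training, and pipeline_scheduler.py covers the pipeline-parallel case (added in #98, extended to the non-flash-attention path in #145). The sketch below illustrates that layering only; every class name, method signature, and engine call here is an assumption for illustration, not the actual internlm.core.scheduler API.

```python
# Illustrative sketch of a base-scheduler / no-pp / pp split.
# All names and signatures are assumptions, not the real InternLM API.
from abc import ABC, abstractmethod


class BaseScheduler(ABC):
    """Common interface shared by the pipeline and non-pipeline schedulers."""

    @abstractmethod
    def forward_backward_step(self, engine, data_iter, forward_only=False,
                              return_loss=True):
        """Run the forward (and optionally backward) pass for one training step."""


class NonPipelineScheduler(BaseScheduler):
    """Single-stage training: one forward/backward pass per batch."""

    def forward_backward_step(self, engine, data_iter, forward_only=False,
                              return_loss=True):
        data, label = next(data_iter)
        output = engine(data)                                    # forward pass
        loss = engine.criterion(output, label) if return_loss else None
        if not forward_only:
            engine.backward(loss)                                # backward pass
        return output, label, loss


class PipelineScheduler(BaseScheduler):
    """Multi-stage training: split each batch into micro-batches."""

    def __init__(self, num_microbatches):
        self.num_microbatches = num_microbatches

    def forward_backward_step(self, engine, data_iter, forward_only=False,
                              return_loss=True):
        # A real pipeline scheduler overlaps micro-batches across stages
        # (GPipe/1F1B-style) and exchanges activations and gradients with
        # neighbouring stages; that communication is omitted in this sketch.
        losses = []
        for _ in range(self.num_microbatches):
            data, label = next(data_iter)
            output = engine(data)
            loss = engine.criterion(output, label) if return_loss else None
            if not forward_only:
                engine.backward(loss)
            losses.append(loss)
        return losses
```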