Commit Graph

24 Commits (629e6a5ad1c2df22559333e9fa9accd639609850)

Author SHA1 Message Date
Wenwen Qu 629e6a5ad1 add comments for moe 2023-08-25 19:03:31 +08:00
Wenwen Qu aa2612edc4 Merge branch 'develop' into feature_add_moe 2023-08-25 13:35:56 +08:00
Wenwen Qu 409f139ba5 merge 2023-08-24 16:38:36 +08:00
ytxiong 9cd1e0314e fix(pipeline): modify the sequence_parallel in pipeline (#227)
* move sequence_parallel to parallel config

* set the sequence_parallel default value to False

* fix lint

* fix lint

* fix lint

* modify the sequence_parallel in pp
2023-08-24 14:45:40 +08:00
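
After #227 above, sequence_parallel lives in the parallel section of the training config and defaults to False. A minimal sketch of that section, assuming InternLM-style config files such as configs/7B_sft.py (the neighbouring fields are assumptions):

```python
# Hypothetical excerpt of the parallel config; only sequence_parallel and
# its default of False are confirmed by the commits above.
parallel = dict(
    zero1=8,                  # assumed: ZeRO-1 partition size
    pipeline=dict(size=1),    # assumed: pipeline parallel degree
    sequence_parallel=False,  # moved here by #227; defaults to False
)
```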
zhanglei 72e3b1afd5 change the scale position for latent moe_loss 2023-08-23 13:25:20 +08:00
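
The commit above moves where the latent MoE auxiliary loss gets scaled. A toy sketch of the idea, assuming a coefficient named moe_loss_coeff (the name appears elsewhere in this log; the accumulation point is an assumption):

```python
# Illustrative only: fold the scaled auxiliary MoE loss into the training
# loss. moe_loss_coeff is named elsewhere in this log; the rest is assumed.
def total_loss(lm_loss, moe_loss, moe_loss_coeff=0.01):
    # scaling here, outside the model, is the sort of "scale position"
    # change the commit above describes
    return lm_loss + moe_loss_coeff * moe_loss
```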
zhanglei 3a3ca71459 fix moe loss logger for the interleaved pp 2023-08-23 13:03:21 +08:00
zhanglei 8407c203a3 refactor code 2023-08-22 10:53:21 +08:00
zhanglei ac243e5b33 refactor code 2023-08-22 10:42:39 +08:00
zhanglei a8dd77ce76 fix bug on logger 2023-08-22 10:35:17 +08:00
huangting4201 4832671abe fix(pipeline_scheduler.py): fix tensor shape err and comm block (#210) 2023-08-21 12:09:27 +08:00
zhanglei db685e8a31 fix the pp moe bugs 2023-08-21 09:59:58 +08:00
zhanglei 7b1709a7ff Merge branch 'feature_add_moe' of github.com:blankde/InternLM into feature_add_moe_pp_zl
Conflicts:
	train.py
2023-08-17 17:00:04 +08:00
zhanglei 2983076d89 add logger for moe_loss 2023-08-17 16:52:11 +08:00
zhanglei 8cdd1abb35 support interleaved pp 2023-08-16 12:02:59 +08:00
huangting4201 db13bc46bc fix(ci): fix ci train error (#199) 2023-08-15 20:09:54 +08:00
zhanglei 92a31732f9 fix the moe_loss_coeff bug 2023-08-15 18:47:19 +08:00
zhanglei 8c7d868f01 support zero_overlap_communication 2023-08-15 16:18:20 +08:00
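
Overlapping ZeRO communication with computation is usually a config switch. A hedged sketch of what such a section might look like (the hybrid_zero_optimizer name and its fields follow common InternLM config conventions and are not confirmed by this log):

```python
# Assumed config shape; only the idea of overlapping ZeRO communication
# comes from "support zero_overlap_communication" above.
hybrid_zero_optimizer = dict(
    overlap_sync_grad=True,   # overlap gradient reduce-scatter with backward
    overlap_sync_param=True,  # overlap parameter broadcast with forward
    clip_grad_norm=1.0,       # assumed: gradient clipping threshold
)
```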
zhanglei 1accc9f08d add no-interleaved & no-overlapped moe pp support 2023-08-14 11:10:37 +08:00
ytxiong c219065348 feat(*): support sequence_parallel (#180)
* support sequence_parallel for no pipeline

* sequence_parallel is not supported without flash-attn

* support sequence parallel for pipeline

* add memory profiler

* Update 13B.py

* add memory profiler

* fix evaluation bug

* remove some unnecessary code

* remove some unnecessary code

* Update parallel_context.py

* modify the config

* remove memory profiler

* modify the config

* support selective dropout
2023-08-07 16:42:52 +08:00
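
One bullet in #180 records that sequence parallelism only works with flash attention. A hypothetical launch-time guard that would enforce the constraint (the function and field names are illustrative assumptions):

```python
# Hypothetical guard; the constraint comes from #180 above, while the names
# check_sequence_parallel, parallel_cfg, model_cfg, use_flash_attn are assumed.
def check_sequence_parallel(parallel_cfg: dict, model_cfg: dict) -> None:
    if parallel_cfg.get("sequence_parallel", False) and not model_cfg.get("use_flash_attn", True):
        raise ValueError("sequence_parallel requires use_flash_attn=True")
```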
cx 0268d8eda1 refactor(scheduler): rewrite pipeline scheduler (#138)
* refactor(scheduler): rewrite pipeline scheduler

* fix(*): fix pipeline scheduler bugs

* fix(*): fix merge bug

* feat(*): update codes with todo tag

* feat(*): add comments

* feat(internlm/core/scheduler): update recv_prev/next logic

* feat(utils/evaluation.py): update sche metric hook for valid

---------

Co-authored-by: huangting.p <huangting@sensetime.com>
2023-08-03 11:48:12 +08:00
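
The rewritten non-interleaved scheduler in #138 follows the standard 1F1B pattern: a warm-up of forward passes, a steady state alternating one forward with one backward, and a cool-down draining the remaining backwards. A toy sketch of that control flow for a single stage (not the actual InternLM code; all names are placeholders):

```python
# Toy 1F1B schedule for one pipeline stage. forward/backward are callables
# supplied by the caller; real schedulers also interleave send/recv comms.
def one_f_one_b(stage_id, num_stages, num_microbatches, forward, backward):
    warmup = min(num_stages - stage_id - 1, num_microbatches)
    pending = []
    for i in range(warmup):                     # warm-up: forwards only
        pending.append(forward(i))
    for i in range(num_microbatches - warmup):  # steady state: 1F, then 1B
        pending.append(forward(warmup + i))
        backward(pending.pop(0))                # backward the oldest output
    for _ in range(warmup):                     # cool-down: drain backwards
        backward(pending.pop(0))
```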
huangting4201 66a23e326a feat(utils/evaluation.py): support evaluate (#154)
* style(internlm): fix lint error

* feat(utils/logger.py): support uniscale logger

* fix(utils/logger.py): fix import circular error

* feat(train.py): support dashboard metric panel and fix ci train config

* fix(ci_scripts/train/slurm_train.sh): fix ci train error

* fix(ci_scripts/train/torchrun.sh): fix ci train error

* feat(utils/evaluation.py): support evaluate on validation dataset

* fix(utils/evaluation.py): fix demo error

* fix(ci_scripts/train/ci_7B_sft.py): fix ci train error

* feat(initialize/launch.py): set default value for valid_bsz and valid_every

* fix(ci_scripts/train): restore ci update

* docs(configs/7B_sft.py): update comment for config

* fix(config.json): delete config.json

* fix evaluation bug in scheduler when use_flash_attn=False

* feat(scheduler/no_pipeline_scheduler.py): support micro_bsz>1 in no pp

* modify the judgement in pp and no-pp scheduler

* modify the data_process_func in evaluation

* fix bugs when use_flash_attn=False

* rename symbol

* feat(configs/7B_sft.py): change para valid_bsz to valid_micro_num

* feat(scheduler/no_pipeline_scheduler.py): update parameter setting for _grad_accum_batch_size

---------

Co-authored-by: 黄婷 <huangting3@CN0014010744M.local>
Co-authored-by: huangting.p <huangting@sensetime.com>
Co-authored-by: yingtongxiong <974106207@qq.com>
2023-08-02 19:03:59 +08:00
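
#154 drives evaluation from the data config, renaming valid_bsz to valid_micro_num and giving valid_every a default. A sketch of the relevant fields (values and neighbouring fields are assumptions):

```python
# Assumed excerpt of the data config in configs/7B_sft.py; only the
# valid_bsz -> valid_micro_num rename and valid_every come from the log.
data = dict(
    micro_num=4,        # assumed: training micro-batches per step
    valid_micro_num=4,  # renamed from valid_bsz in #154
    valid_every=50,     # assumed value: run evaluation every 50 steps
)
```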
huangting4201 1f7304a8bb feat(utils/logger.py): support uniscale logger (#152)
* style(internlm): fix lint error

* feat(utils/logger.py): support uniscale logger

* fix(utils/logger.py): fix import circular error

* feat(train.py): support dashboard metric panel and fix ci train config

* fix(ci_scripts/train/slurm_train.sh): fix ci train error

* fix(ci_scripts/train/torchrun.sh): fix ci train error

* fix(ci_scripts/train): restore ci update

* fix(config.json): delete alert webhook

* feat(train.py): optimize func init logger

* feat(config.json): delete config.json

---------

Co-authored-by: 黄婷 <huangting3@CN0014010744M.local>
Co-authored-by: huangting.p <huangting@sensetime.com>
2023-08-01 17:37:32 +08:00
ytxiong 5ee651c2f1 feat(*): support not-flash-attn for pp and no-pp (#145)
* support non-flash-attention for no-pp

* support pipeline

* modify the config

* refactor the code

* refactor the code

* remove some unnecessary code
2023-07-28 16:13:04 +08:00
huangting4201 762ab297ee feat(core/scheduler): support pipeline parallel (#98)
* feat(utils/writer.py): support tensorboard writer

* feat(utils/writer.py): add class comment

* feat(core): support pipeline parallel

* fix(core): fix demo running error

* feat(solver/optimizer): add pp zero optimizer

* fix(solver/optimizer): fix word spelling error

* feat(core/scheduler): add new dir scheduler in core/

* fix(core): fix ci lint error

* feat(solver/optimizer): merge pp and nopp optimizer

* doc(usage.md): update usage doc

* feat(core/scheduler): support post func

* feat(core/scheduler): add dtype para in pp sche and update func get_tensor_shape

* feat(core/scheduler): add _load_micro_batch in base scheduler

* feat(core/scheduler): support optimizer overlap communication in pp scheduler

* feat(core/scheduler): delete data process func code

* feat(core/trainer): schedule pre processing for all schedule

---------

Co-authored-by: 黄婷 <huangting3@CN0014010744M.local>
Co-authored-by: huangting.p <huangting@sensetime.com>
2023-07-24 20:52:09 +08:00
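
The writer added in utils/writer.py presumably builds on torch.utils.tensorboard; a minimal standalone example of that underlying API (the wrapper's actual interface is not shown in this log):

```python
# Minimal SummaryWriter usage; utils/writer.py is assumed to wrap
# something along these lines.
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="runs/demo")  # assumed log directory
for step in range(100):
    writer.add_scalar("train/loss", 1.0 / (step + 1), step)
writer.close()
```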