InternLM/configs
ytxiong 5ee651c2f1
feat(*): support not-flash-attn for pp and no-pp (#145)
* support non-flash-attention for no-pp
* support pipeline
* modify the config
* refactor the code
* refactor the code
* remove some unnecessary code
2023-07-28 16:13:04 +08:00
7B_sft.py feat(*): support not-flash-attn for pp and no-pp (#145) 2023-07-28 16:13:04 +08:00
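The switch introduced by commit #145 is exercised through the 7B_sft.py config. Below is a minimal, hypothetical sketch of the relevant fields, assuming the InternLM-style Python config layout; key names such as use_flash_attn and the parallel/pipeline dict follow that convention, and the actual file may differ.

# Minimal sketch of the flash-attention and pipeline settings in an
# InternLM-style config such as 7B_sft.py. Key names are assumptions
# based on that convention, not a copy of the real file.

HIDDEN_SIZE = 4096
NUM_LAYER = 32

model = dict(
    num_layers=NUM_LAYER,
    hidden_size=HIDDEN_SIZE,
    # Commit #145 allows this to be False, i.e. running without
    # flash attention, both with and without pipeline parallelism.
    use_flash_attn=False,
)

parallel = dict(
    # pipeline size 1 corresponds to "no-pp"; a size > 1 enables
    # pipeline parallelism ("pp"). Both paths are meant to work
    # with use_flash_attn=False after this change.
    pipeline=dict(size=1, interleaved_overlap=True),
    tensor=1,
)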