InternLM/internlm/model
Latest commit fd398fae1a by ytxiong (2023-07-25 15:25:48 +08:00): refactor(rotaryEmbedding): refactor forward (#120)

* use fp16 in instruction (#80)
* delete torch_dtype of README's example code (#100)
* refactor the forward for rotary embedding

Co-authored-by: WRH <12756472+wangruohui@users.noreply.github.com>
Co-authored-by: x54-729 <45304952+x54-729@users.noreply.github.com>
__init__.py               initial commit                                       2023-07-06 12:55:23 +08:00
embedding.py              refactor(rotaryEmbedding): refactor forward (#120)   2023-07-25 15:25:48 +08:00
linear.py                 initial commit                                       2023-07-06 12:55:23 +08:00
loss.py                   initial commit                                       2023-07-06 12:55:23 +08:00
modeling_internlm.py      feat(core/scheduler): support pipeline parallel (#98) 2023-07-24 20:52:09 +08:00
multi_head_attention.py   refactor(rotaryEmbedding): refactor forward (#120)   2023-07-25 15:25:48 +08:00
utils.py                  initial commit                                       2023-07-06 12:55:23 +08:00
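The embedding.py and multi_head_attention.py changes above concern the rotary position embedding (RoPE) forward pass. As a hedged illustration only, not InternLM's actual implementation, a minimal RoPE forward can be sketched as follows (the function names, tensor layout `(batch, seqlen, nheads, headdim)`, and default `base` are assumptions for the sketch):

```python
import torch

def rotary_tables(seqlen: int, dim: int, base: float = 10000.0):
    # Standard RoPE frequency table over half the head dim (assumed layout).
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
    t = torch.arange(seqlen).float()
    freqs = torch.outer(t, inv_freq)            # (seqlen, dim/2)
    cos = freqs.cos()[None, :, None, :]         # broadcast over batch and heads
    sin = freqs.sin()[None, :, None, :]
    return cos, sin

def apply_rotary(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor) -> torch.Tensor:
    # x: (batch, seqlen, nheads, headdim); rotate channel pairs by position-dependent angles.
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
```

At position 0 all angles are zero (cos = 1, sin = 0), so the first token's queries and keys pass through unchanged; later positions are rotated so that attention scores depend only on relative offsets.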