ColossalAI/colossalai/nn
Latest commit e5ce4c8ea6 by Hongxin Liu, 2023-11-20 16:12:41 +08:00
[npu] add npu support for gemini and zero (#5067)
* [npu] setup device utils (#5047)

* [npu] add npu device support

* [npu] support low level zero

* [test] update npu zero plugin test

* [hotfix] fix import

* [test] recover tests

* [npu] gemini support npu (#5052)

* [npu] refactor device utils

* [gemini] support npu

* [example] llama2+gemini support npu

* [kernel] add arm cpu adam kernel (#5065)

* [kernel] add arm cpu adam

* [optim] update adam optimizer

* [kernel] arm cpu adam remove bf16 support
Name           Last commit                                                                Date
layer          [moe] merge moe into main (#4978)                                          2023-11-02 02:21:24 +00:00
loss           [moe] merge moe into main (#4978)                                          2023-11-02 02:21:24 +00:00
lr_scheduler   [hotfix] fix lr scheduler bug in torch 2.0 (#4864)                         2023-10-12 14:04:24 +08:00
optimizer      [npu] add npu support for gemini and zero (#5067)                          2023-11-20 16:12:41 +08:00
__init__.py    [legacy] move communication and nn to legacy and refactor logger (#4671)   2023-09-11 16:24:28 +08:00
init.py        [misc] update pre-commit and run all files (#4752)                         2023-09-19 14:20:26 +08:00
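
For orientation, below is a minimal sketch of how the optimizer and lr_scheduler subpackages listed above are typically combined in a training step. The class names HybridAdam and CosineAnnealingWarmupLR, their import paths, and their argument names are assumptions based on the library's documented API around this release and may differ between versions.

    import torch
    from colossalai.nn.lr_scheduler import CosineAnnealingWarmupLR  # assumed class name/location
    from colossalai.nn.optimizer import HybridAdam                  # assumed class name/location

    model = torch.nn.Linear(128, 10)

    # HybridAdam keeps optimizer states in CPU memory and updates them with the
    # fused CPU Adam kernel (x86 and, per the commit above, ARM), while gradients
    # can stay on the accelerator.
    optimizer = HybridAdam(model.parameters(), lr=1e-3, weight_decay=1e-2)

    # Linear warmup for the first 100 steps, then cosine decay over the remainder.
    scheduler = CosineAnnealingWarmupLR(optimizer, total_steps=1000, warmup_steps=100)

    for step in range(1000):
        loss = model(torch.randn(32, 128)).sum()
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        scheduler.step()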