ColossalAI/colossalai/amp/naive_amp
Latest commit: e5ce4c8ea6 by Hongxin Liu, 2023-11-20 16:12:41 +08:00
[npu] add npu support for gemini and zero (#5067)

* [npu] setup device utils (#5047)
* [npu] add npu device support
* [npu] support low level zero
* [test] update npu zero plugin test
* [hotfix] fix import
* [test] recover tests
* [npu] gemini support npu (#5052)
* [npu] refactor device utils
* [gemini] support npu
* [example] llama2+gemini support npu
* [kernel] add arm cpu adam kernel (#5065)
* [kernel] add arm cpu adam
* [optim] update adam optimizer
* [kernel] arm cpu adam remove bf16 support
Name                           Last commit                                                        Last updated
grad_scaler                    [npu] add npu support for gemini and zero (#5067)                  2023-11-20 16:12:41 +08:00
mixed_precision_mixin          [misc] update pre-commit and run all files (#4752)                 2023-09-19 14:20:26 +08:00
__init__.py                    [legacy] clean up legacy code (#4743)                              2023-09-18 16:31:06 +08:00
mixed_precision_optimizer.py   [feature] Add clip_grad_norm for hybrid_parallel_plugin (#4837)    2023-10-12 11:32:37 +08:00