Xuanlei Zhao | 3acbf6d496 | [npu] add npu support for hybrid plugin and llama (#5090) | 2023-11-22 19:23:21 +08:00
  * llama 3d
  * update
  * fix autocast

Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
  * [misc] update pre-commit
  * [misc] run pre-commit
  * [misc] remove useless configuration files
  * [misc] ignore cuda for clang-format

Jianghai | b366f1d99f | [NFC] Fix format for mixed precision (#4253) | 2023-07-26 14:12:57 +08:00
  * [NFC] polish colossalai/booster/mixed_precision/mixed_precision_base.py code style

Wenhao Chen | 725af3eeeb | [booster] make optimizer argument optional for boost (#3993) | 2023-06-15 17:38:42 +08:00
  * feat: make optimizer optional in Booster.boost
  * test: skip unet test if diffusers version > 0.10.2

jiangmingyan | e871e342b3 | [API] add docstrings and initialization to apex amp, naive amp (#3783) | 2023-05-23 15:17:24 +08:00
  * [mixed_precison] add naive amp demo
  * [mixed_precison] add naive amp demo
  * [api] add docstrings and initialization to apex amp, naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] add docstring to apex amp/ naive amp
  * [api] fix
  * [api] fix

jiangmingyan | 2703a37ac9 | [amp] Add naive amp demo (#3774) | 2023-05-18 16:33:14 +08:00
  * [mixed_precison] add naive amp demo
  * [mixed_precison] add naive amp demo

Frank Lee | 73d3e4d309 | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
  * [booster] implemented the torch ddd + resnet example
  * polish code

Frank Lee | ed19290560 | [booster] implemented mixed precision class (#3151) | 2023-03-17 11:00:15 +08:00
  * [booster] implemented mixed precision class
  * polish code