ColossalAI/colossalai/booster
flybird11111 21aa5de00b
[gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150)
Squashed commit body:
* fix
* fix
* fix
* test ci
* fix ci
2023-12-08 11:10:51 +08:00
Name             Last commit                                                              Date
mixed_precision  [npu] add npu support for hybrid plugin and llama (#5090)                2023-11-22 19:23:21 +08:00
plugin           [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150)    2023-12-08 11:10:51 +08:00
__init__.py      [booster] implemented the torch ddd + resnet example (#3232)             2023-03-27 10:24:14 +08:00
accelerator.py   [misc] update pre-commit and run all files (#4752)                       2023-09-19 14:20:26 +08:00
booster.py       [lazy] support from_pretrained (#4801)                                   2023-09-26 11:04:11 +08:00