ColossalAI/colossalai/booster/mixed_precision
Latest commit: d202cc28c0 by Hongxin Liu, "[npu] change device to accelerator api (#5239)", 11 months ago
File                     Last commit                                            Age
__init__.py              [misc] update pre-commit and run all files (#4752)     1 year ago
bf16.py                  [booster] implemented mixed precision class (#3151)    2 years ago
fp8.py                   [booster] implemented mixed precision class (#3151)    2 years ago
fp16_apex.py             [misc] update pre-commit and run all files (#4752)     1 year ago
fp16_naive.py            [misc] update pre-commit and run all files (#4752)     1 year ago
fp16_torch.py            [npu] change device to accelerator api (#5239)         11 months ago
mixed_precision_base.py  [NFC] Fix format for mixed precision (#4253)           1 year ago
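The files above correspond to the mixed precision backends used by ColossalAI's Booster (torch AMP fp16, apex fp16, naive fp16, bf16, fp8), all built on the base class in mixed_precision_base.py. Below is a minimal sketch of how one of these backends is typically selected through the Booster. The class name FP16TorchMixedPrecision and its availability as an export are assumptions inferred from the file names above, not verified against this exact commit.

```python
# Minimal sketch, assuming this package exports mixed precision classes that
# mirror the file names above (e.g. FP16TorchMixedPrecision from fp16_torch.py)
# and that Booster accepts either a string or a MixedPrecision instance.
# Names and constructor arguments here are assumptions, not verified against
# this revision. Depending on the plugin, colossalai.launch may also be required.
import torch
import torch.nn as nn

from colossalai.booster import Booster
from colossalai.booster.mixed_precision import FP16TorchMixedPrecision  # assumed export

model = nn.Linear(16, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# Option 1 (assumed): pass a string and let the booster resolve the backend.
booster = Booster(mixed_precision="fp16")

# Option 2 (assumed): pass a configured MixedPrecision object directly,
# e.g. the torch AMP backend implemented in fp16_torch.py.
# booster = Booster(mixed_precision=FP16TorchMixedPrecision())

# boost() wraps the model/optimizer/criterion for mixed precision training.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x = torch.randn(8, 16)
y = torch.randn(8, 4)
loss = criterion(model(x), y)

# Use the booster's backward so loss scaling is handled by the backend.
booster.backward(loss, optimizer)
optimizer.step()
```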