ColossalAI/colossalai/booster/mixed_precision

Latest commit: 3acbf6d496 by Xuanlei Zhao, "[npu] add npu support for hybrid plugin and llama (#5090)", 1 year ago
__init__.py              [misc] update pre-commit and run all files (#4752), 1 year ago
bf16.py
fp8.py
fp16_apex.py             [misc] update pre-commit and run all files (#4752), 1 year ago
fp16_naive.py            [misc] update pre-commit and run all files (#4752), 1 year ago
fp16_torch.py            [npu] add npu support for hybrid plugin and llama (#5090), 1 year ago
mixed_precision_base.py  [NFC] Fix format for mixed precision (#4253), 1 year ago