mirror of https://github.com/hpcaitech/ColossalAI
Latest commit d202cc28c0: [npu] polish accelerator code

* update accelerator
* fix timer
* fix amp
* fix autocast
* fix set device
* add error raise
* update bug
* remove doc accelerator
* update doc
* use nullcontext
* update cpu
* update null context
* change time limit for example

Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>
__init__.py
bf16.py
fp8.py
fp16_apex.py
fp16_naive.py
fp16_torch.py
mixed_precision_base.py
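The files above correspond to ColossalAI's mixed-precision options (torch-native FP16, Apex FP16, naive FP16, BF16, and FP8) built on a shared mixed_precision_base. As a rough illustration of what the torch-native path (fp16_torch.py) presumably wraps, here is a minimal sketch using PyTorch's own AMP utilities (torch.cuda.amp.autocast and GradScaler). This is not the ColossalAI API itself; the model, optimizer, and data below are illustrative placeholders.

```python
# Minimal sketch of torch-native FP16 training, the mechanism a module such as
# fp16_torch.py presumably wraps. NOT the ColossalAI API; model/optimizer/data
# are placeholders for illustration only.
import torch
from torch.cuda.amp import autocast, GradScaler

model = torch.nn.Linear(512, 512).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = GradScaler()  # scales the loss to avoid FP16 gradient underflow

for _ in range(10):
    inputs = torch.randn(8, 512, device="cuda")
    targets = torch.randn(8, 512, device="cuda")

    optimizer.zero_grad()
    with autocast():  # run the forward pass in mixed precision
        loss = torch.nn.functional.mse_loss(model(inputs), targets)

    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(optimizer)         # unscales grads; skips the step on inf/nan
    scaler.update()                # adjusts the loss scale for the next step
```

BF16 training typically skips the GradScaler, since BF16 has the same exponent range as FP32 and is far less prone to gradient underflow; that difference is presumably why bf16.py and the FP16 variants are kept as separate modules.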