ColossalAI/colossalai/tensor

Latest commit: ccabcf6485 by Hongxin Liu, 2024-08-07 18:21:08 +08:00
[fp8] support fp8 amp for hybrid parallel plugin (#5975)

* [fp8] support fp8 amp for hybrid parallel plugin
* [test] add fp8 hook test
* [fp8] fix fp8 linear compatibility
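For context, the latest commit wires fp8 mixed precision into the hybrid parallel plugin. Below is a minimal sketch of what enabling it might look like through the booster API; `Booster` and `HybridParallelPlugin` are real ColossalAI entry points, but the `use_fp8` flag name is inferred from the commit message and should be treated as an assumption, not a confirmed signature.

```python
# Minimal sketch: fp8 amp with the hybrid parallel plugin.
# The `use_fp8` flag is an assumption based on PR #5975's title.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Initialize the distributed environment (older releases require a config dict).
colossalai.launch_from_torch()

model = torch.nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

plugin = HybridParallelPlugin(
    tp_size=2,         # tensor parallel degree
    pp_size=1,         # pipeline parallel degree
    precision="bf16",  # base mixed-precision mode
    use_fp8=True,      # assumed flag for the fp8 amp path added in #5975
)
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)
```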
Name                  Last commit message                                        Last commit date
d_tensor              [FP8] rebase main (#5963)                                  2024-08-06 16:29:37 +08:00
moe_tensor            [MoE/ZeRO] Moe refactor with zero refactor (#5821)         2024-06-28 14:00:08 +08:00
padded_tensor         [shardformer] refactor embedding resize (#5603)            2024-04-18 16:10:18 +08:00
__init__.py           [misc] update pre-commit and run all files (#4752)         2023-09-19 14:20:26 +08:00
colo_parameter.py     [fp8] support fp8 amp for hybrid parallel plugin (#5975)   2024-08-07 18:21:08 +08:00
colo_tensor.py        [misc] update pre-commit and run all files (#4752)         2023-09-19 14:20:26 +08:00
comm_spec.py          fix some typo (#5307)                                      2024-01-25 13:56:27 +08:00
param_op_hook.py      [fp8] support fp8 amp for hybrid parallel plugin (#5975)   2024-08-07 18:21:08 +08:00
shape_consistency.py  [misc] update pre-commit and run all files (#4752)         2023-09-19 14:20:26 +08:00
sharding_spec.py      [FP8] rebase main (#5963)                                  2024-08-06 16:29:37 +08:00
utils.py              [misc] update pre-commit and run all files (#4752)         2023-09-19 14:20:26 +08:00
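Both colo_parameter.py and param_op_hook.py were touched by the fp8 commit: the fp8 amp path hooks parameter access around forward and backward ops. A rough sketch of the hook interface follows; the `ColoParamOpHook` and `ColoParamOpHookManager` names match this directory's modules, but the exact method signatures are assumptions, and a real fp8 hook would cast parameters rather than log them.

```python
# Sketch of a parameter op hook in the style of colossalai.tensor.param_op_hook.
# Method names follow ColoParamOpHook; treat exact signatures as assumptions.
from typing import List

import torch
from colossalai.tensor.param_op_hook import ColoParamOpHook, ColoParamOpHookManager


class LoggingParamOpHook(ColoParamOpHook):
    """Logs every parameter access around forward/backward ops."""

    def pre_forward(self, params: List[torch.Tensor]) -> None:
        print(f"pre_forward on {len(params)} param(s)")

    def post_forward(self, params: List[torch.Tensor]) -> None:
        print(f"post_forward on {len(params)} param(s)")

    def pre_backward(self, params: List[torch.Tensor]) -> None:
        print(f"pre_backward on {len(params)} param(s)")

    def post_backward(self, params: List[torch.Tensor]) -> None:
        print(f"post_backward on {len(params)} param(s)")


# Hooks fire only for models built from ColoParameters; an fp8 hook (as
# exercised by this PR's test) would cast params in pre_forward instead.
with ColoParamOpHookManager.use_hooks(LoggingParamOpHook()):
    pass  # run the model's forward/backward here
```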