ColossalAI/colossalai/tensor

Latest commit: ccabcf6485 by Hongxin Liu
[fp8] support fp8 amp for hybrid parallel plugin (#5975) (4 months ago)
d_tensor/             [FP8] rebase main (#5963)                                   4 months ago
moe_tensor/           [MoE/ZeRO] Moe refactor with zero refactor (#5821)          5 months ago
padded_tensor/        [shardformer] refactor embedding resize (#5603)             8 months ago
__init__.py           [misc] update pre-commit and run all files (#4752)          1 year ago
colo_parameter.py     [fp8] support fp8 amp for hybrid parallel plugin (#5975)    4 months ago
colo_tensor.py        [misc] update pre-commit and run all files (#4752)          1 year ago
comm_spec.py          fix some typo (#5307)                                       10 months ago
param_op_hook.py      [fp8] support fp8 amp for hybrid parallel plugin (#5975)    4 months ago
shape_consistency.py  [misc] update pre-commit and run all files (#4752)          1 year ago
sharding_spec.py      [FP8] rebase main (#5963)                                   4 months ago
utils.py              [misc] update pre-commit and run all files (#4752)          1 year ago