ColossalAI/colossalai/tensor
Latest commit: ccabcf6485 by Hongxin Liu — [fp8] support fp8 amp for hybrid parallel plugin (#5975), 4 months ago
d_tensor
moe_tensor
padded_tensor
__init__.py
colo_parameter.py
colo_tensor.py
comm_spec.py
param_op_hook.py
shape_consistency.py
sharding_spec.py
utils.py