ColossalAI/colossalai/tensor
Latest commit: [MoE/ZeRO] Moe refactor with zero refactor (#5821) by Haze188, 416580b314, 5 months ago
Name                  Last commit                                                                  Last change
d_tensor/             [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694)  7 months ago
moe_tensor/           [MoE/ZeRO] Moe refactor with zero refactor (#5821)                          5 months ago
padded_tensor/        [shardformer] refactor embedding resize (#5603)                             7 months ago
__init__.py           [misc] update pre-commit and run all files (#4752)                          1 year ago
colo_parameter.py     [gemini] fix param op hook when output is tuple (#5355)                     10 months ago
colo_tensor.py        [misc] update pre-commit and run all files (#4752)                          1 year ago
comm_spec.py          fix some typo (#5307)                                                       10 months ago
param_op_hook.py      [gemini] fix param op hook when output is tuple (#5355)                     10 months ago
shape_consistency.py  [misc] update pre-commit and run all files (#4752)                          1 year ago
sharding_spec.py      [misc] update pre-commit and run all files (#4752)                          1 year ago
utils.py              [misc] update pre-commit and run all files (#4752)                          1 year ago