ColossalAI/colossalai/tensor

Latest commit: 43995ee436 by Edenzzzz, [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694), 7 months ago
d_tensor/             [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694)   7 months ago
moe_tensor/           [moe] support mixtral (#5309)                                                10 months ago
padded_tensor/        [shardformer] refactor embedding resize (#5603)                              8 months ago
__init__.py           [misc] update pre-commit and run all files (#4752)                           1 year ago
colo_parameter.py     [gemini] fix param op hook when output is tuple (#5355)                      10 months ago
colo_tensor.py        [misc] update pre-commit and run all files (#4752)                           1 year ago
comm_spec.py          fix some typo (#5307)                                                        10 months ago
param_op_hook.py      [gemini] fix param op hook when output is tuple (#5355)                      10 months ago
shape_consistency.py  [misc] update pre-commit and run all files (#4752)                           1 year ago
sharding_spec.py      [misc] update pre-commit and run all files (#4752)                           1 year ago
utils.py              [misc] update pre-commit and run all files (#4752)                           1 year ago