ColossalAI/colossalai/tensor

Latest commit: 295dd2d9fe by flybird11111 — [zerobubble] rebase main (#6075), 2 months ago
| Name                  | Last commit                                                    | Age          |
|-----------------------|----------------------------------------------------------------|--------------|
| d_tensor              | [compatibility] support torch 2.2 (#5875)                      | 5 months ago |
| moe_tensor            | [MoE/ZeRO] Moe refactor with zero refactor (#5821)             | 5 months ago |
| padded_tensor         | [shardformer] refactor embedding resize (#5603)                | 7 months ago |
| __init__.py           | [misc] update pre-commit and run all files (#4752)             | 1 year ago   |
| colo_parameter.py     | [zerobubble] rebase main (#6075)                               | 2 months ago |
| colo_tensor.py        | [misc] update pre-commit and run all files (#4752)             | 1 year ago   |
| comm_spec.py          | fix some typos (#5307)                                         | 10 months ago|
| param_op_hook.py      | [zerobubble] rebase main (#6075)                               | 2 months ago |
| shape_consistency.py  | [misc] update pre-commit and run all files (#4752)             | 1 year ago   |
| sharding_spec.py      | [Auto Parallel]: Speed up intra-op plan generation by 44% (#5446) | 5 months ago |
| utils.py              | [misc] update pre-commit and run all files (#4752)             | 1 year ago   |