21 Commits (19e1a5cf16ead982eb8818cd69e41b06a5d23b20)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Xuanlei Zhao | 3acbf6d496 | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| digger yu | 9c2feb2f0b | fix some typo with colossalai/device colossalai/tensor/ etc. (#4171) | 1 year ago |
| github-actions[bot] | c77b3b19be | [format] applied code formatting on changed files in pull request 4152 (#4157) | 1 year ago |
| Frank Lee | 611971248c | [device] support init device mesh from process group (#3990) | 1 year ago |
| Frank Lee | ddcf58cacf | Revert "[sync] sync feature/shardformer with develop" | 1 year ago |
| Frank Lee | eb39154d40 | [dtensor] updated api and doc (#3845) | 1 year ago |
| digger yu | 70c8cdecf4 | [nfc] fix typo colossalai/cli fx kernel (#3847) | 1 year ago |
| digger yu | 7f8203af69 | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2 years ago |
| YuliangLiu0306 | 2059fdd6b0 | [hotfix] add copyright for solver and device mesh (#2803) | 2 years ago |
| YuliangLiu0306 | aa0f6686f9 | [autoparallel] accelerate gpt2 training (#2495) | 2 years ago |
| YuliangLiu0306 | 2731531bc2 | [autoparallel] integrate device mesh initialization into autoparallelize (#2393) | 2 years ago |
| YuliangLiu0306 | b5a3a4a65f | [device] find best logical mesh | 2 years ago |
| YuliangLiu0306 | 9c9246c0d9 | [device] alpha beta profiler (#2311) | 2 years ago |
| YuliangLiu0306 | 677e1e20d4 | [device] update flatten device mesh usage (#2079) | 2 years ago |
| Genghan Zhang | d655eea515 | [autoparallel] mix gather (#1977) | 2 years ago |
| Genghan Zhang | 6630d45546 | [autoparallel] Add alpha beta (#1973) | 2 years ago |
| YuliangLiu0306 | b4cc59b61e | [autoparallel] add numerical test for node strategies (#1760) | 2 years ago |
| YuliangLiu0306 | 4b03c25f85 | [tensor] add 1D device mesh (#1492) | 2 years ago |
| YuliangLiu0306 | b73fb7a077 | [tensor] support runtime ShardingSpec apply (#1453) | 2 years ago |
| YuliangLiu0306 | 0442f940f0 | [device] add DeviceMesh class to support logical device layout (#1394) | 2 years ago |