216 Commits (feature/async-io)

Author SHA1 Message Date
pre-commit-ci[bot] 7c2f79fa98 [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
Hongxin Liu 7f8b16635b [misc] refactor launch API and tensor constructor (#5666) 7 months ago
Edenzzzz d83c633ca6 [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 7 months ago
Stephan Kölker 5d380a1a21 [hotfix] Fix wrong import in meta_registry (#5392) 9 months ago
Hongxin Liu d202cc28c0 [npu] change device to accelerator api (#5239) 11 months ago
Hongxin Liu e5ce4c8ea6 [npu] add npu support for gemini and zero (#5067) 1 year ago
Hongxin Liu 079bf3cb26 [misc] update pre-commit and run all files (#4752) 1 year ago
Hongxin Liu b5f9e37c70 [legacy] clean up legacy code (#4743) 1 year ago
Hongxin Liu 554aa9592e [legacy] move communication and nn to legacy and refactor logger (#4671) 1 year ago
Hongxin Liu ac178ca5c1 [legacy] move builder and registry to legacy (#4603) 1 year ago
Lufang Chen 12c95a9fed fix runtime prepare pass (#4502) 1 year ago
Wenhao Chen fee553288b [NFC] polish runtime_preparation_pass style (#4266) 1 year ago
YeAnbang 3883db452c [NFC] polish unary_elementwise_generator.py code style (#4267) 1 year ago
Yanjia0 c614a99d28 [NFC] polish colossalai/auto_parallel/offload/amp_optimizer.py code style (#4255) 1 year ago
Frank Lee c4b1b65931 [test] fixed tests failed due to dtensor change (#4082) 1 year ago
digger yu e2d81eba0d [nfc] fix typo colossalai/ applications/ (#3831) 2 years ago
digger yu 7f8203af69 fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) 2 years ago
digger yu 9265f2d4d7 [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779) 2 years ago
digger yu 32f81f14d4 [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) 2 years ago
digger yu 1baeb39c72 [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742) 2 years ago
digger-yu ad6460cf2c [NFC] fix typo applications/ and colossalai/ (#3735) 2 years ago
digger-yu b9a8dff7e5 [doc] Fix typo under colossalai and doc(#3618) 2 years ago
YuliangLiu0306 ffcdbf0f65 [autoparallel]integrate auto parallel feature with new tracer (#3408) 2 years ago
ver217 26b7aac0be [zero] reorganize zero/gemini folder structure (#3424) 2 years ago
Frank Lee 638a07a7f9 [test] fixed gemini plugin test (#3411) 2 years ago
YuliangLiu0306 fee2af8610 [autoparallel] adapt autoparallel with new analyzer (#3261) 2 years ago
Zihao 18dbe76cae [auto-parallel] add auto-offload feature (#3154) 2 years ago
YuliangLiu0306 47fb214b3b [hotfix] add shard dim to aviod backward communication error (#2954) 2 years ago
YuliangLiu0306 197d0bf4ed [autoparallel] apply repeat block to reduce solving time (#2912) 2 years ago
YuliangLiu0306 819e25d8b1 [hotfix] fix autoparallel compatibility test issues (#2754) 2 years ago
YuliangLiu0306 0f392d7403 [autoparallel] find repeat blocks (#2854) 2 years ago
Boyuan Yao eae77c831d [autoparallel] Patch meta information for nodes that will not be handled by SPMD solver (#2823) 2 years ago
Boyuan Yao c7764d3f22 [autoparallel] Patch meta information of `torch.where` (#2822) 2 years ago
Boyuan Yao fcc4097efa [autoparallel] Patch meta information of `torch.tanh()` and `torch.nn.Dropout` (#2773) 2 years ago
Boyuan Yao 7ea6bc7f69 [autoparallel] Patch tensor related operations meta information (#2789) 2 years ago
YuliangLiu0306 2059fdd6b0 [hotfix] add copyright for solver and device mesh (#2803) 2 years ago
Boyuan Yao 8593ae1a3f [autoparallel] rotor solver refactor (#2813) 2 years ago
Boyuan Yao a2b43e393d [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760) 2 years ago
YuliangLiu0306 1dc003c169 [autoparallel] distinguish different parallel strategies (#2699) 2 years ago
xyupeng 2fd528b9f4 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/graph_analysis.py code style (#2737) 2 years ago
Zangwei Zheng 1819373e5c [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/batch_norm_handler.py code style (#2728) 2 years ago
ziyuhuang123 d344313533 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725) 2 years ago
Xue Fuzhao e81caeb4bc [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/cost_graph.py code style (#2720) 2 years ago
yuxuan-lou 51c45c2460 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/where_handler.py code style (#2723) 2 years ago
YuliangLiu0306 21d6a48f4d [autoparallel] add shard option (#2696) 2 years ago
YuliangLiu0306 5b24987fa7 [autoparallel] fix parameters sharding bug (#2716) 2 years ago
YuliangLiu0306 cb2c6a2415 [autoparallel] refactor runtime pass (#2644) 2 years ago
YuliangLiu0306 0b2a738393 [autoparallel] remove deprecated codes (#2664) 2 years ago
YuliangLiu0306 7fa6be49d2 [autoparallel] test compatibility for gemini and auto parallel (#2700) 2 years ago
Liu Ziming 6427c406cf [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/strategy_generator.py code style (#2695) 2 years ago