Hongxin Liu
079bf3cb26
[misc] update pre-commit and run all files ( #4752 )
...
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format
2023-09-19 14:20:26 +08:00
Lufang Chen
12c95a9fed
fix runtime prepare pass ( #4502 )
...
Co-authored-by: lufang.chen <lufang.chen@nio.com>
2023-08-30 17:29:38 +08:00
Wenhao Chen
fee553288b
[NFC] polish runtime_preparation_pass style ( #4266 )
2023-07-26 14:12:57 +08:00
digger yu
7f8203af69
fix typos in colossalai/auto_parallel, autochunk, fx/passes, etc. ( #3808 )
2023-05-24 09:01:50 +08:00
digger yu
32f81f14d4
[NFC] fix typos in colossalai/amp, auto_parallel, autochunk ( #3756 )
2023-05-19 13:50:00 +08:00
digger-yu
ad6460cf2c
[NFC] fix typos in applications/ and colossalai/ ( #3735 )
2023-05-15 11:46:25 +08:00
digger-yu
b9a8dff7e5
[doc] Fix typos under colossalai and doc ( #3618 )
...
* Fixed several spelling errors under colossalai
* Fix spelling errors in the colossalai and docs directories
* Carefully changed spelling errors under the example folder
* Update runtime_preparation_pass.py
revert autograft to autograd
* Update search_chunk.py
utile to until
* Update check_installation.py
change misteach to mismatch in line 91
* Update 1D_tensor_parallel.md
revert to perceptron
* Update 2D_tensor_parallel.md
revert to perceptron in line 73
* Update 2p5D_tensor_parallel.md
revert to perceptron in line 71
* Update 3D_tensor_parallel.md
revert to perceptron in line 80
* Update README.md
revert to resnet in line 42
* Update reorder_graph.py
revert to indice in line 7
* Update p2p.py
revert to megatron in line 94
* Update initialize.py
revert to torchrun in line 198
* Update routers.py
change to detailed in line 63
* Update routers.py
change to detailed in line 146
* Update README.md
revert random number in line 402
2023-04-26 11:38:43 +08:00
YuliangLiu0306
ffcdbf0f65
[autoparallel] integrate auto parallel feature with new tracer ( #3408 )
...
* [autoparallel] integrate new analyzer in module level
* unify the profiling method
* polish
* fix no codegen bug
* fix pass bug
* fix liveness test
* polish
2023-04-04 17:40:45 +08:00
YuliangLiu0306
fee2af8610
[autoparallel] adapt autoparallel with new analyzer ( #3261 )
...
* [autoparallel] adapt autoparallel with new analyzer
* fix all node handler tests
* polish
* polish
2023-03-30 17:47:24 +08:00
YuliangLiu0306
5b24987fa7
[autoparallel] fix parameters sharding bug ( #2716 )
2023-02-15 12:25:50 +08:00
YuliangLiu0306
cb2c6a2415
[autoparallel] refactor runtime pass ( #2644 )
...
* [autoparallel] refactor runtime pass
* add unit test
* polish
2023-02-15 10:36:19 +08:00
YuliangLiu0306
7fa6be49d2
[autoparallel] test compatibility for gemini and auto parallel ( #2700 )
2023-02-15 09:43:29 +08:00
YuliangLiu0306
28398f1c70
add overlap option ( #2613 )
2023-02-08 15:02:31 +08:00
YuliangLiu0306
aa0f6686f9
[autoparallel] accelerate gpt2 training ( #2495 )
2023-01-29 11:13:15 +08:00
YuliangLiu0306
67e1912b59
[autoparallel] support original activation ckpt on autoparallel system ( #2468 )
2023-01-16 16:25:13 +08:00
Boyuan Yao
5c2ef9fc76
[autoparallel] modify comm nodes' memory cost in construct chain ( #2263 )
...
* [autoparallel] align the data_ptr with the old version of auto activation checkpoint pipeline
* [autoparallel] using fwd_time and bwd_time instead of fwd_flop and bwd_flop
* [autoparallel] specify comm nodes' memory cost in construct chain
2023-01-03 11:38:48 +08:00
Boyuan Yao
1ea99b869e
[autoparallel] align the data_ptr with the old version of auto activation checkpoint pipeline ( #2261 )
2023-01-03 10:30:15 +08:00
Super Daniel
3ccf58aa76
[autockpt] make it work. ( #2257 )
2023-01-02 23:37:45 +08:00
Boyuan Yao
ab38aebace
[autoparallel] Hook all meta information on ResNet nodes for auto activation checkpoint ( #2248 )
...
* [autoparallel] hook node meta on graph nodes for checkpoint solver
* [autoparallel] polish code
* [autoparallel] restore some node handlers
* colossalai/auto_parallel/passes/meta_info_prop.py
* [autoparallel] remove some unused import
* [autoparallel] hook bwd_mem_out
2023-01-02 16:25:18 +08:00
Super Daniel
b7d0990c61
[autoparallel] fix construct meta info. ( #2245 )
2022-12-30 19:56:44 +08:00
YuliangLiu0306
3b1b91eaf4
[autoparallel] record parameter attribute in colotracer ( #2217 )
...
* [autoparallel] record parameter attribute in colotracer
* [autoparallel] fix construct_meta_info bug
2022-12-28 19:29:08 +08:00
Boyuan Yao
24246f7aa5
[autoparallel] Attach input, buffer and output tensor to MetaInfo class ( #2162 )
...
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
* [fx] add conv metainfo class
* [fx] restore profiler
* [fx] restore meta profiler
* [autoparallel] modify unit test
* [fx] modify unit test
* [autoparallel] add batchnorm metainfo class
* [autoparallel] fix batchnorm unit test function declaration
* [fx] restore profiler
* [fx] add relu metainfo class
* [fx] restore profiler
* [autoparallel] modify metainfo input
* [autoparallel] add pooling metainfo
* [autoparallel] add F.linear metainfo generator
* [autoparallel] add binary elementwise metainfo
* [fx] recover profiler
* [autoparallel] fix forward memory calculation
* [autoparallel] modify constants.py
* [autoparallel] remove redundant print
* [autoparallel] add F.conv metainfo
* [autoparallel] linear fix
* [autoparallel] memory estimation for communication actions
* [autoparallel] fix docstring
* [autoparallel] fix variables name
* [autoparallel] attach tensor to metainfo class
* [autoparallel] fix dangerous try except
* [autoparallel] attach memory cost to shape consistency node
* [autoparallel] attach shape consistency node's metainfo to the node
* [autoparallel] remove todo in shape consistency memory estimation
* [autoparallel] fix the annotation
2022-12-28 13:37:40 +08:00
Boyuan Yao
d0bc5a1b34
[autoparallel] new metainfoprop based on metainfo class ( #2179 )
...
* [autoparallel] new metainfoprop to combine SPMD solver and checkpoint solver
* [autoparallel] new metainfoprop to combine SPMD solver and checkpoint solver
* [autoparallel] modify placeholder handler
* [autoparallel] modify metainfoprop
* [autoparallel] fix function typo
* [autoparallel] fix placeholder handler
2022-12-28 13:35:08 +08:00
YuliangLiu0306
78509124d3
[autoparallel] update getitem handler ( #2207 )
2022-12-27 19:58:32 +08:00
YuliangLiu0306
4851f2d607
[autoparallel] update_getattr_handler ( #2193 )
2022-12-26 21:57:39 +08:00
YuliangLiu0306
550f8f8905
[autoparallel] integrate_gpt_related_tests ( #2134 )
...
* [autoparallel] integrate_gpt_related_tests
* polish code
* polish code
* add GPT2Model into runtime test
2022-12-23 12:36:59 +08:00
YuliangLiu0306
a3c6924deb
[autoparallel] process size nodes in runtime pass ( #2130 )
...
* [autoparallel] process size nodes in runtime pass
* polish code
2022-12-14 16:10:50 +08:00
YuliangLiu0306
cd0af9f7f6
[autoparallel] gpt2lp runtime test ( #2113 )
2022-12-12 18:06:40 +08:00
YuliangLiu0306
e4293e5077
[hotfix] update test for latest version ( #2060 )
2022-12-02 18:12:30 +08:00
YuliangLiu0306
0dbcd4a6f5
[autoparallel] add split handler ( #2032 )
...
* [autoparallel] add split handler
* add numerical test and runtime passes
2022-11-29 11:03:51 +08:00
YuliangLiu0306
81330b0352
[autoparallel] add experimental permute handler ( #2029 )
2022-11-27 20:26:52 +08:00
YuliangLiu0306
ea0f6b8df9
[autoparallel] add runtime pass and numerical test for view handler ( #2018 )
2022-11-25 15:50:16 +08:00
YuliangLiu0306
36c0f3ea5b
[autoparallel] remove redundant comm node ( #1893 )
2022-11-15 10:53:41 +08:00
YuliangLiu0306
1b494ad73c
[autoparallel] fix linear logical convert issue ( #1857 )
2022-11-10 17:19:22 +08:00
YuliangLiu0306
f6032ddb17
[autoparallel] fix bias addition module ( #1800 )
2022-11-08 16:21:25 +08:00
YuliangLiu0306
b4cc59b61e
[autoparallel] add numerical test for node strategies ( #1760 )
...
* [autoparallel] add numerical test for node strategies
* polish code
* polish code
2022-10-27 10:42:54 +08:00
YuliangLiu0306
314d8c497f
[autoparallel] refactor the runtime apply pass and add docstring to passes ( #1757 )
...
* [autoparallel] refactor the runtime apply pass and add docstring to passes
* fix unit test
* polish
2022-10-25 14:32:22 +08:00