Hongxin Liu | 2dd01e3a14 | 2024-02-04 11:58:26 +08:00
[gemini] fix param op hook when output is tuple (#5355)
* [gemini] fix param op hook when output is tuple
* [gemini] fix param op hook

Hongxin Liu | 079bf3cb26 | 2023-09-19 14:20:26 +08:00
[misc] update pre-commit and run all files (#4752)
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format

Hongxin Liu | 27061426f7 | 2023-08-24 09:29:25 +08:00
[gemini] improve compatibility and add static placement policy (#4479)
* [gemini] remove distributed-related part from colotensor (#4379)
* [gemini] remove process group dependency
* [gemini] remove tp part from colo tensor
* [gemini] patch inplace op
* [gemini] fix param op hook and update tests
* [test] remove useless tests
* [test] remove useless tests
* [misc] fix requirements
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [misc] update requirements
* [gemini] refactor gemini optimizer and gemini ddp (#4398)
* [gemini] update optimizer interface
* [gemini] renaming gemini optimizer
* [gemini] refactor gemini ddp class
* [example] update gemini related example
* [example] update gemini related example
* [plugin] fix gemini plugin args
* [test] update gemini ckpt tests
* [gemini] fix checkpoint io
* [example] fix opt example requirements
* [example] fix opt example
* [example] fix opt example
* [example] fix opt example
* [gemini] add static placement policy (#4443)
* [gemini] add static placement policy
* [gemini] fix param offload
* [test] update gemini tests
* [plugin] update gemini plugin
* [plugin] update gemini plugin docstr
* [misc] fix flash attn requirement
* [test] fix gemini checkpoint io test
* [example] update resnet example result (#4457)
* [example] update bert example result (#4458)
* [doc] update gemini doc (#4468)
* [example] update gemini related examples (#4473)
* [example] update gpt example
* [example] update dreambooth example
* [example] update vit
* [example] update opt
* [example] update palm
* [example] update vit and opt benchmark
* [hotfix] fix bert in model zoo (#4480)
* [hotfix] fix bert in model zoo
* [test] remove chatglm gemini test
* [test] remove sam gemini test
* [test] remove vit gemini test
* [hotfix] fix opt tutorial example (#4497)
* [hotfix] fix opt tutorial example
* [hotfix] fix opt tutorial example

digger yu | 0e484e6201 | 2023-06-06 14:07:36 +08:00
[nfc]fix typo colossalai/pipeline tensor nn (#3899)
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
* fix typo colossalai/ applications/
* fix typo colossalai/cli fx kernel
* fix typo colossalai/nn
* revert change warmuped
* fix typo colossalai/pipeline tensor nn

YH | 8f740deb53 | 2023-04-06 09:43:31 +08:00
Fix typo (#3448)

1SAA | 33f3023e19 | 2023-01-06 18:37:18 +08:00
[hotfix] fix implement error in diffusers

HELSON | 2458659919 | 2022-12-26 15:03:54 +08:00
[zero] fix error for BEiT models (#2169)
* [zero] fix error for BEiT models
* [ColoParameter] add unpack operation for tuple arguments
* fix bugs
* fix chunkv2 unit testing
* add assertion for gradient state

Jiarui Fang | e99edfcb51 | 2022-12-12 15:39:31 +08:00
[NFC] polish comments for Chunk class (#2116)

Jiarui Fang | b3b89865e2 | 2022-12-05 17:11:06 +08:00
[Gemini] ParamOpHook -> ColoParamOpHook (#2080)

YuliangLiu0306 | 49216d7ab1 | 2022-11-08 17:03:50 +08:00
[autoparallel] fix bugs caused by negative dim key (#1808)
* [autoparallel] fix bugs caused by negative dim key
* fix import error
* fix matmul test issue
* fix unit test issue

ver217 | d068af81a3 | 2022-07-21 15:54:53 +08:00
[doc] update rst and docstring (#1351)
* update rst
* add zero docstr
* fix docstr
* remove fx.tracer.meta_patch
* fix docstr
* fix docstr
* update fx rst
* fix fx docstr
* remove useless rst

Jiarui Fang | ae7d3f4927 | 2022-07-06 16:15:16 +08:00
[refactor] move process group from _DistSpec to ColoTensor. (#1203)

Jiarui Fang | 4b9bba8116 | 2022-06-24 13:08:54 +08:00
[ColoTensor] rename APIs and add output_replicate to ComputeSpec (#1168)

ver217 | 789cad301b | 2022-06-17 16:12:05 +08:00
[hotfix] fix param op hook (#1131)
* fix param op hook
* update zero tp test
* fix bugs

ver217 | 895c1c5ee7 | 2022-06-13 16:11:53 +08:00
[tensor] refactor param op hook (#1097)
* refactor param op hook
* add docstr
* fix bug

ver217 | 9492a561c3 | 2022-05-31 12:00:12 +08:00
[tensor] ColoTensor supports ZeRo (#1015)
* impl chunk manager
* impl param op hook
* add reduce_chunk
* add zero hook v2
* add zero dp
* fix TensorInfo
* impl load balancing when using zero without chunk
* fix zero hook
* polish chunk
* fix bugs
* ddp ok
* zero ok
* polish code
* fix bugs about load balancing
* polish code
* polish code
* add ene-to-end test
* polish code
* polish code
* polish code
* fix typo
* add test_chunk
* fix bugs
* fix bugs
* polish code