Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago
* [fp8] support fp8 amp for hybrid parallel plugin
* [test] add fp8 hook test
* [fp8] fix fp8 linear compatibility

Hongxin Liu | 2dd01e3a14 | [gemini] fix param op hook when output is tuple (#5355) | 10 months ago
* [gemini] fix param op hook when output is tuple
* [gemini] fix param op hook

Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format

Hongxin Liu | 27061426f7 | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago
* [gemini] remove distributed-related part from colotensor (#4379)
* [gemini] remove process group dependency
* [gemini] remove tp part from colo tensor
* [gemini] patch inplace op
* [gemini] fix param op hook and update tests
* [test] remove useless tests
* [test] remove useless tests
* [misc] fix requirements
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [test] fix model zoo
* [misc] update requirements
* [gemini] refactor gemini optimizer and gemini ddp (#4398)
* [gemini] update optimizer interface
* [gemini] renaming gemini optimizer
* [gemini] refactor gemini ddp class
* [example] update gemini related example
* [example] update gemini related example
* [plugin] fix gemini plugin args
* [test] update gemini ckpt tests
* [gemini] fix checkpoint io
* [example] fix opt example requirements
* [example] fix opt example
* [example] fix opt example
* [example] fix opt example
* [gemini] add static placement policy (#4443)
* [gemini] add static placement policy
* [gemini] fix param offload
* [test] update gemini tests
* [plugin] update gemini plugin
* [plugin] update gemini plugin docstr
* [misc] fix flash attn requirement
* [test] fix gemini checkpoint io test
* [example] update resnet example result (#4457)
* [example] update bert example result (#4458)
* [doc] update gemini doc (#4468)
* [example] update gemini related examples (#4473)
* [example] update gpt example
* [example] update dreambooth example
* [example] update vit
* [example] update opt
* [example] update palm
* [example] update vit and opt benchmark
* [hotfix] fix bert in model zoo (#4480)
* [hotfix] fix bert in model zoo
* [test] remove chatglm gemini test
* [test] remove sam gemini test
* [test] remove vit gemini test
* [hotfix] fix opt tutorial example (#4497)
* [hotfix] fix opt tutorial example
* [hotfix] fix opt tutorial example

digger yu | 0e484e6201 | [nfc]fix typo colossalai/pipeline tensor nn (#3899) | 2 years ago
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
* fix typo colossalai/ applications/
* fix typo colossalai/cli fx kernel
* fix typo colossalai/nn
* revert change warmuped
* fix typo colossalai/pipeline tensor nn

YH | 8f740deb53 | Fix typo (#3448) | 2 years ago

1SAA | 33f3023e19 | [hotfix] fix implement error in diffusers | 2 years ago

HELSON | 2458659919 | [zero] fix error for BEiT models (#2169) | 2 years ago
* [zero] fix error for BEiT models
* [ColoParameter] add unpack operation for tuple arguments
* fix bugs
* fix chunkv2 unit testing
* add assertion for gradient state

Jiarui Fang | e99edfcb51 | [NFC] polish comments for Chunk class (#2116) | 2 years ago

Jiarui Fang | b3b89865e2 | [Gemini] ParamOpHook -> ColoParamOpHook (#2080) | 2 years ago

YuliangLiu0306 | 49216d7ab1 | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago
* [autoparallel] fix bugs caused by negative dim key
* fix import error
* fix matmul test issue
* fix unit test issue

ver217 | d068af81a3 | [doc] update rst and docstring (#1351) | 2 years ago
* update rst
* add zero docstr
* fix docstr
* remove fx.tracer.meta_patch
* fix docstr
* fix docstr
* update fx rst
* fix fx docstr
* remove useless rst

Jiarui Fang | ae7d3f4927 | [refactor] move process group from _DistSpec to ColoTensor. (#1203) | 2 years ago

Jiarui Fang | 4b9bba8116 | [ColoTensor] rename APIs and add output_replicate to ComputeSpec (#1168) | 2 years ago

ver217 | 789cad301b | [hotfix] fix param op hook (#1131) | 2 years ago
* fix param op hook
* update zero tp test
* fix bugs

ver217 | 895c1c5ee7 | [tensor] refactor param op hook (#1097) | 2 years ago
* refactor param op hook
* add docstr
* fix bug

ver217 | 9492a561c3 | [tensor] ColoTensor supports ZeRo (#1015) | 3 years ago
* impl chunk manager
* impl param op hook
* add reduce_chunk
* add zero hook v2
* add zero dp
* fix TensorInfo
* impl load balancing when using zero without chunk
* fix zero hook
* polish chunk
* fix bugs
* ddp ok
* zero ok
* polish code
* fix bugs about load balancing
* polish code
* polish code
* add end-to-end test
* polish code
* polish code
* polish code
* fix typo
* add test_chunk
* fix bugs
* fix bugs
* polish code