Hongxin Liu | 079bf3cb26 | 1 year ago
[misc] update pre-commit and run all files (#4752)
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format

Hongxin Liu | b5f9e37c70 | 1 year ago
[legacy] clean up legacy code (#4743)
* [legacy] remove outdated codes of pipeline (#4692)
* [legacy] remove cli of benchmark and update optim (#4690)
* [legacy] remove cli of benchmark and update optim
* [doc] fix cli doc test
* [legacy] fix engine clip grad norm
* [legacy] remove outdated colo tensor (#4694)
* [legacy] remove outdated colo tensor
* [test] fix test import
* [legacy] move outdated zero to legacy (#4696)
* [legacy] clean up utils (#4700)
* [legacy] clean up utils
* [example] update examples
* [legacy] clean up amp
* [legacy] fix amp module
* [legacy] clean up gpc (#4742)
* [legacy] clean up context
* [legacy] clean core, constants and global vars
* [legacy] refactor initialize
* [example] fix examples ci
* [example] fix examples ci
* [legacy] fix tests
* [example] fix gpt example
* [example] fix examples ci
* [devops] fix ci installation
* [example] fix examples ci

Jiarui Fang | b3b89865e2 | 2 years ago
[Gemini] ParamOpHook -> ColoParamOpHook (#2080)

YuliangLiu0306 | 49216d7ab1 | 2 years ago
[autoparallel] fix bugs caused by negative dim key (#1808)
* [autoparallel] fix bugs caused by negative dim key
* fix import error
* fix matmul test issue
* fix unit test issue

YuliangLiu0306 | 3f068d1409 | 2 years ago
[autoparallel] update CommSpec (#1667)

Jiarui Fang | 36824a304c | 2 years ago
[Doc] add more doc for ColoTensor. (#1458)

HELSON | 943a96323e | 2 years ago
[hotfix] fix no optimizer in save/load (#1363)

Jiarui Fang | 9bcd2fd4af | 2 years ago
[tensor] a shorter shard and replicate spec (#1245)

Jiarui Fang | ae7d3f4927 | 2 years ago
[refactor] move process group from _DistSpec to ColoTensor. (#1203)

Jiarui Fang | c463f8adf9 | 2 years ago
[tensor] remove gpc in tensor tests (#1186)

Jiarui Fang | 372f791444 | 2 years ago
[refactor] move chunk and chunkmgr to directory gemini (#1182)

Jiarui Fang | 7487215b95 | 2 years ago
[ColoTensor] add independent process group (#1179)

Jiarui Fang | f4ef224358 | 2 years ago
[Tensor] remove ParallelAction, use ComputeSpec instread (#1166)

ver217 | 895c1c5ee7 | 2 years ago
[tensor] refactor param op hook (#1097)
* refactor param op hook
* add docstr
* fix bug

Jiarui Fang | a00644079e | 3 years ago
reorgnize colotensor directory (#1062)
* reorgnize colotensor directory
* polish code

ver217 | 9492a561c3 | 3 years ago
[tensor] ColoTensor supports ZeRo (#1015)
* impl chunk manager
* impl param op hook
* add reduce_chunk
* add zero hook v2
* add zero dp
* fix TensorInfo
* impl load balancing when using zero without chunk
* fix zero hook
* polish chunk
* fix bugs
* ddp ok
* zero ok
* polish code
* fix bugs about load balancing
* polish code
* polish code
* add ene-to-end test
* polish code
* polish code
* polish code
* fix typo
* add test_chunk
* fix bugs
* fix bugs
* polish code

Ziyue Jiang | 6c5996a56e | 3 years ago
[Tensor] add module check and bert test (#1031)
* add Embedding
* Add bert test
* polish
* add check module test
* polish
* polish
* polish
* polish

Ziyue Jiang | 32291dd73f | 3 years ago
[Tensor] add module handler for linear (#1021)
* add module spec for linear
* polish
* polish
* polish

ver217 | ad536e308e | 3 years ago
[tensor] refactor colo-tensor (#992)
* refactor colo-tensor and update linear op
* polish code
* polish code
* update ops and unit tests
* update unit tests
* polish code
* rename dist_spec module
* polish code
* polish code
* remove unneeded import
* fix pipelinable

ver217 | 67c33f57eb | 3 years ago
[tensor] design DistSpec and DistSpecManager for ColoTensor (#934)
* add dist spec
* update linear op
* polish code
* polish code
* update embedding op
* polish unit tests
* polish unit tests
* polish comments
* polish code
* add test_dist_spec_mgr
* polish code
* refactor folder structure
* polish unit tests
* add get_process_group() for TensorSpec
* polish code

Jiarui Fang | ab95ec9aea | 3 years ago
[Tensor] init ColoParameter (#914)

Jiarui Fang | d16671da75 | 3 years ago
[Tensor] initialize the ColoOptimizer (#898)
* [Tensor] activation is an attr of ColoTensor
* [Tensor] add optimizer
* only detach parameters in context
* polish code

Ziyue Jiang | cb182da7c5 | 3 years ago
[tensor] refine linear and add gather for laynorm (#893)
* refine linear and add function to ColoTensor
* add gather for layernorm
* polish
* polish

Jiarui Fang | e43f83aa5c | 3 years ago
[Tensor] get named parameters for model using ColoTensors (#874)

Ziyue Jiang | 26d4ab8b03 | 3 years ago
[Tensor] Add function to spec and update linear 1Drow and unit tests (#869)

Jiarui Fang | 0ce8924ceb | 3 years ago
[tensor] reorganize files (#820)