Commit Graph

499 Commits (0e4e62d30d75888ea4c774f42e4cc611adb9f8b0)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Frank Lee | 0e4e62d30d | [tensor] added `__repr__` to spec (#1147) | 2 years ago |
| YuliangLiu0306 | 70dd88e2ee | [pipeline]add customized policy (#1139) | 2 years ago |
| YuliangLiu0306 | 18091581c0 | [pipeline]support more flexible pipeline (#1138) | 2 years ago |
| ver217 | ccf3c58c89 | embedding op use gather_out (#1143) | 2 years ago |
| ver217 | 6690a61b4d | [hotfix] prevent nested ZeRO (#1140) | 2 years ago |
| Frank Lee | 15aab1476e | [zero] avoid zero hook spam by changing log to debug level (#1137) | 2 years ago |
| Frank Lee | 73ad05fc8c | [zero] added error message to handle on-the-fly import of torch Module class (#1135) | 2 years ago |
| ver217 | e4f555f29a | [optim] refactor fused sgd (#1134) | 2 years ago |
| ver217 | d26902645e | [ddp] add save/load state dict for ColoDDP (#1127) | 2 years ago |
| YuliangLiu0306 | 946dbd629d | [hotfix]fix bugs caused by refactored pipeline (#1133) | 2 years ago |
| ver217 | 789cad301b | [hotfix] fix param op hook (#1131) | 2 years ago |
| ver217 | a1a7899cae | [hotfix] fix zero init ctx numel (#1128) | 2 years ago |
| ver217 | f0a954f16d | [ddp] add set_params_to_ignore for ColoDDP (#1122) | 2 years ago |
| YuliangLiu0306 | 3175bcb4d8 | [pipeline]support List of Dict data (#1125) | 2 years ago |
| Frank Lee | 91a5999825 | [ddp] supported customized torch ddp configuration (#1123) | 2 years ago |
| YuliangLiu0306 | fcf55777dd | [fx]add autoparallel passes (#1121) | 2 years ago |
| ver217 | e127b4375b | cast colo ddp v2 inputs/outputs (#1120) | 2 years ago |
| Frank Lee | 16302a5359 | [fx] added unit test for coloproxy (#1119) | 2 years ago |
| ver217 | 7d14b473f0 | [gemini] gemini mgr supports "cpu" placement policy (#1118) | 2 years ago |
| ver217 | f99f56dff4 | fix colo parameter torch function (#1117) | 2 years ago |
| Frank Lee | e1620ddac2 | [fx] added coloproxy (#1115) | 2 years ago |
| Frank Lee | 6f82ac9bcb | [pipeline] supported more flexible dataflow control for pipeline parallel training (#1108) | 2 years ago |
| ver217 | 895c1c5ee7 | [tensor] refactor param op hook (#1097) | 2 years ago |
| YuliangLiu0306 | 1e9f9c227f | [hotfix]change to fit latest p2p (#1100) | 2 years ago |
| Frank Lee | 72bd7c696b | [amp] included dict for type casting of model output (#1102) | 2 years ago |
| Frank Lee | 7f2d2b2b5b | [engine] fixed empty op hook check (#1096) | 3 years ago |
| Frank Lee | 14e5b11d7f | [zero] fixed api consistency (#1098) | 3 years ago |
| Frank Lee | cb18922c47 | [doc] added documentation to chunk and chunk manager (#1094) | 3 years ago |
| ver217 | 1f894e033f | [gemini] zero supports gemini (#1093) | 3 years ago |
| Frank Lee | 2b2dc1c86b | [pipeline] refactor the pipeline module (#1087) | 3 years ago |
| Frank Lee | bad5d4c0a1 | [context] support lazy init of module (#1088) | 3 years ago |
| ver217 | be01db37c8 | [tensor] refactor chunk mgr and impl MemStatsCollectorV2 (#1077) | 3 years ago |
| Frank Lee | 50ec3a7e06 | [test] skip tests when not enough GPUs are detected (#1090) | 3 years ago |
| Ziyue Jiang | 0653c63eaa | [Tensor] 1d row embedding (#1075) | 3 years ago |
| junxu | d66ffb4df4 | Remove duplication registry (#1078) | 3 years ago |
| Jiarui Fang | bcab249565 | fix issue #1080 (#1071) | 3 years ago |
| ver217 | 1b17859328 | [tensor] chunk manager monitor mem usage (#1076) | 3 years ago |
| ver217 | 98cdbf49c6 | [hotfix] fix chunk comm src rank (#1072) | 3 years ago |
| Frank Lee | bfdc5ccb7b | [context] maintain the context object in with statement (#1073) | 3 years ago |
| ver217 | c5cd3b0f35 | [zero] zero optim copy chunk rather than copy tensor (#1070) | 3 years ago |
| Ziyue Jiang | 4fc748f69b | [Tensor] fix optimizer for CPU parallel (#1069) | 3 years ago |
| Jiarui Fang | 49832b2344 | [refactory] add nn.parallel module (#1068) | 3 years ago |
| Ziyue Jiang | 6754f1b77f | fix module utils bug (#1066) | 3 years ago |
| Jiarui Fang | a00644079e | reorgnize colotensor directory (#1062) | 3 years ago |
| Frank Lee | 3d10be33bd | [cudnn] set False to cudnn benchmark by default (#1063) | 3 years ago |
| Ziyue Jiang | df9dcbbff6 | [Tensor] add hybrid device demo and fix bugs (#1059) | 3 years ago |
| YuliangLiu0306 | b167258b6a | [pipeline]refactor ppschedule to support tensor list (#1050) | 3 years ago |
| ver217 | e3fde4ee6b | fix import error in sharded model v2 (#1053) | 3 years ago |
| ver217 | e1922ea4f6 | [zero] add chunk size search for chunk manager (#1052) | 3 years ago |
| アマデウス | 2c42b230f3 | updated collective ops api (#1054) | 3 years ago |