Commit Graph

3705 Commits (5ecc27e1509575e605c47279f243491e20968558)

Author | SHA1 | Message | Date
binmakeswell | deaf99f4c9 | [readme] sync CN readme (#766) | 3 years ago
ver217 | 6e553748a7 | polish sharded optim docstr and warning (#770) | 3 years ago
LuGY | 80e37eec42 | fix the ckpt bugs when using DDP (#769) | 3 years ago
Jiarui Fang | 1f698f4406 | [readme] polish readme (#764) | 3 years ago
Frank Lee | 920fe31526 | [compatibility] used backward-compatible API for global process group (#758) | 3 years ago
Frank Lee | 4ea49cb536 | [test] added a decorator for address already in use error with backward compatibility (#760) | 3 years ago
Jiarui Fang | 10ef8afdd2 | [gemini] init genimi individual directory (#754) | 3 years ago
ver217 | dcca614eee | [hotfix] fix test_stateful_tensor_mgr (#762) | 3 years ago
github-actions[bot] | 6978980f6d | Automated submodule synchronization (#751) | 3 years ago
ver217 | a93a7d7364 | [hotfix] fix reuse_fp16_shard of sharded model (#756) | 3 years ago
ver217 | 8f7ce94b8e | [hotfix] fix auto tensor placement policy (#753) | 3 years ago
HELSON | 84c6700b2a | [zero] refactor memstats_collector (#746) | 3 years ago
アマデウス | b8899e0905 | [TP] allow layernorm without bias (#750) | 3 years ago
Jiarui Fang | 3d7dc46d33 | [zero] use factory pattern for tensor_placement_policy (#752) | 3 years ago
ver217 | 4b048a8728 | fix prepare grads in sharded optim (#749) | 3 years ago
ver217 | 097772546e | fix initialize about zero | 3 years ago
ver217 | e396bb71f2 | [zero] add tensor placement policies (#743) | 3 years ago
HELSON | 22c4b88d56 | [zero] refactor ShardedParamV2 for convenience (#742) | 3 years ago
HELSON | 340e59f968 | [utils] add synchronized cuda memory monitor (#740) | 3 years ago
ver217 | e6212f56cd | [hotfix] fix memory leak in backward of sharded model (#741) | 3 years ago
Frank Lee | f4f42d4c3c | [bug] fixed DDP compatibility with torch 1.8 (#739) | 3 years ago
Frank Lee | a4e91bc87f | [bug] fixed grad scaler compatibility with torch 1.8 (#735) | 3 years ago
Jiarui Fang | 53cb584808 | [utils] correct cpu memory used and capacity in the context of multi-process (#726) | 3 years ago
Jiarui Fang | 7db3ccc79b | [hotfix] remove duplicated param register to stateful tensor manager (#728) | 3 years ago
binmakeswell | 600e769a42 | add video (#732) | 3 years ago
Frank Lee | a5c3f072f6 | [bug] removed zero installation requirements (#731) | 3 years ago
HELSON | b9b469ea50 | [moe] add checkpoint for moe zero test (#729) | 3 years ago
Frank Lee | 6f7d1362c9 | [doc] removed outdated installation command (#730) | 3 years ago
FrankLeeeee | e88a498c9c | [test] removed trivial outdated test | 3 years ago
FrankLeeeee | 62b4ce7326 | [test] added missing decorators to model checkpointing tests | 3 years ago
Frank Lee | 1cb7bdad3b | [util] fixed communication API depth with PyTorch 1.9 (#721) | 3 years ago
Frank Lee | 2412429d54 | [util] fixed activation checkpointing on torch 1.9 (#719) | 3 years ago
Frank Lee | 04ff5ea546 | [utils] support detection of number of processes on current node (#723) | 3 years ago
Jiarui Fang | 4d90a7b513 | [refactor] zero directory (#724) | 3 years ago
Frank Lee | 20ab1f5520 | [bug] fixed broken test_found_inf (#725) | 3 years ago
Jiarui Fang | 193dc8dacb | [refactor] refactor the memory utils (#715) | 3 years ago
HELSON | dbd96fe90a | [zero] check whether gradients have inf and nan in gpu (#712) | 3 years ago
ver217 | 715b86eadd | [hotfix] fix stm cuda model data size (#710) | 3 years ago
LuGY | 140263a394 | [hotfix]fixed bugs of assigning grad states to non leaf nodes (#711) | 3 years ago
Frank Lee | eda30a058e | [compatibility] fixed tensor parallel compatibility with torch 1.9 (#700) | 3 years ago
HELSON | a9b8300d54 | [zero] improve adaptability for not-shard parameters (#708) | 3 years ago
ver217 | ab8c6b4a0e | [zero] refactor memstats collector (#706) | 3 years ago
アマデウス | 3fc8a204dc | []Corrected 3d vocab parallel embedding (#707) | 3 years ago
HELSON | ee112fe1da | [zero] adapt zero hooks for unsharded module (#699) | 3 years ago
binmakeswell | 896ade15d6 | add PaLM link (#704) (#705) | 3 years ago
binmakeswell | 270157e9e7 | add PaLM link (#704) | 3 years ago
ver217 | 3c9cd5bb5e | [zero] stateful tensor manager (#687) | 3 years ago
ver217 | 70e8dd418b | [hotfix] update requirements-test (#701) | 3 years ago
Frank Lee | 1ae94ea85a | [ci] remove ipc config for rootless docker (#694) | 3 years ago
github-actions[bot] | d878d843ad | Automated submodule synchronization (#695) | 3 years ago