Commit Graph

53 Commits (b72b8445c69f56f6522791585897c595dc89df7d)

Author       SHA1        Message (Date)
Jiarui Fang  496cbb0760  [hotfix] fix initialize bug with zero (#442)  (3 years ago)
Jiarui Fang  640a6cd304  [refactory] refactory the initialize method for new zero design (#431)  (3 years ago)
ver217       fce9432f08  sync before creating empty grad  (3 years ago)
ver217       ea6905a898  free param.grad  (3 years ago)
ver217       9506a8beb2  use double buffer to handle grad  (3 years ago)
Jiarui Fang  adebb3e041  [zero] cuda margin space for OS (#418)  (3 years ago)
Jiarui Fang  56bb412e72  [polish] use GLOBAL_MODEL_DATA_TRACER (#417)  (3 years ago)
Jiarui Fang  23ba3fc450  [zero] refactory ShardedOptimV2 init method (#416)  (3 years ago)
Frank Lee    e79ea44247  [fp16] refactored fp16 optimizer (#392)  (3 years ago)
Jiarui Fang  21dc54e019  [zero] memtracer to record cuda memory usage of model data and overall system (#395)  (3 years ago)
Jiarui Fang  370f567e7d  [zero] new interface for ShardedOptimv2 (#406)  (3 years ago)
ver217       63469c0f91  polish code  (3 years ago)
ver217       88804aee49  add bucket tensor shard strategy  (3 years ago)
HELSON       7c079d9c33  [hotfix] fixed bugs in ShardStrategy and PcieProfiler (#394)  (3 years ago)
Jiarui Fang  3af13a2c3e  [zero] polish ShardedOptimV2 unittest (#385)  (3 years ago)
Jiarui Fang  272ebfb57d  [bug] shard param during initializing the ShardedModelV2 (#381)  (3 years ago)
Jiarui Fang  b5f43acee3  [zero] find miss code (#378)  (3 years ago)
Jiarui Fang  6b6002962a  [zero] zero init context collect numel of model (#375)  (3 years ago)
jiaruifang   d9217e1960  Revert "[zero] bucketized tensor cpu gpu copy (#368)"  (3 years ago)
Jiarui Fang  00670c870e  [zero] bucketized tensor cpu gpu copy (#368)  (3 years ago)
Jiarui Fang  44e4891f57  [zero] able to place params on cpu after zero init context (#365)  (3 years ago)
ver217       253e54d98a  fix grad shape  (3 years ago)
Jiarui Fang  ea2872073f  [zero] global model data memory tracer (#360)  (3 years ago)
Jiarui Fang  cb34cd384d  [test] polish zero related unitest (#351)  (3 years ago)
ver217       d0ae0f2215  [zero] update sharded optim v2 (#334)  (3 years ago)
jiaruifang   5663616921  polish code  (3 years ago)
jiaruifang   7977422aeb  add bert for unitest and sharded model is not able to pass the bert case  (3 years ago)
ver217       1388671699  [zero] Update sharded model v2 using sharded param v2 (#323)  (3 years ago)
Jiarui Fang  11bddb6e55  [zero] update zero context init with the updated test utils (#327)  (3 years ago)
Jiarui Fang  de0468c7a8  [zero] zero init context (#321)  (3 years ago)
LuGY         a3269de5c9  [zero] cpu adam kernel (#288)  (3 years ago)
Jiarui Fang  90d3aef62c  [zero] yet an improved sharded param (#311)  (3 years ago)
Jiarui Fang  c9e7d9582d  [zero] polish shard strategy (#310)  (3 years ago)
ver217       3092317b80  polish code  (3 years ago)
ver217       36f9a74ab2  fix sharded param hook and unit test  (3 years ago)
ver217       001ca624dd  impl shard optim v2 and add unit test  (3 years ago)
Jiarui Fang  74f77e314b  [zero] a shard strategy in granularity of tensor (#307)  (3 years ago)
Jiarui Fang  80364c7686  [zero] sharded tensor (#305)  (3 years ago)
ver217       b105371ace  rename shared adam to sharded optim v2  (3 years ago)
ver217       70814dc22f  fix master params dtype  (3 years ago)
ver217       795210dd99  add fp32 master params in sharded adam  (3 years ago)
ver217       a109225bc2  add sharded adam  (3 years ago)
Jiarui Fang  e17e92c54d  Polish sharded parameter (#297)  (3 years ago)
ver217       7aef75ca42  [zero] add sharded grad and refactor grad hooks for ShardedModel (#287)  (3 years ago)
Frank Lee    9afb5c8b2d  fixed typo in ShardParam (#294)  (3 years ago)
Frank Lee    e17e54e32a  added buffer sync to naive amp model wrapper (#291)  (3 years ago)
Jiarui Fang  5a560a060a  Feature/zero (#279)  (3 years ago)
HELSON       0f8c7f9804  Fixed docstring in colossalai (#171)  (3 years ago)
ver217       9ef05ed1fc  try import deepspeed when using zero (#130)  (3 years ago)
Frank Lee    91c327cb44  fixed zero level 3 dtype bug (#76)  (3 years ago)