Commit Graph

88 Commits (92f4224867581e741d5a673b6b807b644127fbf7)

Author  SHA1  Message  Date
Jiarui Fang  272ebfb57d  [bug] shard param during initializing the ShardedModelV2 (#381)  3 years ago
Jiarui Fang  b5f43acee3  [zero] find miss code (#378)  3 years ago
Jiarui Fang  6b6002962a  [zero] zero init context collect numel of model (#375)  3 years ago
jiaruifang  d9217e1960  Revert "[zero] bucketized tensor cpu gpu copy (#368)"  3 years ago
Jiarui Fang  00670c870e  [zero] bucketized tensor cpu gpu copy (#368)  3 years ago
Jiarui Fang  44e4891f57  [zero] able to place params on cpu after zero init context (#365)  3 years ago
ver217  253e54d98a  fix grad shape  3 years ago
Jiarui Fang  ea2872073f  [zero] global model data memory tracer (#360)  3 years ago
Jiarui Fang  cb34cd384d  [test] polish zero related unitest (#351)  3 years ago
ver217  d0ae0f2215  [zero] update sharded optim v2 (#334)  3 years ago
jiaruifang  5663616921  polish code  3 years ago
jiaruifang  7977422aeb  add bert for unitest and sharded model is not able to pass the bert case  3 years ago
ver217  1388671699  [zero] Update sharded model v2 using sharded param v2 (#323)  3 years ago
Jiarui Fang  11bddb6e55  [zero] update zero context init with the updated test utils (#327)  3 years ago
Jiarui Fang  de0468c7a8  [zero] zero init context (#321)  3 years ago
LuGY  a3269de5c9  [zero] cpu adam kernel (#288)  3 years ago
Jiarui Fang  90d3aef62c  [zero] yet an improved sharded param (#311)  3 years ago
Jiarui Fang  c9e7d9582d  [zero] polish shard strategy (#310)  3 years ago
ver217  3092317b80  polish code  3 years ago
ver217  36f9a74ab2  fix sharded param hook and unit test  3 years ago
ver217  001ca624dd  impl shard optim v2 and add unit test  3 years ago
Jiarui Fang  74f77e314b  [zero] a shard strategy in granularity of tensor (#307)  3 years ago
Jiarui Fang  80364c7686  [zero] sharded tensor (#305)  3 years ago
ver217  b105371ace  rename shared adam to sharded optim v2  3 years ago
ver217  70814dc22f  fix master params dtype  3 years ago
ver217  795210dd99  add fp32 master params in sharded adam  3 years ago
ver217  a109225bc2  add sharded adam  3 years ago
Jiarui Fang  e17e92c54d  Polish sharded parameter (#297)  3 years ago
ver217  7aef75ca42  [zero] add sharded grad and refactor grad hooks for ShardedModel (#287)  3 years ago
Frank Lee  9afb5c8b2d  fixed typo in ShardParam (#294)  3 years ago
Frank Lee  e17e54e32a  added buffer sync to naive amp model wrapper (#291)  3 years ago
Jiarui Fang  5a560a060a  Feature/zero (#279)  3 years ago
HELSON  0f8c7f9804  Fixed docstring in colossalai (#171)  3 years ago
ver217  9ef05ed1fc  try import deepspeed when using zero (#130)  3 years ago
Frank Lee  91c327cb44  fixed zero level 3 dtype bug (#76)  3 years ago
Frank Lee  35813ed3c4  update examples and sphnix docs for the new api (#63)  3 years ago
ver217  7d3711058f  fix zero3 fp16 and add zero3 model context (#62)  3 years ago
Frank Lee  da01c234e1  Develop/experiments (#59)  3 years ago