Jiarui Fang | 214da761d4 | [zero] add stateful tensor (#549) | 2022-03-30 13:51:37 +08:00
Jiarui Fang | c11ff81b15 | [zero] get memory usage of sharded optim v2. (#542) | 2022-03-29 09:08:18 +08:00
Jiarui Fang | a590ed0ba3 | [zero] improve the accuracy of get_memory_usage of sharded param (#538) | 2022-03-28 16:19:19 +08:00
Jiarui Fang | 37cb70feec | [zero] get memory usage for sharded param (#536) | 2022-03-28 15:01:21 +08:00
ver217 | 9ec1ce6ab1 | [zero] sharded model support the reuse of fp16 shard (#495) | 2022-03-23 14:59:59 +08:00
    * sharded model supports reuse fp16 shard
    * rename variable
    * polish code
    * polish code
    * polish code
ver217 | 62b0a8d644 | [zero] sharded optim support hybrid cpu adam (#486) | 2022-03-22 14:56:59 +08:00
    * sharded optim support hybrid cpu adam
    * update unit test
    * polish docstring
Jiarui Fang | b334822163 | [zero] polish sharded param name (#484) | 2022-03-22 14:36:16 +08:00
    * [zero] polish sharded param name
    * polish code
    * polish
    * polish code
    * polish
    * polsih
    * polish
ver217 | 9506a8beb2 | use double buffer to handle grad | 2022-03-16 14:24:09 +08:00
ver217 | 1388671699 | [zero] Update sharded model v2 using sharded param v2 (#323) | 2022-03-11 15:50:28 +08:00
Jiarui Fang | 11bddb6e55 | [zero] update zero context init with the updated test utils (#327) | 2022-03-11 15:50:28 +08:00
Jiarui Fang | de0468c7a8 | [zero] zero init context (#321) | 2022-03-11 15:50:28 +08:00
    * add zero init context
    * add more flags for zero init context
      fix bug of repeated converting param to ShardedParamV2
    * polish code
Jiarui Fang | 90d3aef62c | [zero] yet an improved sharded param (#311) | 2022-03-11 15:50:28 +08:00
ver217 | 36f9a74ab2 | fix sharded param hook and unit test | 2022-03-11 15:50:28 +08:00
Jiarui Fang | 80364c7686 | [zero] sharded tensor (#305) | 2022-03-11 15:50:28 +08:00
    * init shard param from shape tuple
    * add more unitest for shard param
    * add set_payload method for ShardedParam
    * [zero] add shareded tensor class
    * polish code
Jiarui Fang | e17e92c54d | Polish sharded parameter (#297) | 2022-03-11 15:50:28 +08:00
    * init shard param from shape tuple
    * add more unitest for shard param
    * add more unittests to shareded param