18 Commits (8241c0c054b38a109ed3ce7be1052a1e600b8471)

Author        SHA1        Message                                                     Date
Hongxin Liu   da15fdb9ca  [doc] add lazy init docs (#4808)                            1 year ago
Hongxin Liu   4965c0dabd  [lazy] support from_pretrained (#4801)                      1 year ago
Hongxin Liu   3e05c07bb8  [lazy] support torch 2.0 (#4763)                            1 year ago
Hongxin Liu   079bf3cb26  [misc] update pre-commit and run all files (#4752)          1 year ago
Hongxin Liu   890774b2fb  [shardformer] support lazy init (#4202)                     1 year ago
Hongxin Liu   fc5cef2c79  [lazy] support init on cuda (#4269)                         1 year ago
Frank Lee     c4b1b65931  [test] fixed tests failed due to dtensor change (#4082)     1 year ago
Frank Lee     8eb09a4c69  [shardformer] support module saving and loading (#4062)    1 year ago
Frank Lee     ddcf58cacf  Revert "[sync] sync feature/shardformer with develop"      1 year ago
Frank Lee     eb39154d40  [dtensor] updated api and doc (#3845)                       1 year ago
Hongxin Liu   9c88b6cbd1  [lazy] fix compatibility problem on torch 1.13 (#3911)     1 year ago
Hongxin Liu   dbb32692d2  [lazy] refactor lazy init (#3891)                           1 year ago
Hongxin Liu   4341f5e8e6  [lazyinit] fix clone and deepcopy (#3553)                   2 years ago
Hongxin Liu   152239bbfa  [gemini] gemini supports lazy init (#3379)                  2 years ago
ver217        f8289d4221  [lazyinit] combine lazy tensor with dtensor (#3204)         2 years ago
ver217        6ae8ed0407  [lazyinit] add correctness verification (#3147)             2 years ago
ver217        ed8f60b93b  [lazyinit] refactor lazy tensor and lazy init ctx (#3131)   2 years ago
Super Daniel  35c0c0006e  [utils] lazy init. (#2148)                                  2 years ago