ColossalAI/colossalai/utils
Latest commit by ver217 (f8289d4221): [lazyinit] combine lazy tensor with dtensor (#3204)
* [lazyinit] lazy tensor add distribute
* [lazyinit] refactor distribute
* [lazyinit] add test dist lazy init
* [lazyinit] add verbose info for dist lazy init
* [lazyinit] fix rnn flatten weight op
* [lazyinit] polish test
* [lazyinit] polish test
* [lazyinit] fix lazy tensor data setter
* [lazyinit] polish test
* [lazyinit] fix clean
* [lazyinit] make materialize inplace
* [lazyinit] refactor materialize
* [lazyinit] refactor test distribute
* [lazyinit] fix requires_grad
* [lazyinit] fix tolist after materialization
* [lazyinit] refactor distribute module
* [lazyinit] polish docstr
* [lazyinit] polish lazy init context
* [lazyinit] temporarily skip test
* [lazyinit] polish test
* [lazyinit] add docstr
Committed 2023-03-23 10:53:06 +08:00
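The commit trail above centers on deferring tensor construction and materializing it in place. A minimal, language-level sketch of that pattern follows; the names `LazyTensor`, `materialize`, and `zeros` are illustrative only and are not ColossalAI's actual API:

```python
class LazyTensor:
    """Hypothetical placeholder that records how to build its data
    instead of allocating it, then materializes in place on demand."""

    def __init__(self, factory, *args, **kwargs):
        self._factory = factory   # recipe for building the real payload
        self._args = args
        self._kwargs = kwargs
        self._data = None         # nothing allocated yet

    @property
    def materialized(self):
        return self._data is not None

    def materialize(self):
        # Build the real payload in place, then drop the recipe so
        # repeated calls are cheap no-ops on the same object.
        if self._data is None:
            self._data = self._factory(*self._args, **self._kwargs)
            self._factory = self._args = self._kwargs = None
        return self._data


def zeros(n):
    # Stand-in for a real tensor factory (e.g. an allocation routine).
    return [0.0] * n


t = LazyTensor(zeros, 4)      # no allocation happens here
assert not t.materialized
data = t.materialize()        # allocation happens now, in place
assert t.materialized and data == [0.0, 0.0, 0.0, 0.0]
```

The "make materialize inplace" commit suggests the same object identity is kept before and after materialization, which is what the sketch mimics by mutating `self._data` rather than returning a new wrapper.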
checkpoint/
checkpoint_io/
data_sampler/
model/                      [lazyinit] combine lazy tensor with dtensor (#3204), 2023-03-23 10:53:06 +08:00
multi_tensor_apply/
profiler/
rank_recorder/
tensor_detector/
__init__.py
activation_checkpoint.py
checkpointing.py
common.py                   Fix port exception type (#2925), 2023-02-28 11:00:43 +08:00
cuda.py
memory.py
moe.py
timer.py