mirror of https://github.com/hpcaitech/ColossalAI
1468e4bcfc
* Fixes a memory leak when a parameter is already in fp16 during ZeroDDP init.
* Bans chunk release in CUDA memory; a chunk may be released only when it is about to be offloaded.
* Adds a constant placement policy, with which users can allocate a reserved caching memory space for parameters.
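The commit message above describes a caching discipline: chunks of parameters stay in a fixed, user-reserved device budget, and a chunk's memory may be freed only at offload time. A minimal sketch of that idea follows; all names (`ConstPlacementPolicy`, `cache`, `offload`) are illustrative assumptions, not ColossalAI's actual API.

```python
# Hypothetical sketch of a constant placement policy: chunks are cached
# on the device within a fixed reserved budget, and device memory is
# released only when a chunk is offloaded. Names are illustrative only.

class ConstPlacementPolicy:
    def __init__(self, reserved_bytes: int):
        self.reserved_bytes = reserved_bytes  # user-chosen cache size
        self.cached = {}                      # chunk_id -> size in bytes
        self.used = 0

    def cache(self, chunk_id: int, size: int) -> bool:
        """Keep a chunk on the device only if it fits in the reserve."""
        if self.used + size > self.reserved_bytes:
            return False                      # no eviction here: release
                                              # is tied to offload only
        self.cached[chunk_id] = size
        self.used += size
        return True

    def offload(self, chunk_id: int) -> int:
        """Releasing device memory is allowed only at offload time."""
        size = self.cached.pop(chunk_id)
        self.used -= size
        return size

policy = ConstPlacementPolicy(reserved_bytes=1024)
assert policy.cache(0, 512)
assert policy.cache(1, 512)
assert not policy.cache(2, 1)   # reserve exhausted; nothing is evicted
policy.offload(0)               # frees 512 bytes of the reserve
assert policy.cache(2, 256)
```

The point of a constant reserve is predictability: unlike an adaptive policy, the device cache never grows with model size, so the rest of device memory stays available for activations.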
layers/
__init__.py
data_parallel.py
reducer.py
utils.py