ColossalAI/colossalai/nn
HELSON 1468e4bcfc
[zero] add constant placement policy (#1705)
* fixes a memory leak when a parameter is already in fp16 during ZeroDDP init.
* bans releasing chunks that reside in CUDA memory; a chunk may be released only when it is about to be offloaded.
* adds a constant placement policy, which lets users reserve a fixed caching memory space for parameters (see the usage sketch below).
2022-10-14 17:53:16 +08:00
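A minimal sketch of how the constant placement policy might be selected, assuming the policy is registered under the name 'const' and the GeminiManager/ChunkManager/ZeroDDP wiring of this era of the codebase; the model, chunk size, and constructor arguments are illustrative and not verified against #1705:

```python
# Hedged sketch: picking the constant placement policy for ZeroDDP.
# Assumptions not confirmed by this listing: the policy name 'const',
# GeminiManager taking (policy_name, chunk_manager), and ChunkManager
# taking a chunk size in elements.
import torch.nn as nn

from colossalai.gemini import ChunkManager, GeminiManager
from colossalai.nn.parallel import ZeroDDP

# Requires a launched distributed context, e.g. via colossalai.launch().
model = nn.Sequential(nn.Linear(256, 256), nn.GELU(), nn.Linear(256, 256))

chunk_manager = ChunkManager(chunk_size=64 * 1024)      # hypothetical chunk size
gemini_manager = GeminiManager('const', chunk_manager)  # select the constant policy
model = ZeroDDP(model, gemini_manager)                  # params managed as fp16 chunks
```

With the constant policy, the reserved caching space for parameters stays fixed rather than being resized by runtime heuristics, which is what makes the chunk-release rule above (release only on offload) workable.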
_ops [hotfix] fix colotensor.type() raising NotImplementedError (#1682) 2022-10-10 10:13:31 +08:00
graph [NFC] polish doc style for ColoTensor (#1457) 2022-08-16 09:21:05 +08:00
layer [moe] fix moe bugs (#1633) 2022-09-23 15:33:57 +08:00
loss [NFC] polish colossalai/nn/loss/loss_2p5d.py code style (#1553) 2022-09-08 22:11:04 +08:00
lr_scheduler [NFC] polish colossalai/nn/lr_scheduler/multistep.py code style (#1572) 2022-09-08 22:11:04 +08:00
metric [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) 2022-04-02 16:12:04 +08:00
optimizer add optimizer README for tutorials (#1707) 2022-10-14 09:10:18 +00:00
parallel [zero] add constant placement policy (#1705) 2022-10-14 17:53:16 +08:00
__init__.py [pipeline] refactor the pipeline module (#1087) 2022-06-10 11:27:38 +08:00
init.py [NFC] polish colossalai/nn/init.py code style (#1292) 2022-07-13 10:51:55 +08:00