ColossalAI/colossalai/nn

Latest commit: 3defa32aee by Frank Lee
Support TP-compatible Torch AMP and Update trainer API (#27)
* Add gradient accumulation, fix LR scheduler (a sketch of the combined AMP + accumulation pattern follows this log)

* fix FP16 optimizer and adapt Torch AMP to tensor parallelism (#18)

* fixed compatibility bugs between Torch AMP and tensor parallelism, plus some minor fixes

* fixed trainer

* Revert "fixed trainer"

This reverts commit 2e0b0b7699.

* improved consistency between the trainer, engine, and schedule (#23)

Co-authored-by: 1SAA <c2h214748@gmail.com>

Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
Committed: 2021-11-18 19:45:06 +08:00
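The first two items, gradient accumulation under Torch AMP, combine into a standard PyTorch training pattern. Below is a minimal sketch of that pattern only, not ColossalAI's actual trainer code; the model, data, and accum_steps value are hypothetical placeholders.

    import torch
    from torch.cuda.amp import autocast, GradScaler

    # Hypothetical stand-ins; the real trainer wraps this pattern behind its API.
    model = torch.nn.Linear(512, 512).cuda()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    scaler = GradScaler()
    accum_steps = 4  # micro-batches accumulated per optimizer step
    loader = [(torch.randn(8, 512), torch.randn(8, 512)) for _ in range(16)]

    for step, (x, y) in enumerate(loader):
        with autocast():  # forward pass in mixed precision
            loss = torch.nn.functional.mse_loss(model(x.cuda()), y.cuda())
        # Divide by accum_steps so the accumulated gradient matches the
        # full-batch gradient, then scale it to keep FP16 grads in range.
        scaler.scale(loss / accum_steps).backward()
        if (step + 1) % accum_steps == 0:
            scaler.step(optimizer)  # unscales grads; skips the step on inf/NaN
            scaler.update()         # adjusts the loss scale for later steps
            optimizer.zero_grad()

Accumulating several micro-batches before scaler.step() lets small per-GPU batches emulate a larger effective batch without extra activation memory.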
Name                Last commit                                                    Date
data                Migrated project                                               2021-10-28 18:21:23 +02:00
layer               Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
loss                Migrated project                                               2021-10-28 18:21:23 +02:00
lr_scheduler        Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
model               Migrated project                                               2021-10-28 18:21:23 +02:00
multi_tensor_apply  Migrated project                                               2021-10-28 18:21:23 +02:00
optimizer           Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
__init__.py         Migrated project                                               2021-10-28 18:21:23 +02:00
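On the TP-compatibility fix named in the commit log above (which touched the optimizer and lr_scheduler entries): under tensor parallelism each rank holds only a shard of the parameters, so an AMP loss scaler's inf/NaN verdict must be agreed on across the whole tensor-parallel group before any rank steps its optimizer, otherwise the shards drift out of sync. Here is a minimal sketch of that collective check using plain torch.distributed; the tp_group handle and params list are assumptions for illustration, not ColossalAI's actual API.

    import torch
    import torch.distributed as dist

    def grads_finite_across_tp(params, tp_group):
        # Local verdict: did any gradient shard on this rank overflow?
        found_inf = torch.zeros(1, device="cuda")
        for p in params:
            if p.grad is not None and not torch.isfinite(p.grad).all():
                found_inf.fill_(1.0)
                break
        # Collective verdict: if any rank overflowed, every rank must skip
        # the step, or the parameter shards would fall out of sync.
        dist.all_reduce(found_inf, op=dist.ReduceOp.MAX, group=tp_group)
        return found_inf.item() == 0.0

A TP-aware scaler would run such a check between backward() and the optimizer step, stepping only when it returns True and lowering the loss scale otherwise.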