ColossalAI/tests
Latest commit 3defa32aee by Frank Lee: Support TP-compatible Torch AMP and Update trainer API (#27)
* Add gradient accumulation, fix lr scheduler

* fix FP16 optimizer and adapted torch amp with tensor parallel (#18)

* fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes

* fixed trainer

* Revert "fixed trainer"

This reverts commit 2e0b0b7699.

* improved consistency between trainer, engine and schedule (#23)

Co-authored-by: 1SAA <c2h214748@gmail.com>

Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
2021-11-18 19:45:06 +08:00
Directory                            Last commit                                                    Date
test_config                          Migrated project                                               2021-10-28 18:21:23 +02:00
test_context                         Migrated project                                               2021-10-28 18:21:23 +02:00
test_data                            Migrated project                                               2021-10-28 18:21:23 +02:00
test_data_pipeline_tensor_parallel   Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
test_engine                          Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
test_fp16_optimizer                  Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
test_layers                          Migrated project                                               2021-10-28 18:21:23 +02:00
test_lr_scheduler                    Migrated project                                               2021-10-28 18:21:23 +02:00
test_models                          Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
test_trainer                         Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00
test_utils                           Migrated project                                               2021-10-28 18:21:23 +02:00
test_zero_data_parallel              Migrated project                                               2021-10-28 18:21:23 +02:00
test_zero_tensor_parallel            Support TP-compatible Torch AMP and Update trainer API (#27)   2021-11-18 19:45:06 +08:00