Commit Graph

11 Commits (f8b9aaef47d5a2b66db87b5d2b093639a66a131f)

Author SHA1 Message Date
Jiarui Fang 4d9332b4c5
[refactor] moving memtracer to gemini (#801) 2022-04-19 10:13:08 +08:00
ver217 f69507dd22
update rst (#615) 2022-04-01 15:46:38 +08:00
Liang Bowen 2c45efc398
html refactor (#555) 2022-03-31 11:36:56 +08:00
LuGY c44d797072
[docs] updatad docs of hybrid adam and cpu adam (#552) 2022-03-30 18:14:59 +08:00
ver217 ffca99d187
[doc] update apidoc (#530) 2022-03-25 18:29:43 +08:00
ver217 7e30068a22
[doc] update rst (#470) 2022-03-21 10:52:45 +08:00
  * update rst
  * remove empty rst
Frank Lee be85a0f366
removed tutorial markdown and refreshed rst files for consistency 2022-01-19 17:01:37 +08:00
Frank Lee 35813ed3c4
update examples and sphnix docs for the new api (#63) 2021-12-13 22:07:01 +08:00
Frank Lee 3defa32aee
Support TP-compatible Torch AMP and Update trainer API (#27) 2021-11-18 19:45:06 +08:00
  * Add gradient accumulation, fix lr scheduler
  * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
  * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
  * fixed trainer
  * Revert "fixed trainer"
    This reverts commit 2e0b0b7699.
  * improved consistency between trainer, engine and schedule (#23)
  Co-authored-by: 1SAA <c2h214748@gmail.com>
  Co-authored-by: 1SAA <c2h214748@gmail.com>
  Co-authored-by: ver217 <lhx0217@gmail.com>
ver217 3c7604ba30
update documentation 2021-10-29 09:29:20 +08:00
zbian 404ecbdcc6
Migrated project 2021-10-28 18:21:23 +02:00