Jiarui Fang
504419d261
[FAW] add cache manager for the cached embedding (#1419)
2022-08-09 15:17:17 +08:00
Frank Lee
cf6d1c9284
[CLI] refactored the launch CLI and fixed bugs in multi-node launching (#844)
* [cli] fixed multi-node job launching
* [cli] fixed a bug in version comparison
* [cli] support launching with env var
* added docstring
* [cli] added extra launch arguments
* [cli] added default launch rdzv args
* [cli] fixed version comparison
* [cli] added docstring examples and requirement
* polish docstring
* polish code
* polish code
2022-04-24 13:26:26 +08:00
Frank Lee
01e9f834f5
[dependency] removed torchvision (#833)
* [dependency] removed torchvision
* fixed transforms
2022-04-22 15:24:35 +08:00
Frank Lee
05d9ae5999
[cli] add missing requirement (#805)
2022-04-19 13:56:59 +08:00
Jiarui Fang
54229cd33e
[log] better logging display with rich (#426)
* better logger using rich
* remove deepspeed in zero requirements
2022-03-16 09:51:15 +08:00
BoxiangW
a2f1565672
Update GitHub action and pre-commit settings (#196)
* Update GitHub action and pre-commit settings
* Update GitHub action and pre-commit settings (#198)
2022-01-28 16:59:53 +08:00
Frank Lee
3defa32aee
Support TP-compatible Torch AMP and Update trainer API (#27)
* Add gradient accumulation, fix lr scheduler
* fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
* fixed compatibility bugs between torch amp and tensor parallel, plus minor fixes
* fixed trainer
* Revert "fixed trainer"
This reverts commit 2e0b0b7699.
* improved consistency between trainer, engine and schedule (#23)
Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
2021-11-18 19:45:06 +08:00
zbian
404ecbdcc6
Migrated project
2021-10-28 18:21:23 +02:00