Commit Graph

21 Commits (7080a8edb08400d97ba4c31458f532e4ceeacf4b)

Author SHA1 Message Date
Jiarui Fang bc0e271e71
[buider] use builder() for cpu adam and fused optim in setup.py (#2187) 2022-12-23 16:05:13 +08:00
Frank Lee 81e0da7fa8
[setup] supported conda-installed torch (#2048)
* [setup] supported conda-installed torch

* polish code
2022-11-30 16:45:15 +08:00
Jiarui Fang 6fa71d65d3
[fx] skip diffusers unitest if it is not installed (#1799) 2022-11-08 11:45:23 +08:00
Super Daniel 5ea89f6456
[CI] downgrade fbgemm. (#1778) 2022-10-31 18:18:45 +08:00
oahzxl 25952b67d7
[feat] add flash attention (#1762) 2022-10-26 16:15:52 +08:00
Super Daniel b893342f95
[fx] test tracer on diffuser modules. (#1750)
* [fx] test tracer on diffuser modules.

* [fx] shorter seq_len.

* Update requirements-test.txt
2022-10-20 18:25:05 +08:00
Jiarui Fang 504419d261
[FAW] add cache manager for the cached embedding (#1419) 2022-08-09 15:17:17 +08:00
Super Daniel be229217ce
[fx] add torchaudio test (#1369)
* [fx]add torchaudio test

* [fx]add torchaudio test

* [fx] add torchaudio test

* [fx] add torchaudio test

* [fx] add torchaudio test

* [fx] add torchaudio test

* [fx] add torchaudio test

* [fx] add torchaudio test and test patches

* Delete ~

* [fx] add patches and patches test

* [fx] add patches and patches test

* [fx] fix patches

* [fx] fix rnn patches

* [fx] fix rnn patches

* [fx] fix rnn patches

* [fx] fix rnn patches

* [fx] merge upstream

* [fx] fix import errors
2022-07-27 11:03:14 +08:00
Boyuan Yao bb640ec728
[fx] Add colotracer compatibility test on torchrec (#1370) 2022-07-26 17:54:39 +08:00
Frank Lee b2475d8c5c
[fx] fixed unit tests for torch 1.12 (#1327) 2022-07-15 18:22:15 +08:00
YuliangLiu0306 9feff0f760
[titans]remove model zoo (#1042)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* rm model zoo
2022-05-31 10:40:47 +08:00
Frank Lee cf6d1c9284
[CLI] refactored the launch CLI and fixed bugs in multi-node launching (#844)
* [cli] fixed multi-node job launching

* [cli] fixed a bug in version comparison

* [cli] support launching with env var

* [cli] fixed multi-node job launching

* [cli] fixed a bug in version comparison

* [cli] support launching with env var

* added docstring

* [cli] added extra launch arguments

* [cli] added default launch rdzv args

* [cli] fixed version comparison

* [cli] added docstring examples and requierment

* polish docstring

* polish code

* polish code
2022-04-24 13:26:26 +08:00
Frank Lee 01e9f834f5
[dependency] removed torchvision (#833)
* [dependency] removed torchvision

* fixed transforms
2022-04-22 15:24:35 +08:00
Frank Lee 05d9ae5999
[cli] add missing requirement (#805) 2022-04-19 13:56:59 +08:00
Frank Lee 6f7d1362c9
[doc] removed outdated installation command (#730) 2022-04-12 11:56:45 +08:00
ver217 70e8dd418b
[hotfix] update requirements-test (#701) 2022-04-08 16:52:36 +08:00
Jiarui Fang 54229cd33e
[log] better logging display with rich (#426)
* better logger using rich

* remove deepspeed in zero requirements
2022-03-16 09:51:15 +08:00
ver217 578ea0583b
update setup and workflow (#222) 2022-02-15 11:31:13 +08:00
BoxiangW a2f1565672
Update GitHub action and pre-commit settings (#196)
* Update GitHub action and pre-commit settings

* Update GitHub action and pre-commit settings (#198)
2022-01-28 16:59:53 +08:00
Frank Lee 3defa32aee
Support TP-compatible Torch AMP and Update trainer API (#27)
* Add gradient accumulation, fix lr scheduler

* fix FP16 optimizer and adapted torch amp with tensor parallel (#18)

* fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes

* fixed trainer

* Revert "fixed trainer"

This reverts commit 2e0b0b7699.

* improved consistency between trainer, engine and schedule (#23)

Co-authored-by: 1SAA <c2h214748@gmail.com>

Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
2021-11-18 19:45:06 +08:00
zbian 404ecbdcc6
Migrated project 2021-10-28 18:21:23 +02:00