Commit Graph

2269 Commits (d0fbd4b86fcfa653db5c5b7d312f249ce6dad619)

Author SHA1 Message Date
Boyuan Yao 90a9fdd91d
[autoparallel] Patch meta information of `torch.matmul` (#2584)
* [autoparallel] matmul metainfo

* [auto_parallel] remove unused print

* [tests] skip test_matmul_handler when torch version is lower than 1.12.0
2023-02-08 11:05:31 +08:00
Frank Lee 4ae02c4b1c
[tutorial] added energonai to opt inference requirements (#2625) 2023-02-07 16:58:06 +08:00
oahzxl 6ba8364881
[autochunk] support diffusion for autochunk (#2621)
* add alphafold benchmark

* rename alphafold test

* rename tests

* rename diffuser

* rename

* rename

* update transformer

* update benchmark

* update benchmark

* update bench memory

* update transformer benchmark

* rename

* support diffuser

* support unet metainfo prop

* fix bug and simplify code

* update linear and support some op

* optimize max region search, support conv

* update unet test

* support some op

* support groupnorm and interpolate

* update flow search

* add fix dim in node flow

* fix utils

* rename

* support diffusion

* update diffuser

* update chunk search

* optimize imports

* import

* finish autochunk
2023-02-07 16:32:45 +08:00
Frank Lee 291b051171
[doc] fixed broken badge (#2623) 2023-02-07 16:15:17 +08:00
binmakeswell 0556f5d468
[tutorial] add video link (#2619) 2023-02-07 15:14:51 +08:00
Frank Lee 93fdd35b5e
[build] fixed the doc build process (#2618) 2023-02-07 14:36:34 +08:00
Frank Lee 8518263b80
[test] fixed the triton version for testing (#2608) 2023-02-07 13:49:38 +08:00
Frank Lee aa7e9e4794
[workflow] fixed the test coverage report (#2614)
* [workflow] fixed the test coverage report

* polish code
2023-02-07 11:50:53 +08:00
Frank Lee b3973b995a
[workflow] fixed test coverage report (#2611) 2023-02-07 11:02:56 +08:00
github-actions[bot] ae86be1fd2
Automated submodule synchronization (#2607)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-07 09:33:27 +08:00
Frank Lee f566b0ce6b
[workflow] fixed broken release workflows (#2604) 2023-02-06 21:40:19 +08:00
Frank Lee f7458d3ec7
[release] v0.2.1 (#2602)
* [release] v0.2.1

* polish code
2023-02-06 20:46:18 +08:00
Frank Lee 719c4d5553
[doc] updated readme for CI/CD (#2600) 2023-02-06 17:42:15 +08:00
Frank Lee 4d582893a7
[workflow] added cuda extension build test before release (#2598)
* [workflow] added cuda extension build test before release

* polish code
2023-02-06 17:07:41 +08:00
Frank Lee 0c03802bff
[workflow] hooked pypi release with lark (#2596) 2023-02-06 16:29:04 +08:00
Frank Lee fd90245399
[workflow] hooked docker release with lark (#2594) 2023-02-06 16:15:46 +08:00
Frank Lee d6cc8f313e
[workflow] added test-pypi check before release (#2591)
* [workflow] added test-pypi check before release

* polish code
2023-02-06 15:42:08 +08:00
Frank Lee 2059408edc
[workflow] fixed the typo in the example check workflow (#2589) 2023-02-06 15:03:54 +08:00
Frank Lee 5767f8e394
[workflow] hook compatibility test failure to lark (#2586) 2023-02-06 14:56:31 +08:00
Frank Lee 186ddce2c4
[workflow] hook example test alert with lark (#2585) 2023-02-06 14:38:35 +08:00
Frank Lee 788e138960
[workflow] added notification if scheduled build fails (#2574)
* [workflow] added notification if scheduled build fails

* polish code

* polish code
2023-02-06 14:03:13 +08:00
Frank Lee fba08743a8
[setup] fixed inconsistent version meta (#2578) 2023-02-06 13:48:20 +08:00
Frank Lee 8af5a0799b
[workflow] added discussion stats to community report (#2572)
* [workflow] added discussion stats to community report

* polish code
2023-02-06 13:47:59 +08:00
Frank Lee b0c29d1b4c
[workflow] refactored compatibility test workflow for maintainability (#2560) 2023-02-06 13:47:50 +08:00
Frank Lee 76edb04b0d
[workflow] adjust the GPU memory threshold for scheduled unit test (#2558)
* [workflow] adjust the GPU memory threshold for scheduled unit test

* polish code
2023-02-06 13:47:25 +08:00
Frank Lee ba47517342
[workflow] fixed example check workflow (#2554)
* [workflow] fixed example check workflow

* polish yaml
2023-02-06 13:46:52 +08:00
Frank Lee fb1a4c0d96
[doc] fixed issue link in pr template (#2577) 2023-02-06 10:29:24 +08:00
binmakeswell 039b0c487b
[tutorial] polish README (#2568) 2023-02-04 17:49:52 +08:00
Frank Lee 2eb4268b47
[workflow] fixed typos in the leaderboard workflow (#2567) 2023-02-03 17:25:56 +08:00
Frank Lee 7b4ad6e0fc
[workflow] added contributor and user-engagement report (#2564)
* [workflow] added contributor and user-engagement report

* polish code

* polish code
2023-02-03 17:12:35 +08:00
oahzxl 4f5ef73a43
[tutorial] update fastfold tutorial (#2565)
* update readme

* update

* update
2023-02-03 16:54:28 +08:00
Fazzie-Maqianli 79079a9d0c
Merge pull request #2561 from Fazziekey/v2
bug/fix diffusion ckpt problem
2023-02-03 15:42:49 +08:00
Fazzie cad1f50512 fix ckpt 2023-02-03 15:39:59 +08:00
HELSON 552183bb74
[polish] polish ColoTensor and its submodules (#2537) 2023-02-03 11:44:10 +08:00
github-actions[bot] 51d4d6e718
Automated submodule synchronization (#2492)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-03 10:48:15 +08:00
Frank Lee 4af31d263d
[doc] updated the CHANGE_LOG.md for github release page (#2552) 2023-02-03 10:47:27 +08:00
Frank Lee 578374d0de
[doc] fixed the typo in pr template (#2556) 2023-02-03 10:47:00 +08:00
Frank Lee dd14783f75
[kernel] fixed repeated loading of kernels (#2549)
* [kernel] fixed repeated loading of kernels

* polish code

* polish code
2023-02-03 09:47:13 +08:00
Frank Lee 8438c35a5f
[doc] added pull request template (#2550)
* [doc] added pull request template

* polish code

* polish code
2023-02-02 18:16:03 +08:00
ver217 5b1854309a
[hotfix] fix zero ddp warmup check (#2545) 2023-02-02 16:42:38 +08:00
oahzxl fa3d66feb9
support unet metainfo prop (#2544) 2023-02-02 16:19:26 +08:00
oahzxl c4b15661d7
[autochunk] add benchmark for transformer and alphafold (#2543) 2023-02-02 15:06:43 +08:00
binmakeswell 9885ec2b2e
[git] remove invalid submodule (#2540) 2023-02-01 17:54:03 +08:00
oahzxl 05671fcb42
[autochunk] support multi outputs chunk search (#2538)
Support multi-output chunk search. Previously we only supported single-output chunk search. The new strategy is more flexible and improves performance by a large margin; for transformer, it reduces memory by 40% compared with the previous search strategy.

1. rewrite search strategy to support multi outputs chunk search
2. fix many, many bugs
3. update tests
2023-02-01 13:18:51 +08:00
YuliangLiu0306 f477a14f4a
[hotfix] fix autoparallel demo (#2533) 2023-01-31 17:42:45 +08:00
oahzxl 63199c6687
[autochunk] support transformer (#2526) 2023-01-31 16:00:06 +08:00
HELSON 6e0faa70e0
[gemini] add profiler in the demo (#2534) 2023-01-31 14:21:22 +08:00
Fazzie-Maqianli df437ca039
Merge pull request #2532 from Fazziekey/fix
fix README
2023-01-31 10:56:35 +08:00
Fazzie f35326881c fix README 2023-01-31 10:51:13 +08:00
HELSON a4ed9125ac
[hotfix] fix lightning error (#2529) 2023-01-31 10:40:39 +08:00