Jiarui Fang
cd5a0d56fa
[Gemini] make gemini usage simple (#1821)
2 years ago
ver217
99870726b1
[CheckpointIO] a uniform checkpoint I/O module (#1689)
2 years ago
Boyuan Yao
629172b319
[autoparallel] add batch norm metainfo (#1815)
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
* [fx] add conv metainfo class
* [fx] restore profiler
* [fx] restore meta profiler
* [autoparallel] modify unit test
* [fx] modify unit test
* [autoparallel] add batchnorm metainfo class
* [autoparallel] fix batchnorm unit test function declaration
* [fx] restore profiler
2 years ago
Maruyama_Aya
a648d061ba
Merge pull request #1817 from MaruyamaAya/main
add ColoDiffusion code: /ldm/module/, /ldm/data/, /scripts/test/
2 years ago
Maruyama_Aya
a7e8159da6
add ColoDiffusion codes: /ldm/module/, /ldm/data/, /scripts/test/
2 years ago
Super Daniel
441d584e4a
[fx] add a symbolic_trace api. (#1812)
* [fx] add a symbolic_trace api.
* [fx] fix import errors.
2 years ago
Jiarui Fang
350ccc0481
[example] opt does not depend on Titans (#1811)
2 years ago
Jiarui Fang
6fa71d65d3
[fx] skip diffusers unit test if it is not installed (#1799)
2 years ago
Jiarui Fang
203ca57aed
[example] add GPT
2 years ago
Jiarui Fang
fd2c8d8156
[example] add opt model in language (#1809)
2 years ago
xcnick
e0da01ea71
[hotfix] fix build error when torch version >= 1.13 (#1803)
2 years ago
Jiarui Fang
f5a92c288c
[example] add diffusion to example (#1805)
2 years ago
oahzxl
9639ea88fc
[kernel] more flexible flashatt interface (#1804)
2 years ago
Zihao
20e255d4e8
MemStatsCollectorStatic (#1765)
2 years ago
Boyuan Yao
327d07c44a
[autoparallel] add conv metainfo class for auto parallel (#1796)
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
* [fx] add conv metainfo class
* [fx] restore profiler
* [fx] restore meta profiler
* [autoparallel] modify unit test
* [fx] modify unit test
2 years ago
oahzxl
501a9e9cd2
[hotfix] polish flash attention (#1802)
2 years ago
Jiarui Fang
218c75fd9d
[NFC] polish type hint for shape consistency (#1801)
* [NFC] polish type hint for shape consistency
* polish code
* polish code
2 years ago
Jiarui Fang
c248800359
[kernel] skip tests of flash_attn and triton when they are not available (#1798)
2 years ago
YuliangLiu0306
e34e850a4c
[autoparallel] add essential CommActions for broadcast operands (#1793)
2 years ago
Boyuan Yao
05ce3d369f
[fx] Add linear metainfo class for auto parallel (#1783)
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
2 years ago
Super Daniel
e8a9bebc87
[autoparallel] refactor and add rotorc. (#1789)
* [autoparallel] refactor and add rotorc.
* [autoparallel] refactor and add rotorc.
2 years ago
github-actions[bot]
4d6e1284cb
Automated submodule synchronization (#1785)
Co-authored-by: github-actions <github-actions@github.com>
2 years ago
YuliangLiu0306
2c4c7b3618
[autoparallel] add getattr handler (#1767)
* [autoparallel] add getattr handler
* polish code
* add extra processes for Parameters
* add unit test for param resharding cost
* add docstring and polish test
2 years ago
HELSON
c6a1a62636
[hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)
* [hotfix] fix zero's incompatibility with checkpoint in torch-1.12
* [zero] add cpu shard init
* [zero] add tiny example test
* [colo_tensor] fix bugs for torch-1.11
2 years ago
Jiarui Fang
32c1b843a9
skip torchrec unittests if not installed (#1790)
2 years ago
kurisusnowdeng
0b8161fab8
updated tp layers
2 years ago
Jiarui Fang
cb5a587e9a
[hotfix] polish chunk import (#1787)
2 years ago
YuliangLiu0306
e859380bf7
[fx] support module with bias addition (#1780)
* [autoparallel] refactor tracer to fix bias addition issue
* [fx] support module with bias addition
* create bias_addition_module
* refactor file structure
* polish code
* fix unit test
2 years ago
Frank Lee
f3f19a5c47
[autoparallel] added matmul handler (#1763)
* [autoparallel] added matmul handler
* polish code
2 years ago
Ziyue Jiang
4df0194976
[Pipeline] Adapt to Pipelinable OPT (#1782)
2 years ago
YuliangLiu0306
27de252334
[autoparallel] fix conv handler numerical test (#1771)
2 years ago
Super Daniel
1e88811c7a
[autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)
* [autoparallel] first move.
* [autoparallel] add solver rotor.
* [autoparallel] add ckpt solvers.
* [autoparallel] modify codegen.
* [fx] fix annotation in test.
* [fx] remove check.
* [autoparallel] polish docstring.
* [fx] refactor MetaTensor.
2 years ago
github-actions[bot]
2b859502d5
Automated submodule synchronization (#1781)
Co-authored-by: github-actions <github-actions@github.com>
2 years ago
Super Daniel
5ea89f6456
[CI] downgrade fbgemm. (#1778)
2 years ago
Jiarui Fang
f34dab4270
[compatibility] ChunkMgr import error (#1772)
2 years ago
YuliangLiu0306
a4d1f59c78
[autoparallel] add numerical test for handlers (#1769)
2 years ago
YuliangLiu0306
b0f7c8bde8
[autoparallel] update CommSpec to CommActions (#1768)
* [autoparallel] update CommSpec to CommActions
* polish code
2 years ago
binmakeswell
16b0abf94f
[doc] add FastFold (#1766)
2 years ago
YuliangLiu0306
b4cc59b61e
[autoparallel] add numerical test for node strategies (#1760)
* [autoparallel] add numerical test for node strategies
* polish code
* polish code
2 years ago
oahzxl
25952b67d7
[feat] add flash attention (#1762)
2 years ago
Super Daniel
0584654c79
[fx] refactor memory utils and extend shard utils. (#1754)
* [fx] change memory.py to memory_utils.py.
* [fx] add shard utils.
* [fx] fix import.
* [fx] check code style.
* [fx] add comment.
* [autoparallel] first move.
* [fx] add time computations.
2 years ago
Ziyue Jiang
63f250bbd4
fix file name (#1759)
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2 years ago
YuliangLiu0306
314d8c497f
[autoparallel] refactor the runtime apply pass and add docstring to passes (#1757)
* [autoparallel] refactor the runtime apply pass and add docstring to passes
* fix unit test
* polish
2 years ago
Frank Lee
f9a613d660
[autoparallel] added binary elementwise node handler (#1758)
* [autoparallel] added binary elementwise node handler
* polish code
2 years ago
YuliangLiu0306
d2fc067231
[autoparallel] fix param hook issue in transform pass (#1755)
2 years ago
Frank Lee
262652c8bc
[autoparallel] added addbmm handler (#1751)
2 years ago
YuliangLiu0306
980ed21723
[autoparallel] shard param and buffer as expected (#1753)
* [autoparallel] shard param and buffer as expected
* fix unit test issue
2 years ago
YuliangLiu0306
cdb7d5e7d2
[hotfix] autoparallel unit test (#1752)
2 years ago
YuliangLiu0306
a4ce180e85
[autoparallel] add sequential order to communication actions (#1735)
2 years ago
Super Daniel
b893342f95
[fx] test tracer on diffuser modules. (#1750)
* [fx] test tracer on diffuser modules.
* [fx] shorter seq_len.
* Update requirements-test.txt
2 years ago