binmakeswell
e9635eb493
add explanation for specified version
2 years ago
HELSON
72c9448920
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/operator_handler.py code style ( #1845 )
2 years ago
Genghan Zhang
b25030cc07
[NFC] polish ./colossalai/amp/torch_amp/__init__.py code style ( #1836 )
2 years ago
xyupeng
b0a138aa22
[NFC] polish .github/workflows/build.yml code style ( #1837 )
2 years ago
Sze-qq
95ac4f88ea
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/conv_handler.py code style ( #1829 )
Co-authored-by: siqi <siqi@siqis-MacBook-Pro.local>
2 years ago
Ziyue Jiang
5da03c936d
[NFC] polish colossalai/amp/torch_amp/_grad_scaler.py code style ( #1823 )
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2 years ago
Maruyama_Aya
90833b45dd
[NFC] polish .github/workflows/release_docker.yml code style
2 years ago
shenggan
b0706fbb00
[NFC] polish .github/workflows/submodule.yml code style ( #1822 )
2 years ago
Arsmart1
fc8d8b1b9c
[NFC] polish .github/workflows/draft_github_release_post.yml code style ( #1820 )
2 years ago
Fazzie-Maqianli
399f84d8f6
[NFC] polish colossalai/amp/naive_amp/_fp16_optimizer.py code style ( #1819 )
2 years ago
CsRic
9623ec1b02
[NFC] polish colossalai/amp/naive_amp/_utils.py code style ( #1816 )
* [NFC] polish colossalai/nn/metric/accuracy_2p5d.py code style (#1714 )
* [NFC] polish colossalai/zero/sharded_param/__init__.py code style
* [NFC] polish colossalai/amp/naive_amp/_utils.py code style
Co-authored-by: shenggan <csg19971016@gmail.com>
Co-authored-by: ric <mkkt_bkkt@mail.ustc.edu.cn>
2 years ago
Zangwei Zheng
25993db98a
[NFC] polish .github/workflows/build_gpu_8.yml code style ( #1813 )
2 years ago
Zirui Zhu
244fa3108a
[NFC] polish MANIFEST.in code style ( #1814 )
2 years ago
binmakeswell
3c3714fc2a
[NFC] polish strategies_constructor.py code style ( #1806 )
2 years ago
Jiarui Fang
3ce4463fe6
[utils] remove lazy_memory_allocate from ColoInitContext ( #1844 )
2 years ago
Fazzie-Maqianli
fabed0df3b
Merge pull request #1842 from feifeibear/jiarui/polish
[example] polish diffusion readme
2 years ago
jiaruifang
27211d6267
[example] polish diffusion readme
2 years ago
jiaruifang
cddb4b6f6f
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into main
2 years ago
binmakeswell
4ac7d3ec3b
[doc] polish diffusion README ( #1840 )
2 years ago
binmakeswell
9d3124ac8b
[doc] remove obsolete API demo ( #1833 )
2 years ago
Jiarui Fang
fba34efb5a
version to 0.1.11rc2 ( #1832 )
2 years ago
jiaruifang
267b55f0a6
version to 0.1.11rc2
2 years ago
Jiarui Fang
8a6d28b6c2
[example] remove useless readme in diffusion ( #1831 )
* [NFC] update gitignore remove DS_Store
* [version] upgrade the version to 0.1.11rc2
2 years ago
Jiarui Fang
f86a703bcf
[NFC] update gitignore remove DS_Store ( #1830 )
2 years ago
Jiarui Fang
a25f755331
[example] add TP to GPT example ( #1828 )
2 years ago
YuliangLiu0306
49216d7ab1
[autoparallel] fix bugs caused by negative dim key ( #1808 )
* [autoparallel] fix bugs caused by negative dim key
* fix import error
* fix matmul test issue
* fix unit test issue
2 years ago
アマデウス
4268ae017b
[kernel] added jit warmup ( #1792 )
2 years ago
binmakeswell
76e64cb67c
[doc] add diffusion ( #1827 )
2 years ago
YuliangLiu0306
f6032ddb17
[autoparallel] fix bias addition module ( #1800 )
2 years ago
Fazzie-Maqianli
6e9730d7ab
[example] add stable diffusion ( #1825 )
2 years ago
Jiarui Fang
b1263d32ba
[example] simplify the GPT2 huggingface example ( #1826 )
2 years ago
Jiarui Fang
cd5a0d56fa
[Gemini] make gemini usage simple ( #1821 )
2 years ago
ver217
99870726b1
[CheckpointIO] a uniform checkpoint I/O module ( #1689 )
2 years ago
Boyuan Yao
629172b319
[autoparallel] add batch norm metainfo ( #1815 )
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
* [fx] add conv metainfo class
* [fx] restore profiler
* [fx] restore meta profiler
* [autoparallel] modify unit test
* [fx] modify unit test
* [autoparallel] add batchnorm metainfo class
* [autoparallel] fix batchnorm unit test function declaration
* [fx] restore profiler
2 years ago
Maruyama_Aya
a648d061ba
Merge pull request #1817 from MaruyamaAya/main
add ColoDiffusion code: /ldm/module/, /ldm/data/, /scripts/test/
2 years ago
Maruyama_Aya
a7e8159da6
add ColoDiffusion code: /ldm/module/, /ldm/data/, /scripts/test/
2 years ago
Super Daniel
441d584e4a
[fx] add a symbolic_trace api. ( #1812 )
* [fx] add a symbolic_trace api.
* [fx] fix import errors.
2 years ago
Jiarui Fang
350ccc0481
[example] opt does not depend on Titans ( #1811 )
2 years ago
Jiarui Fang
6fa71d65d3
[fx] skip diffusers unit test if it is not installed ( #1799 )
2 years ago
Jiarui Fang
203ca57aed
[example] add GPT
2 years ago
Jiarui Fang
fd2c8d8156
[example] add opt model in language ( #1809 )
2 years ago
xcnick
e0da01ea71
[hotfix] fix build error when torch version >= 1.13 ( #1803 )
2 years ago
Jiarui Fang
f5a92c288c
[example] add diffusion to example ( #1805 )
2 years ago
oahzxl
9639ea88fc
[kernel] more flexible flashatt interface ( #1804 )
2 years ago
Zihao
20e255d4e8
MemStatsCollectorStatic ( #1765 )
2 years ago
Boyuan Yao
327d07c44a
[autoparallel] add conv metainfo class for auto parallel ( #1796 )
* [fx] metainfo class for auto parallel
* [fx] add unit test for linear metainfo
* [fx] fix bwd param for linear
* [fx] modify unit test
* [fx] modify unit test
* [fx] modify import
* [fx] modify import
* [fx] modify import
* [fx] move meta profiler to auto parallel
* [fx] add conv metainfo class
* [fx] restore profiler
* [fx] restore meta profiler
* [autoparallel] modify unit test
* [fx] modify unit test
2 years ago
oahzxl
501a9e9cd2
[hotfix] polish flash attention ( #1802 )
2 years ago
Jiarui Fang
218c75fd9d
[NFC] polish type hint for shape consistency ( #1801 )
* [NFC] polish type hint for shape consistency
* polish code
* polish code
2 years ago
Jiarui Fang
c248800359
[kernel] skip tests of flash_attn and triton when they are not available ( #1798 )
2 years ago
YuliangLiu0306
e34e850a4c
[autoparallel] add essential CommActions for broadcast operands ( #1793 )
2 years ago