Commit Graph

1602 Commits (f1bc2418c44c9ddb2b7b0551bd12fd2b83e4531b)
Author SHA1 Message Date
Frank Lee f1bc2418c4
[setup] make cuda extension build optional (#2336)
2 years ago
Frank Lee 8711310cda
[setup] remove torch dependency (#2333)
2 years ago
Fazzie-Maqianli 89f26331e9
[example] diffusion update diffusion,Dreamblooth (#2329)
2 years ago
Frank Lee 6e34cc0830
[workflow] fixed pypi release workflow error (#2328)
2 years ago
Frank Lee 2916eed34a
[workflow] fixed pypi release workflow error (#2327)
2 years ago
Frank Lee 8d8dec09ba
[workflow] added workflow to release to pypi upon version change (#2320)
2 years ago
Frank Lee 693ef121a1
[workflow] removed unused assign reviewer workflow (#2318)
2 years ago
binmakeswell e512ca9c24
[doc] update stable diffusion link (#2322)
2 years ago
Frank Lee e8dfa2e2e0
[workflow] rebuild cuda kernels when kernel-related files change (#2317)
2 years ago
Jiarui Fang db6eea3583
[builder] reconfig op_builder for pypi install (#2314)
2 years ago
Fazzie-Maqianli a9b27b9265
[exmaple] fix dreamblooth format (#2315)
2 years ago
Sze-qq da1c47f060
update ColossalAI logo (#2316)
2 years ago
Junming Wu 4a79c10750
[NFC] polish colossalai/cli/benchmark/__init__.py code style (#2308)
2 years ago
Ofey Chan 87d2defda6
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/layer_norm_handler.py code style (#2305)
2 years ago
ver217 116e3d0b8f
[NFC] polish communication/p2p_v2.py code style (#2303)
2 years ago
xyupeng b965585d05
[NFC] polish colossalai/amp/torch_amp/torch_amp.py code style (#2290)
2 years ago
Zangwei Zheng d1e5bafcd4
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/__init__.py code style (#2291)
2 years ago
shenggan 950685873f
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/reshape_handler.py code style (#2292)
2 years ago
Ziheng Qin 3041014089
[NFC] polish colossalai/amp/naive_amp/grad_scaler/dynamic_grad_scaler.py code style (#2299)
2 years ago
アマデウス 49715a78f0
[NFC] polish colossalai/cli/benchmark/benchmark.py code style (#2287)
2 years ago
Zirui Zhu 1c29b173c9
[NFC] polish colossalai/auto_parallel/tensor_shard/node_handler/getitem_handler.py code style (#2289)
2 years ago
Zihao 3a02b46447
[auto-parallel] refactoring ColoTracer (#2118)
2 years ago
Jiarui Fang 32253315b4
[example] update diffusion readme with official lightning (#2304)
2 years ago
HELSON 5d3a2be3af
[amp] add gradient clipping for unit tests (#2283)
2 years ago
HELSON e00cedd181
[example] update gemini benchmark bash (#2306)
2 years ago
Frank Lee 9b765e7a69
[setup] removed the build dependency on colossalai (#2307)
2 years ago
Boyuan Yao d45695d94e
Merge pull request #2258 from hpcaitech/debug/ckpt-autoparallel
2 years ago
binmakeswell c8144223b8
[doc] update diffusion doc (#2296)
2 years ago
binmakeswell 2fac699923
[doc] update news (#2295)
2 years ago
binmakeswell 4b72b2d4d3
[doc] update news
2 years ago
Jiarui Fang 16cc8e6aa7
[builder] MOE builder (#2277)
2 years ago
Boyuan Yao b904748210
[autoparallel] bypass MetaInfo when unavailable and modify BCAST_FUNC_OP metainfo (#2293)
2 years ago
Jiarui Fang 26e171af6c
[version] 0.1.14 -> 0.2.0 (#2286)
2 years ago
Super Daniel 8ea50d999e
[hotfix] pass a parameter. (#2288)
2 years ago
ZijianYY df1d6dc553
[examples] using args and combining two versions for PaLM (#2284)
2 years ago
zbian e94c79f15b
improved allgather & reducescatter for 3d
2 years ago
binmakeswell c719798abe
[doc] add feature diffusion v2, bloom, auto-parallel (#2282)
2 years ago
HELSON 62c38e3330
[zero] polish low level zero optimizer (#2275)
2 years ago
Ziyue Jiang ac863a01d6
[example] add benchmark (#2276)
2 years ago
Boyuan Yao 22e947f982
[autoparallel] fix runtime apply memory estimation (#2281)
2 years ago
BlueRum 1405b4381e
[example] fix save_load bug for dreambooth (#2280)
2 years ago
Super Daniel 8e8900ff3f
[autockpt] considering parameter and optimizer weights. (#2279)
2 years ago
YuliangLiu0306 f027ef7913
[hotfix] fix fp16 optimzier bug (#2273)
2 years ago
YuliangLiu0306 fb87322773
[autoparallel] fix spelling error (#2270)
2 years ago
Jiarui Fang af32022f74
[Gemini] fix the convert_to_torch_module bug (#2269)
2 years ago
Jiarui Fang 879df8b943
[example] GPT polish readme (#2274)
2 years ago
Ziyue Jiang 9654df0e9a
Add GPT PP Example (#2272)
2 years ago
Super Daniel b0d21d0c4f
[autockpt] linearize / merge shape-consistency nodes. (#2271)
2 years ago
YuliangLiu0306 4b29112ab2
[autoparallel] gpt2 autoparallel examples (#2267)
2 years ago
Ziyue Jiang 8b045b3c1f
[Pipeline Middleware] Reduce comm redundancy by getting accurate output (#2232)
2 years ago