Commit Graph

2062 Commits (e0a1c1321ce6751686e184476d520e173c1d6b8e)

Author SHA1 Message Date
Frank Lee e0a1c1321c
[doc] added reference to related works (#2994)
* [doc] added reference to related works

* polish code
2023-03-04 17:32:22 +08:00
Yasyf Mohamedali 19fa0e57f6
Remove extraneous comma (#2993)
Prevents `TypeError: category must be a Warning subclass, not 'str'`.
2023-03-04 14:44:06 +08:00
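The entry above says the removed comma prevents `TypeError: category must be a Warning subclass, not 'str'`. A minimal illustrative sketch of that failure mode (not the actual ColossalAI code; the message strings are made up): when a stray comma separates two string literals passed to `warnings.warn`, the second literal becomes the positional `category` argument, which must be a `Warning` subclass.

```python
import warnings

# Buggy form: the extraneous comma makes the second string a separate
# positional argument, which warnings.warn interprets as `category`.
try:
    warnings.warn(
        "first half of a long message, ",
        "second half of the message",
    )
except TypeError as exc:
    print(exc)  # category must be a Warning subclass, not 'str'

# Fixed form: without the comma, the adjacent string literals are
# concatenated into a single `message` argument.
warnings.warn(
    "first half of a long message, "
    "second half of the message"
)
```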
Frank Lee 3a5d93bc2c
[kernel] cached the op kernel and fixed version check (#2886)
* [kernel] cached the op kernel and fixed version check

* polish code

* polish code
2023-03-03 21:45:05 +08:00
ver217 0ff8406b00
[chatgpt] allow shard init and display warning (#2986) 2023-03-03 16:27:59 +08:00
BlueRum f5ca0397dd
[chatgpt] fix lora gemini conflict in RM training (#2984)
* fix lora bug

* polish

* fix lora gemini
2023-03-03 15:58:16 +08:00
ver217 19ad49fb3b
[chatgpt] making experience support dp (#2971)
* [chatgpt] making experience support dp

* [chatgpt] update example test ci

* [chatgpt] update example test ci

* [chatgpt] update example test ci

* [chatgpt] update example test ci

* [chatgpt] update sampler

* [chatgpt] update example test ci

* [chatgpt] refactor sampler

* [chatgpt] update example test ci
2023-03-03 15:51:19 +08:00
github-actions[bot] 827a0af8cc
Automated submodule synchronization (#2982)
Co-authored-by: github-actions <github-actions@github.com>
2023-03-03 10:55:45 +08:00
binmakeswell 9b4ceefc21
[doc] update news (#2983)
* [doc] update news

* [doc] update news
2023-03-03 10:41:58 +08:00
BlueRum c9e27f0d1b
[chatgpt]fix lora bug (#2974)
* fix lora bug

* polish
2023-03-02 17:51:44 +08:00
BlueRum 82149e9d1b
[chatgpt] fix inference demo loading bug (#2969)
* [chatgpt] fix inference demo loading bug

* polish
2023-03-02 16:18:33 +08:00
Fazzie-Maqianli bbf9c827c3
[ChatGPT] fix README (#2966)
* Update README.md

* fix README

* Update README.md

* Update README.md

---------

Co-authored-by: fastalgo <youyang@cs.berkeley.edu>
Co-authored-by: BlueRum <70618399+ht-zhou@users.noreply.github.com>
2023-03-02 15:00:05 +08:00
binmakeswell b0a8766381
[doc] fix chatgpt inference typo (#2964) 2023-03-02 11:22:08 +08:00
github-actions[bot] 0d07514988
Automated submodule synchronization (#2951)
Co-authored-by: github-actions <github-actions@github.com>
2023-03-02 09:15:21 +08:00
YuliangLiu0306 e414e4092b
[DTensor] implementation of dtensor (#2946)
* [DTensor] implementation of dtensor

* test layout convert

* polish
2023-03-01 16:34:58 +08:00
BlueRum 489a9566af
[chatgpt]add inference example (#2944)
* [chatgpt] support inference example

* Create inference.sh

* Update README.md

* Delete inference.sh

* Update inference.py
2023-03-01 13:39:39 +08:00
YuliangLiu0306 47fb214b3b
[hotfix] add shard dim to avoid backward communication error (#2954) 2023-03-01 11:41:53 +08:00
ver217 090f14fd6b
[misc] add reference (#2930)
* [misc] add reference

* [misc] add license
2023-02-28 18:07:24 +08:00
github-actions[bot] dca98937f8
[format] applied code formatting on changed files in pull request 2933 (#2939)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-28 15:41:52 +08:00
binmakeswell 8264cd7ef1
[doc] add env scope (#2933) 2023-02-28 15:39:51 +08:00
Frank Lee b8804aa60c
[doc] added readme for documentation (#2935) 2023-02-28 14:04:52 +08:00
Frank Lee 9e3b8b7aff
[doc] removed read-the-docs (#2932) 2023-02-28 11:28:24 +08:00
Frank Lee 77b88a3849
[workflow] added auto doc test on PR (#2929)
* [workflow] added auto doc test on PR

* [workflow] added doc test workflow

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code
2023-02-28 11:10:38 +08:00
YuliangLiu0306 197d0bf4ed
[autoparallel] apply repeat block to reduce solving time (#2912) 2023-02-28 11:03:30 +08:00
YH a848091141
Fix port exception type (#2925) 2023-02-28 11:00:43 +08:00
zbian 61e687831d
fixed using zero with tp cannot access weight correctly 2023-02-28 10:52:30 +08:00
github-actions[bot] eb5cf94332
Automated submodule synchronization (#2927)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-28 10:35:23 +08:00
github-actions[bot] da056285f2
[format] applied code formatting on changed files in pull request 2922 (#2923)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-27 19:29:06 +08:00
binmakeswell 12bafe057f
[doc] update installation for GPT (#2922) 2023-02-27 18:28:34 +08:00
binmakeswell 0afb55fc5b
[doc] add os scope, update tutorial install and tips (#2914) 2023-02-27 14:59:27 +08:00
YH 7b13f7db18
[zero] trivial zero optimizer refactoring (#2869)
* Fix minor grad store interface

* Apply lint
2023-02-27 14:04:53 +08:00
fastalgo dbc01b9c04
Update README.md 2023-02-25 12:27:10 +08:00
Frank Lee e33c043dec
[workflow] moved pre-commit to post-commit (#2895) 2023-02-24 14:41:33 +08:00
Jiatong (Julius) Han 8c8a39be95
[hotfix]: Remove math.prod dependency (#2837)
* Remove math.prod dependency

* Fix style

* Fix style

---------

Co-authored-by: Jiatong Han <jiatong.han@u.nus.edu>
2023-02-23 23:56:15 +08:00
YuliangLiu0306 819e25d8b1
[hotfix] fix autoparallel compatibility test issues (#2754) 2023-02-23 17:28:36 +08:00
YuliangLiu0306 0f392d7403
[autoparallel] find repeat blocks (#2854)
* [autoparallel] find repeat blocks

* polish

* polish

* polish
2023-02-23 17:28:19 +08:00
BlueRum 2e16f842a9
[chatgpt]support opt & gpt for rm training (#2876) 2023-02-22 16:58:11 +08:00
junxu c52edcf0eb
Rename class method of ZeroDDP (#2692) 2023-02-22 15:05:53 +08:00
HELSON 6e4ac08172
[hotfix] fix chunk size can not be divided (#2867)
* [hotfix] fix chunk size can not be divided

* [hotfix] use numpy for python3.8
2023-02-22 15:04:46 +08:00
Alex_996 a4fc125c34
Fix typos (#2863)
Fix typos, `6.7 -> 6.7b`
2023-02-22 10:59:48 +08:00
dawei-wang 55424a16a5
[doc] fix GPT tutorial (#2860)
Fix hpcaitech/ColossalAI#2851
2023-02-22 10:58:52 +08:00
Boyuan Yao eae77c831d
[autoparallel] Patch meta information for nodes that will not be handled by SPMD solver (#2823)
* [autoparallel] non spmd meta information generator

* [autoparallel] patch meta information for non spmd nodes
2023-02-22 10:28:56 +08:00
Boyuan Yao c7764d3f22
[autoparallel] Patch meta information of `torch.where` (#2822)
* [autoparallel] patch meta information of torch.where

* [autoparallel] pre-commit modified
2023-02-22 10:28:21 +08:00
Boyuan Yao fcc4097efa
[autoparallel] Patch meta information of `torch.tanh()` and `torch.nn.Dropout` (#2773)
* [autoparallel] tanh meta information

* [autoparallel] remove redundant code

* [autoparallel] patch meta information of torch.nn.Dropout
2023-02-22 10:27:59 +08:00
BlueRum 34ca324b0d
[chatgpt] Support saving ckpt in examples (#2846)
* [chatgpt]fix train_rm bug with lora

* [chatgpt]support colossalai strategy to train rm

* fix pre-commit

* fix pre-commit 2

* [chatgpt]fix rm eval typo

* fix rm eval

* fix pre commit

* add support of saving ckpt in examples

* fix single-gpu save
2023-02-22 10:00:26 +08:00
Zheng Zeng 597914317b
[doc] fix typo in opt inference tutorial (#2849) 2023-02-21 17:16:13 +08:00
Frank Lee 935346430f
[cli] handled version check exceptions (#2848)
* [cli] handled version check exceptions

* polish code
2023-02-21 17:04:49 +08:00
BlueRum 3eebc4dff7
[chatgpt] fix rm eval (#2829)
* [chatgpt]fix train_rm bug with lora

* [chatgpt]support colossalai strategy to train rm

* fix pre-commit

* fix pre-commit 2

* [chatgpt]fix rm eval typo

* fix rm eval

* fix pre commit
2023-02-21 11:35:45 +08:00
Frank Lee 918bc94b6b
[triton] added copyright information for flash attention (#2835)
* [triton] added copyright information for flash attention

* polish code
2023-02-21 11:25:57 +08:00
Boyuan Yao 7ea6bc7f69
[autoparallel] Patch tensor related operations meta information (#2789)
* [autoparallel] tensor related meta information prototype

* [autoparallel] tensor related meta information

* [autoparallel] tensor related meta information

* [autoparallel] tensor related meta information

* [autoparallel] tensor related meta information
2023-02-20 17:38:55 +08:00
github-actions[bot] a5721229d9
Automated submodule synchronization (#2740)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-20 17:35:46 +08:00