Commit Graph

2746 Commits (d8ceeac14e54c5c568e916c061b86d9a53a54f30)

Author SHA1 Message Date
Jie Zhu f867365aba bug fix: pass hook_list to engine (#273)
* bug fix: pass hook_list to engine

* change parameter name
2022-03-11 15:50:28 +08:00
Jiarui Fang 5a560a060a Feature/zero (#279)
* add zero1 (#209)

* add zero1

* add test zero1

* update zero stage 1 develop (#212)

* Implement naive zero3 (#240)

* naive zero3 works well

* add zero3 param manager

* add TODOs in comments

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* fix bugs of hook and add unit tests (#252)

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* polish code and add state dict hook

* fix bug

* update unit test

* refactor reconstructed zero code

* clip_grad support zero3 and add unit test

* add unit test for Zero3ParameterManager

* [WIP] initialize the shard param class

* [WIP] Yet another sharded model implementation (#274)

* [WIP] initialize the shard param class

* [WIP] Yet another implementation of shardModel, using a better hook method.

* torch.concat -> torch.cat

* fix test_zero_level_1.py::test_zero_level_1 unit test

* remove deepspeed implementation and refactor for the reconstructed zero module

* polish zero dp unittests

Co-authored-by: ver217 <lhx0217@gmail.com>
Co-authored-by: Frank Lee <somerlee.9@gmail.com>
2022-03-11 15:50:28 +08:00
binmakeswell 08eccfe681 add community group and update issue template (#271) 2022-03-11 15:50:28 +08:00
Sze-qq 3312d716a0 update experimental visualization (#253) 2022-03-11 15:50:28 +08:00
binmakeswell 753035edd3 add Chinese README 2022-03-11 15:50:28 +08:00
1SAA 82023779bb Added TPExpert for special situation 2022-03-11 15:50:28 +08:00
HELSON 36b8477228 Fixed parameter initialization in FFNExpert (#251) 2022-03-11 15:50:28 +08:00
アマデウス e13293bb4c fixed CI dataset directory; fixed import error of 2.5d accuracy (#255) 2022-03-11 15:50:28 +08:00
1SAA 219df6e685 Optimized MoE layer and fixed some bugs;
Decreased moe tests;

Added FFNExperts and ViTMoE model
2022-03-11 15:50:28 +08:00
zbian 3dba070580 fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial 2022-03-11 15:50:28 +08:00
ver217 24f8583cc4 update setup info (#233) 2022-03-11 15:50:28 +08:00
github-actions b9f8521f8c Automated submodule synchronization 2022-02-15 11:35:37 +08:00
Frank Lee f5ca88ec97 fixed apex import (#227) 2022-02-15 11:31:13 +08:00
Frank Lee eb3fda4c28 updated readme and change log (#224) 2022-02-15 11:31:13 +08:00
ver217 578ea0583b update setup and workflow (#222) 2022-02-15 11:31:13 +08:00
Frank Lee 3a1a9820b0 fixed mkdir conflict and align yapf config with flake (#220) 2022-02-15 11:31:13 +08:00
Frank Lee 65e72983dc added flake8 config (#219) 2022-02-15 11:31:13 +08:00
アマデウス 9ee197d0e9 moved env variables to global variables; (#215)
added branch context;
added vocab parallel layers;
moved split_batch from load_batch to tensor parallel embedding layers;
updated gpt model;
updated unit test cases;
fixed few collective communicator bugs
2022-02-15 11:31:13 +08:00
Frank Lee b82d60be02 updated github action for develop branch (#214) 2022-02-15 11:31:13 +08:00
BoxiangW 7d15ec7fe2 Update github actions (#205) 2022-02-04 15:04:55 +08:00
github-actions[bot] 5420809f43 Automated submodule synchronization (#203)
Co-authored-by: github-actions <github-actions@github.com>
2022-02-04 10:19:38 +08:00
Frank Lee fd570ab285 add changelog and contributing doc (#202) 2022-02-03 15:38:00 +08:00
Frank Lee 02f13fa9d1 add code quality badge (#201) 2022-02-03 14:01:09 +08:00
Frank Lee 812357d63c fixed utils docstring and add example to readme (#200) 2022-02-03 11:37:17 +08:00
Frank Lee b9a761b9b8 added github action to synchronize submodule commits automatically (#193) 2022-02-03 11:23:45 +08:00
BoxiangW a2f1565672 Update GitHub action and pre-commit settings (#196)
* Update GitHub action and pre-commit settings

* Update GitHub action and pre-commit settings (#198)
2022-01-28 16:59:53 +08:00
Frank Lee 765db512b5 fixed ddp bug on torch 1.8 (#194) 2022-01-28 15:14:04 +08:00
Jiarui Fang 569357fea0 add pytorch hooks (#179)
* add pytorch hooks
fix #175

* remove licenses in src code

* add gpu memory tracer

* replacing print with logger in ophooks.
2022-01-25 22:20:54 +08:00
ver217 708404d5f8 fix pipeline forward return tensors (#176) 2022-01-21 15:46:02 +08:00
WANG-CR 6fb550acdb update logo 2022-01-21 12:31:07 +08:00
HELSON 0f8c7f9804 Fixed docstring in colossalai (#171) 2022-01-21 10:44:30 +08:00
Frank Lee e2089c5c15 adapted for sequence parallel (#163) 2022-01-20 13:44:51 +08:00
Frank Lee a2e649da39 update readme (#168) 2022-01-20 13:26:38 +08:00
Frank Lee 9684bdce5c fixed submodule url (#167)
* added examples as submodule

* update submodule url
2022-01-19 22:33:39 +08:00
BoxiangW bd4840f1f1 Update workflow files and README.md (#166) 2022-01-19 20:15:14 +08:00
ver217 1949d3a889 update doc requirements and rtd conf (#165) 2022-01-19 19:46:43 +08:00
Frank Lee be85a0f366 removed tutorial markdown and refreshed rst files for consistency 2022-01-19 17:01:37 +08:00
Frank Lee ca4ae52d6b Set examples as submodule (#162)
* remove examples folder

* added examples as submodule

* update .gitmodules
2022-01-19 16:35:36 +08:00
binmakeswell 17ce8569a8 add logo at homepage, add forum in issue template (#161) 2022-01-19 14:29:31 +08:00
puck_WCR 9473a1b9c8 AMP docstring/markdown update (#160) 2022-01-18 18:33:36 +08:00
Frank Lee 2499faa2db update benchmark commit id (#159) 2022-01-18 17:14:00 +08:00
LuGY_mac d143396cac Added rand augment and updated the dataloader 2022-01-18 16:14:46 +08:00
Frank Lee c7b8ece736 set benchmarks as a git submodule (#156)
* remove benchmark folder

* added benchmark submodule

* update .gitmodules
2022-01-18 15:48:07 +08:00
Frank Lee f3802d6b06 fixed jit default setting (#154) 2022-01-18 13:37:20 +08:00
Frank Lee a1da3900c8 added docker documentation (#152) 2022-01-18 13:35:18 +08:00
ver217 7bf1e98b97 pipeline last stage supports multi output (#151) 2022-01-17 15:57:47 +08:00
HELSON 1ff5be36c2 Added moe parallel example (#140) 2022-01-17 15:34:04 +08:00
ver217 f68eddfb3d refactor kernel (#142) 2022-01-13 16:47:17 +08:00
BoxiangW 4a3d3446b0 Update layer integration documentation (#108)
Update the documentation of layer integration

Update _log_hook.py

Update _operation.py
2022-01-10 18:05:58 +08:00
binmakeswell 3a61d785b5 add doc issue template (#133) 2022-01-10 10:21:12 +08:00