# ColossalAI/colossalai

Latest commit: Frank Lee · d857f3dbba · [shardformer] supported T5 and its variants (#4045) · 2023-07-04 16:05:01 +08:00
| Path | Last commit message | Last commit date |
| --- | --- | --- |
| _C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| _analyzer | [example] add train resnet/vit with booster example (#3694) | 2023-05-08 10:42:30 +08:00 |
| amp | [bf16] add bf16 support (#3882) | 2023-06-05 15:58:31 +08:00 |
| auto_parallel | [nfc] fix typo colossalai/ applications/ (#3831) | 2023-05-25 16:19:41 +08:00 |
| autochunk | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| booster | [gemini] fix argument naming during chunk configuration searching | 2023-06-25 13:34:15 +08:00 |
| builder | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2022-09-08 22:11:04 +08:00 |
| checkpoint_io | [hotfix] fix import bug in checkpoint_io (#4142) | 2023-07-03 22:14:37 +08:00 |
| cli | Modify torch version requirement to adapt torch 2.0 (#3896) | 2023-06-05 15:57:35 +08:00 |
| cluster | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| communication | [CI] fix some spelling errors (#3707) | 2023-05-10 17:12:03 +08:00 |
| context | [CI] fix some spelling errors (#3707) | 2023-05-10 17:12:03 +08:00 |
| device | [device] support init device mesh from process group (#3990) | 2023-07-04 16:05:01 +08:00 |
| engine | [nfc]fix ColossalaiOptimizer is not defined (#4122) | 2023-06-30 17:23:22 +08:00 |
| fx | [nfc] fix typo colossalai/cli fx kernel (#3847) | 2023-06-02 15:02:45 +08:00 |
| interface | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00 |
| kernel | fix Tensor is not defined (#4129) | 2023-07-03 17:10:18 +08:00 |
| lazy | Revert "[sync] sync feature/shardformer with develop" | 2023-06-09 09:41:27 +08:00 |
| logging | [logger] hotfix, missing _FORMAT (#2231) | 2022-12-29 22:59:39 +08:00 |
| nn | [shardformer] integrated linear 1D with dtensor (#3996) | 2023-07-04 16:05:01 +08:00 |
| pipeline | [nfc]fix typo colossalai/pipeline tensor nn (#3899) | 2023-06-06 14:07:36 +08:00 |
| registry | Remove duplication registry (#1078) | 2022-06-08 07:47:24 +08:00 |
| shardformer | [shardformer] supported T5 and its variants (#4045) | 2023-07-04 16:05:01 +08:00 |
| tensor | [shardformer] removed inplace tensor sharding (#4018) | 2023-07-04 16:05:01 +08:00 |
| testing | [shardformer] supported T5 and its variants (#4045) | 2023-07-04 16:05:01 +08:00 |
| trainer | fix typo with colossalai/trainer utils zero (#3908) | 2023-06-07 16:08:37 +08:00 |
| utils | fix typo with colossalai/trainer utils zero (#3908) | 2023-06-07 16:08:37 +08:00 |
| zero | [gemini] fix argument naming during chunk configuration searching | 2023-06-25 13:34:15 +08:00 |
| __init__.py | [setup] supported conda-installed torch (#2048) | 2022-11-30 16:45:15 +08:00 |
| constants.py | updated tp layers | 2022-11-02 12:19:38 +08:00 |
| core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2022-06-27 09:45:26 +08:00 |
| global_variables.py | [NFC] polish colossalai/global_variables.py code style (#3259) | 2023-03-29 15:22:21 +08:00 |
| initialize.py | [nfc] fix typo colossalai/zero (#3923) | 2023-06-08 00:01:29 +08:00 |