ColossalAI/colossalai
Latest commit: 42e3232bc0 "roll back" by Maruyama_Aya, 2023-06-02 17:00:57 +08:00
| Name | Last commit | Date |
|------|-------------|------|
| `_C` | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| `_analyzer` | [example] add train resnet/vit with booster example (#3694) | 2023-05-08 10:42:30 +08:00 |
| `amp` | [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) | 2023-05-19 13:50:00 +08:00 |
| `auto_parallel` | [nfc] fix typo colossalai/ applications/ (#3831) | 2023-05-25 16:19:41 +08:00 |
| `autochunk` | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| `booster` | [booster] add warning for torch fsdp plugin doc (#3833) | 2023-05-25 14:00:02 +08:00 |
| `builder` | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2022-09-08 22:11:04 +08:00 |
| `checkpoint_io` | [booster] torch fsdp fix ckpt (#3788) | 2023-05-23 16:58:45 +08:00 |
| `cli` | roll back | 2023-06-02 17:00:57 +08:00 |
| `cluster` | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| `communication` | [CI] fix some spelling errors (#3707) | 2023-05-10 17:12:03 +08:00 |
| `context` | [CI] fix some spelling errors (#3707) | 2023-05-10 17:12:03 +08:00 |
| `device` | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| `engine` | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| `fx` | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| `interface` | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00 |
| `kernel` | [doc] Fix typo under colossalai and doc(#3618) | 2023-04-26 11:38:43 +08:00 |
| `logging` | [logger] hotfix, missing _FORMAT (#2231) | 2022-12-29 22:59:39 +08:00 |
| `nn` | [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779) | 2023-05-23 15:28:20 +08:00 |
| `pipeline` | [pipeline] Add Simplified Alpa DP Partition (#2507) | 2023-03-07 10:34:31 +08:00 |
| `registry` | Remove duplication registry (#1078) | 2022-06-08 07:47:24 +08:00 |
| `tensor` | [tensor] Refactor handle_trans_spec in DistSpecManager | 2023-05-06 17:55:37 +08:00 |
| `testing` | [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742) | 2023-05-17 11:13:23 +08:00 |
| `trainer` | [polish] remove useless file _mem_tracer_hook.py (#1963) | 2022-11-16 15:55:10 +08:00 |
| `utils` | [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779) | 2023-05-23 15:28:20 +08:00 |
| `zero` | [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779) | 2023-05-23 15:28:20 +08:00 |
| `__init__.py` | [setup] supported conda-installed torch (#2048) | 2022-11-30 16:45:15 +08:00 |
| `constants.py` | updated tp layers | 2022-11-02 12:19:38 +08:00 |
| `core.py` | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2022-06-27 09:45:26 +08:00 |
| `global_variables.py` | [NFC] polish colossalai/global_variables.py code style (#3259) | 2023-03-29 15:22:21 +08:00 |
| `initialize.py` | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00 |