ColossalAI/colossalai

Latest commit: a50d39a143 by dayellow, 2023-07-26 14:12:57 +08:00

[NFC] fix: format (#4270)

* [NFC] polish colossalai/fx/profiler/experimental/profiler_module/embedding.py code style
* [NFC] polish colossalai/communication/utils.py code style

Co-authored-by: Minghao Huang <huangminghao@luchentech.com>
| Name | Last commit | Last commit date |
| --- | --- | --- |
| _C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| _analyzer | [example] add train resnet/vit with booster example (#3694) | 2023-05-08 10:42:30 +08:00 |
| amp | [bf16] add bf16 support (#3882) | 2023-06-05 15:58:31 +08:00 |
| auto_parallel | [NFC] polish runtime_preparation_pass style (#4266) | 2023-07-26 14:12:57 +08:00 |
| autochunk | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| booster | [NFC] polish colossalai/booster/plugin/low_level_zero_plugin.py code style (#4256) | 2023-07-26 14:12:57 +08:00 |
| builder | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2022-09-08 22:11:04 +08:00 |
| checkpoint_io | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302) | 2023-07-21 14:39:01 +08:00 |
| cli | [NFC] polish colossalai/cli/benchmark/utils.py code style (#4254) | 2023-07-26 14:12:57 +08:00 |
| cluster | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| communication | [NFC] fix: format (#4270) | 2023-07-26 14:12:57 +08:00 |
| context | [CI] fix some spelling errors (#3707) | 2023-05-10 17:12:03 +08:00 |
| device | [format] applied code formatting on changed files in pull request 4152 (#4157) | 2023-07-04 16:07:47 +08:00 |
| engine | [nfc]fix ColossalaiOptimizer is not defined (#4122) | 2023-06-30 17:23:22 +08:00 |
| fx | [nfc] fix typo colossalai/cli fx kernel (#3847) | 2023-06-02 15:02:45 +08:00 |
| interface | Next commit [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141) | 2023-07-07 16:33:06 +08:00 |
| kernel | [Kernels] added triton-implemented of self attention for colossal-ai (#4241) | 2023-07-18 23:53:38 +08:00 |
| lazy | [lazy] support init on cuda (#4269) | 2023-07-19 16:43:01 +08:00 |
| logging | [logger] hotfix, missing _FORMAT (#2231) | 2022-12-29 22:59:39 +08:00 |
| nn | [shardformer] integrated linear 1D with dtensor (#3996) | 2023-07-04 16:05:01 +08:00 |
| pipeline | [nfc]fix typo colossalai/pipeline tensor nn (#3899) | 2023-06-06 14:07:36 +08:00 |
| registry | Remove duplication registry (#1078) | 2022-06-08 07:47:24 +08:00 |
| shardformer | revise shardformer readme (#4246) | 2023-07-17 17:30:57 +08:00 |
| tensor | [dtensor] fixed readme file name and removed deprecated file (#4162) | 2023-07-04 18:21:11 +08:00 |
| testing | Next commit [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141) | 2023-07-07 16:33:06 +08:00 |
| trainer | fix typo with colossalai/trainer utils zero (#3908) | 2023-06-07 16:08:37 +08:00 |
| utils | fix typo with colossalai/trainer utils zero (#3908) | 2023-06-07 16:08:37 +08:00 |
| zero | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302) | 2023-07-21 14:39:01 +08:00 |
| __init__.py | [setup] supported conda-installed torch (#2048) | 2022-11-30 16:45:15 +08:00 |
| constants.py | updated tp layers | 2022-11-02 12:19:38 +08:00 |
| core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2022-06-27 09:45:26 +08:00 |
| global_variables.py | [NFC] polish colossalai/global_variables.py code style (#3259) | 2023-03-29 15:22:21 +08:00 |
| initialize.py | [nfc] fix typo colossalai/zero (#3923) | 2023-06-08 00:01:29 +08:00 |