ColossalAI/colossalai

Latest commit dac127d0ee by Hongxin Liu (2023-04-18 16:20:36 +08:00):
[fx] fix meta tensor registration (#3589)

* [meta] fix torch 1.13.1
* [meta] fix torch 2.0.0
* [meta] fix torch 1.13.0
* [meta] polish code
Name | Last commit message | Last commit date
_C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00
_analyzer | [fx] fix meta tensor registration (#3589) | 2023-04-18 16:20:36 +08:00
amp | [NFC] polish colossalai/amp/__init__.py code style (#3272) | 2023-03-29 15:22:21 +08:00
auto_parallel | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2023-04-04 17:40:45 +08:00
autochunk | [autochunk] support vit (#3084) | 2023-03-10 10:23:26 +08:00
booster | [misc] add verbose arg for zero and op builder (#3552) | 2023-04-17 11:25:35 +08:00
builder | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2022-09-08 22:11:04 +08:00
checkpoint_io | [checkpoint] Shard saved checkpoint need to be compatible with the naming format of hf checkpoint files (#3479) | 2023-04-12 16:02:17 +08:00
cli | [test] refactor tests with spawn (#3452) | 2023-04-06 14:51:35 +08:00
cluster | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
communication | [NFC] polish communication/p2p_v2.py code style (#2303) | 2023-01-04 15:09:57 +08:00
context | [NFC] polish colossalai/context/random/__init__.py code style (#3327) | 2023-03-30 14:19:26 +08:00
device | [hotfix] add copyright for solver and device mesh (#2803) | 2023-02-18 21:14:38 +08:00
engine | [format] Run lint on colossalai.engine (#3367) | 2023-04-05 23:24:43 +08:00
fx | [autoparallel] adapt autoparallel with new analyzer (#3261) | 2023-03-30 17:47:24 +08:00
interface | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
kernel | updated flash attention usage | 2023-03-20 17:57:04 +08:00
logging | [logger] hotfix, missing _FORMAT (#2231) | 2022-12-29 22:59:39 +08:00
nn | [gemini] gemini supports lazy init (#3379) | 2023-04-12 16:03:25 +08:00
pipeline | [pipeline] Add Simplified Alpa DP Partition (#2507) | 2023-03-07 10:34:31 +08:00
registry | Remove duplication registry (#1078) | 2022-06-08 07:47:24 +08:00
tensor | Fix typo (#3448) | 2023-04-06 09:43:31 +08:00
testing | [test] refactor tests with spawn (#3452) | 2023-04-06 14:51:35 +08:00
trainer | [polish] remove useless file _mem_tracer_hook.py (#1963) | 2022-11-16 15:55:10 +08:00
utils | [lazyinit] fix clone and deepcopy (#3553) | 2023-04-17 11:25:13 +08:00
zero | [gemini] support save state dict in shards (#3581) | 2023-04-17 17:11:09 +08:00
__init__.py | [setup] supported conda-installed torch (#2048) | 2022-11-30 16:45:15 +08:00
constants.py | updated tp layers | 2022-11-02 12:19:38 +08:00
core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2022-06-27 09:45:26 +08:00
global_variables.py | [NFC] polish colossalai/global_variables.py code style (#3259) | 2023-03-29 15:22:21 +08:00
initialize.py | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00