ColossalAI/colossalai/nn/parallel

Latest commit: 66dfcf5281 by HELSON, "[gemini] update the gpt example (#2527)", 2023-01-30 17:58:05 +08:00
| Name                 | Last commit message                                              | Last commit date           |
|----------------------|------------------------------------------------------------------|----------------------------|
| `layers/`            | [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699) | 2022-10-13 22:22:27 +08:00 |
| `__init__.py`        | [zero] add zero wrappers (#2523)                                 | 2023-01-29 17:52:58 +08:00 |
| `data_parallel.py`   | [gemini] update ddp strict mode (#2518)                          | 2023-01-28 14:35:25 +08:00 |
| `gemini_parallel.py` | [gemini] update ddp strict mode (#2518)                          | 2023-01-28 14:35:25 +08:00 |
| `reducer.py`         | [ddp] ColoDDP uses bucket all-reduce (#1177)                     | 2022-06-29 10:34:13 +08:00 |
| `utils.py`           | [polish] polish code for get_static_torch_model (#2405)          | 2023-01-09 17:41:38 +08:00 |
| `zero_wrapper.py`    | [gemini] update the gpt example (#2527)                          | 2023-01-30 17:58:05 +08:00 |
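The commit messages suggest that `zero_wrapper.py` (#2523) and `gemini_parallel.py` provide the package's main entry points for ZeRO and Gemini training. Below is a minimal usage sketch, assuming `zero_model_wrapper` and `zero_optim_wrapper` are exported from `colossalai.nn.parallel` with the interfaces implied by those commits; the argument names shown are illustrative and may differ between releases.

```python
# A minimal sketch, not the canonical ColossalAI example: it assumes
# zero_model_wrapper / zero_optim_wrapper (zero_wrapper.py, #2523) are
# exported from colossalai.nn.parallel and accept the arguments shown;
# names and defaults may vary across versions.
import torch
import colossalai
from colossalai.nn.parallel import zero_model_wrapper, zero_optim_wrapper

# Initialize the distributed environment (run under torchrun).
colossalai.launch_from_torch(config={})

model = torch.nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stage 3 with a Gemini config places parameter and gradient chunks
# heterogeneously across GPU and CPU memory.
gemini_config = dict(device=torch.cuda.current_device(), placement_policy="auto")
model = zero_model_wrapper(model, zero_stage=3, gemini_config=gemini_config)
optimizer = zero_optim_wrapper(model, optimizer)
```

From the file layout, the wrappers appear to dispatch between the plain `ColoDDP` path (`data_parallel.py`, with bucketed all-reduce from `reducer.py`) and the Gemini-managed path (`gemini_parallel.py`) depending on the requested ZeRO stage, though the exact dispatch logic is version dependent.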