ColossalAI/colossalai/nn/parallel
Latest commit: af32022f74 by Jiarui Fang, "[Gemini] fix the convert_to_torch_module bug (#2269)", 2023-01-03 15:55:35 +08:00
layers              [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)    2022-10-13 22:22:27 +08:00
__init__.py         [Gemini] make gemini usage simple (#1821)                           2022-11-08 15:53:13 +08:00
data_parallel.py    [Gemini] fix the convert_to_torch_module bug (#2269)                2023-01-03 15:55:35 +08:00
gemini_parallel.py  [Gemini] chunk init using runtime visited param order (#2115)       2022-12-12 18:06:16 +08:00
reducer.py          [ddp] ColoDDP uses bucket all-reduce (#1177)                        2022-06-29 10:34:13 +08:00
utils.py            [Gemini] fix the convert_to_torch_module bug (#2269)                2023-01-03 15:55:35 +08:00