ColossalAI/colossalai/nn/parallel

Latest commit: 2827f41898 by Jiarui Fang, "[Gemini] GeminiDPP convert to PyTorch Module. (#2151)", 2 years ago
Name                Last commit                                                        Age
layers              [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)   2 years ago
__init__.py         [Gemini] make gemini usage simple (#1821)                          2 years ago
data_parallel.py    [Gemini] chunk init using runtime visited param order (#2115)      2 years ago
gemini_parallel.py  [Gemini] chunk init using runtime visited param order (#2115)      2 years ago
reducer.py          [ddp] ColoDDP uses bucket all-reduce (#1177)                       2 years ago
utils.py            [Gemini] GeminiDPP convert to PyTorch Module. (#2151)              2 years ago