ColossalAI/colossalai/nn/parallel
Latest commit: ea13a201bb by HELSON, 2023-01-09 17:41:38 +08:00
[polish] polish code for get_static_torch_model (#2405)
  * [gemini] polish code
  * [testing] remove code
  * [gemini] make more robust
File                Last commit                                                        Date
layers/             [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)   2022-10-13 22:22:27 +08:00
__init__.py         [Gemini] make gemini usage simple (#1821)                          2022-11-08 15:53:13 +08:00
data_parallel.py    [polish] polish code for get_static_torch_model (#2405)            2023-01-09 17:41:38 +08:00
gemini_parallel.py  [Gemini] chunk init using runtime visited param order (#2115)      2022-12-12 18:06:16 +08:00
reducer.py          [ddp] ColoDDP uses bucket all-reduce (#1177)                       2022-06-29 10:34:13 +08:00
utils.py            [polish] polish code for get_static_torch_model (#2405)            2023-01-09 17:41:38 +08:00
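
For context on the utils.py entry above, here is a minimal usage sketch of get_static_torch_model, which extracts a plain torch.nn.Module from a Gemini-wrapped model. The GeminiDDP import path, the constructor arguments, and the device parameter are assumptions based on this era of the codebase, not confirmed by the listing; a distributed environment already initialized via colossalai.launch is also assumed.

    import torch
    import torch.nn as nn
    from colossalai.nn.parallel import GeminiDDP                     # exported via gemini_parallel.py (assumed)
    from colossalai.nn.parallel.utils import get_static_torch_model  # defined in utils.py

    # Assumes colossalai.launch(...) has already set up the process group.
    model = GeminiDDP(nn.Linear(16, 16), device=torch.device("cuda"))

    # Recover an ordinary torch model, e.g. to save a checkpoint that can be
    # loaded without ColossalAI. Gathering parameters to CPU and the exact
    # keyword name `device` are assumptions about this era's signature.
    static_model = get_static_torch_model(model, device=torch.device("cpu"))
    torch.save(static_model.state_dict(), "model.pt")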
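
The reducer.py entry refers to bucketed all-reduce, where ColoDDP packs many small gradients into a few large buffers so that each collective call amortizes its launch latency. The sketch below illustrates the idea with plain torch.distributed; it is not ColossalAI's actual Reducer, and it assumes all gradients share one dtype and device.

    import torch
    import torch.distributed as dist
    from torch._utils import _flatten_dense_tensors, _unflatten_dense_tensors

    def bucket_all_reduce(grads, bucket_cap_mb=25):
        """All-reduce a list of gradient tensors in ~bucket_cap_mb-sized buckets."""
        cap = bucket_cap_mb * 1024 * 1024
        bucket, nbytes = [], 0

        def flush(bucket):
            if not bucket:
                return
            flat = _flatten_dense_tensors(bucket)   # pack the bucket into one buffer
            dist.all_reduce(flat)                   # one collective per bucket
            flat.div_(dist.get_world_size())        # average across ranks
            for grad, synced in zip(bucket, _unflatten_dense_tensors(flat, bucket)):
                grad.copy_(synced)                  # write reduced values back in place

        for grad in grads:
            bucket.append(grad)
            nbytes += grad.numel() * grad.element_size()
            if nbytes >= cap:
                flush(bucket)
                bucket, nbytes = [], 0
        flush(bucket)                               # reduce the remainder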