ColossalAI/colossalai/context/process_group_initializer

Latest commit: 9ee197d0e9 by アマデウス: "moved env variables to global variables; (#215)" (3 years ago)
File                            Last commit message                                 Last changed
__init__.py                     Added MoE parallel (#127)                           3 years ago
initializer_1d.py               moved env variables to global variables; (#215)     3 years ago
initializer_2d.py               moved env variables to global variables; (#215)     3 years ago
initializer_2p5d.py             moved env variables to global variables; (#215)     3 years ago
initializer_3d.py               moved env variables to global variables; (#215)     3 years ago
initializer_data.py             Fixed docstring in colossalai (#171)                3 years ago
initializer_model.py            Fixed docstring in colossalai (#171)                3 years ago
initializer_moe.py              Fixed docstring in colossalai (#171)                3 years ago
initializer_pipeline.py         Fixed docstring in colossalai (#171)                3 years ago
initializer_sequence.py         Fixed docstring in colossalai (#171)                3 years ago
initializer_tensor.py           Fixed docstring in colossalai (#171)                3 years ago
process_group_initializer.py    Fixed docstring in colossalai (#171)                3 years ago
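For context, these initializer modules are not usually imported directly: ColossalAI runs them during launch to build the torch.distributed process groups for each parallel mode (data, tensor 1D/2D/2.5D/3D, pipeline, sequence, model, MoE) and exposes the results through the global parallel context. The sketch below is a minimal illustration only, assuming the ColossalAI API of this era (colossalai.launch_from_torch, the global context gpc, and ParallelMode); the config values are hypothetical and not taken from this listing.

    # Minimal sketch (assumed API for ColossalAI of this era, not verified
    # against this exact revision): colossalai.launch_from_torch() drives the
    # initializer_* classes in this directory and registers the resulting
    # process groups in the global parallel context (gpc).
    import colossalai
    from colossalai.core import global_context as gpc
    from colossalai.context import ParallelMode

    # Hypothetical config: 2D tensor parallelism over 4 GPUs, no pipeline stages.
    CONFIG = dict(parallel=dict(pipeline=1, tensor=dict(size=4, mode='2d')))

    def main():
        # Reads rank/world size from torchrun's environment variables and runs
        # the process group initializers listed above.
        colossalai.launch_from_torch(config=CONFIG)

        # Each ParallelMode maps to a group built by one initializer module,
        # e.g. initializer_data.py -> ParallelMode.DATA.
        data_group = gpc.get_group(ParallelMode.DATA)
        print(f"data-parallel world size: {gpc.get_world_size(ParallelMode.DATA)}, "
              f"local rank in tensor group: {gpc.get_local_rank(ParallelMode.TENSOR)}")

    if __name__ == '__main__':
        main()

Run under torchrun (e.g. torchrun --nproc_per_node=4 this_script.py) so the launcher can pick up the distributed environment variables.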