ColossalAI/colossalai/context/process_group_initializer
Latest commit: 7544347145 by HELSON, "[MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)", 3 years ago
File                          Last commit message                                                             Last change
__init__.py                   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
initializer_1d.py             fix format ColossalAI\colossalai\context\process_group_initializer             3 years ago
initializer_2d.py             fix format ColossalAI\colossalai\context\process_group_initializer             3 years ago
initializer_2p5d.py           fix format ColossalAI\colossalai\context\process_group_initializer             3 years ago
initializer_3d.py             fix format ColossalAI\colossalai\context\process_group_initializer             3 years ago
initializer_data.py           Fixed docstring in colossalai (#171)                                            3 years ago
initializer_model.py          fix format ColossalAI\colossalai\context\process_group_initializer             3 years ago
initializer_pipeline.py       Fixed docstring in colossalai (#171)                                            3 years ago
initializer_sequence.py       Fixed docstring in colossalai (#171)                                            3 years ago
initializer_tensor.py         Fixed docstring in colossalai (#171)                                            3 years ago
process_group_initializer.py  Fixed docstring in colossalai (#171)                                            3 years ago