ColossalAI/colossalai/context/process_group_initializer

Latest commit: 297b8baae2 by アマデウス, "[model checkpoint] add gloo groups for cpu tensor communication (#589)", 3 years ago
File                           Last commit (age)
__init__.py                    [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469), 3 years ago
initializer_1d.py              [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_2d.py              [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_2p5d.py            [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_3d.py              [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_data.py            [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_model.py           [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_pipeline.py        [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_sequence.py        [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
initializer_tensor.py          [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
process_group_initializer.py   Refactored docstring to google style, 3 years ago
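The modules above hold the process group initializers that ColossalAI builds at launch time: one per parallel mode (data, model, pipeline, sequence, tensor) plus the 1D/2D/2.5D/3D tensor-parallel layouts, with commit #589 adding gloo-backed groups so CPU tensors can be communicated alongside the default NCCL groups. Below is a minimal sketch of how the resulting groups are typically consumed through the global context, assuming the ColossalAI 0.1.x-era API; the parallel config values are hypothetical, and the get_cpu_group accessor is an assumption inferred from the commit message rather than verified against this revision.

```python
# Minimal usage sketch (ColossalAI 0.1.x-era API assumed; run under torchrun).
import torch
import torch.distributed as dist

import colossalai
from colossalai.context import ParallelMode
from colossalai.core import global_context as gpc

# Hypothetical topology: 1 pipeline stage, 4-way 2D tensor parallelism;
# the data-parallel size is inferred from the remaining ranks.
colossalai.launch_from_torch(config=dict(
    parallel=dict(pipeline=1, tensor=dict(size=4, mode='2d')),
))

# Each initializer in this directory has now created its process group;
# they are queried through the global ParallelContext.
dp_rank = gpc.get_local_rank(ParallelMode.DATA)
dp_size = gpc.get_world_size(ParallelMode.DATA)
tp_group = gpc.get_group(ParallelMode.TENSOR)

# GPU tensors reduce over the NCCL-backed group as usual.
x = torch.ones(1).cuda()
dist.all_reduce(x, group=tp_group)

# For CPU tensors, #589 adds gloo groups; the accessor name below
# (get_cpu_group) is an assumption based on the commit message.
# y = torch.ones(1)
# dist.all_reduce(y, group=gpc.get_cpu_group(ParallelMode.DATA))
```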