ColossalAI/colossalai/context/process_group_initializer

Latest commit: 0b8161fab8 "updated tp layers" by kurisusnowdeng, 2 years ago
File                            Last commit message                                                                                                                                      Age
__init__.py                     [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)                                                                             3 years ago
initializer_1d.py               [model checkpoint] add gloo groups for cpu tensor communication (#589)                                                                                   3 years ago
initializer_2d.py               [model checkpoint] add gloo groups for cpu tensor communication (#589)                                                                                   3 years ago
initializer_2p5d.py             [usability] improved error messages in the context module (#856)                                                                                         3 years ago
initializer_3d.py               updated tp layers                                                                                                                                        2 years ago
initializer_data.py             [NFC] polish <colossalai/context/process_group_initializer/initializer_data.py> code stype (#626)                                                        3 years ago
initializer_model.py            [model checkpoint] add gloo groups for cpu tensor communication (#589)                                                                                   3 years ago
initializer_pipeline.py         [model checkpoint] add gloo groups for cpu tensor communication (#589)                                                                                   3 years ago
initializer_sequence.py         [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py colossalai/context/process_group_initializer initializer_tensor.py code style (#639)   3 years ago
initializer_tensor.py           [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py colossalai/context/process_group_initializer initializer_tensor.py code style (#639)   3 years ago
process_group_initializer.py    [NFC] polish colossalai/context/process_group_initializer/process_group_initializer.py code stype (#617)                                                 3 years ago