ColossalAI/colossalai/context
Latest commit: 84fd7c1d4d by HELSON, "add moe context, moe utilities and refactor gradient handler (#455)", 3 years ago
process_group_initializer/   fix format ColossalAI\colossalai\context\process_group_initializer (3 years ago)
random/                      add moe context, moe utilities and refactor gradient handler (#455) (3 years ago)
__init__.py                  add moe context, moe utilities and refactor gradient handler (#455) (3 years ago)
config.py                    update default logger (#100) (#101) (3 years ago)
moe_context.py               add moe context, moe utilities and refactor gradient handler (#455) (3 years ago)
parallel_context.py          add moe context, moe utilities and refactor gradient handler (#455) (3 years ago)
parallel_mode.py             adapted for sequence parallel (#163) (3 years ago)