ColossalAI/colossalai/context

Latest commit: Jiarui Fang, "[format] polish name format for MOE (#481)", 3 years ago
| File | Last commit | Age |
| --- | --- | --- |
| process_group_initializer | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| random | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| __init__.py | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| config.py | update default logger (#100) (#101) | 3 years ago |
| moe_context.py | [format] polish name format for MOE (#481) | 3 years ago |
| parallel_context.py | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| parallel_mode.py | adapted for sequence parallel (#163) | 3 years ago |