ColossalAI/colossalai/context

Latest commit: a445e118cf, "[polish] polish singleton and global context (#500)" by Jiarui Fang, 3 years ago
| File | Last commit | Last updated |
|------|-------------|--------------|
| process_group_initializer | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| random | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| __init__.py | [polish] polish singleton and global context (#500) | 3 years ago |
| config.py | update default logger (#100) (#101) | 3 years ago |
| moe_context.py | [polish] polish singleton and global context (#500) | 3 years ago |
| parallel_context.py | [polish] polish singleton and global context (#500) | 3 years ago |
| parallel_mode.py | [MOE] remove old MoE legacy (#493) | 3 years ago |
| singleton_meta.py | [polish] polish singleton and global context (#500) | 3 years ago |
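The latest commit touches singleton_meta.py together with parallel_context.py and moe_context.py, which suggests the package keeps its global parallel and MoE contexts as process-wide singletons built on a shared metaclass. Below is a minimal sketch of that singleton-metaclass pattern, assuming `SingletonMeta` as the class name; the `ParallelContext` body here is a simplified stub for illustration, not the actual ColossalAI implementation:

```python
# Sketch of the singleton-metaclass pattern that singleton_meta.py likely
# implements. Everything except the pattern itself is an assumption made
# for illustration.

class SingletonMeta(type):
    """Metaclass that caches exactly one instance per class."""
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # Construct the instance on first use, then always return the cached one.
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]


class ParallelContext(metaclass=SingletonMeta):
    """Illustrative stub of a global context holding process-group state."""

    def __init__(self):
        # Placeholder; the real context would query torch.distributed.
        self.world_size = 1


# Every "construction" yields the same object, so modules across the
# codebase can share one context without threading it through call sites.
assert ParallelContext() is ParallelContext()
```

Under this design, modules such as the gradient handlers and process-group initializers listed above can import and mutate one shared context object rather than passing configuration through every function signature.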