ColossalAI/colossalai/context
Latest commit: 297b8baae2 by アマデウス, [model checkpoint] add gloo groups for cpu tensor communication (#589), 3 years ago
File                        Last commit                                                               Last updated
process_group_initializer   [model checkpoint] add gloo groups for cpu tensor communication (#589)   3 years ago
random                      html refactor (#555)                                                      3 years ago
__init__.py                 [polish] polish singleton and global context (#500)                      3 years ago
config.py                   Refactored docstring to google style                                      3 years ago
moe_context.py              [polish] polish singleton and global context (#500)                      3 years ago
parallel_context.py         [model checkpoint] add gloo groups for cpu tensor communication (#589)   3 years ago
parallel_mode.py            [MOE] remove old MoE legacy (#493)                                        3 years ago
singleton_meta.py           [polish] polish singleton and global context (#500)                      3 years ago
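
Judging from the file names and commit messages, singleton_meta.py presumably provides the metaclass that keeps the global parallel context a single shared instance. The sketch below shows the common Python singleton-metaclass pattern such a module would follow; it is an illustration only, and the ParallelContext stand-in with its world_size field is hypothetical, not ColossalAI's actual definition.

```python
# Illustrative sketch of a singleton metaclass (assumption: singleton_meta.py
# follows this standard pattern). Names below are hypothetical.
class SingletonMeta(type):
    """Metaclass that caches the first instance of each class using it."""
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # Construct the instance only once; later calls return the cached object.
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]


class ParallelContext(metaclass=SingletonMeta):
    """Hypothetical stand-in for a global context holding parallel state."""
    def __init__(self):
        self.world_size = 1


# Both lookups resolve to the same object, which is what lets the rest of a
# codebase share one global context without passing it around explicitly.
assert ParallelContext() is ParallelContext()
```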