ColossalAI/colossalai/context
Latest commit: 99d9713b02 by アマデウス — Revert "Update parallel_context.py (#2408)" (2 years ago)
Name                        Last commit message                                                        Last updated
process_group_initializer   updated tp layers                                                          2 years ago
random
__init__.py
config.py
moe_context.py              [zero] add constant placement policy (#1705)                               2 years ago
parallel_context.py         Revert "Update parallel_context.py (#2408)"                                2 years ago
parallel_mode.py            updated tp layers                                                          2 years ago
singleton_meta.py           [autoparallel] refactored shape consistency to remove redundancy (#1591)   2 years ago