ColossalAI/colossalai/context

Latest commit: c9e3ee389e by Zirui Zhu — [NFC] polish colossalai/context/process_group_initializer/initializer_2d.py code style (#2726), 2 years ago
process_group_initializer/   [NFC] polish colossalai/context/process_group_initializer/initializer_2d.py code style (#2726)   2 years ago
random/                      [doc] update rst and docstring (#1351)                                                            2 years ago
__init__.py                  [polish] polish singleton and global context (#500)                                               3 years ago
config.py                    Refactored docstring to google style                                                              3 years ago
moe_context.py               [NFC] polish colossalai/context/moe_context.py code style (#2693)                                 2 years ago
parallel_context.py          Revert "Update parallel_context.py (#2408)"                                                      2 years ago
parallel_mode.py             updated tp layers                                                                                 2 years ago
singleton_meta.py            [autoparallel] refactored shape consistency to remove redundancy (#1591)                          2 years ago