ColossalAI/colossalai/legacy/context
Latest commit: Hongxin Liu, 1f5d2e8062, [hotfix] fix torch 2.0 compatibility (#4936), 1 year ago
process_group_initializer
random
__init__.py
parallel_context.py
parallel_mode.py