ColossalAI/colossalai/zero/legacy/sharded_param/__init__.py

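# Re-export the legacy ZeRO sharded parameter/tensor wrappers at package level.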
from .sharded_param import ShardedParamV2
from .sharded_tensor import ShardedTensor

__all__ = ['ShardedTensor', 'ShardedParamV2']
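
# A minimal usage sketch, assuming the package path shown above
# (colossalai/zero/legacy/sharded_param): with these re-exports, callers can
# import both classes directly from the package instead of its submodules:
#
#   from colossalai.zero.legacy.sharded_param import ShardedParamV2, ShardedTensor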