ColossalAI/colossalai/nn/layer/parallel_1d

Latest commit: e52f9d9109 by アマデウス, "[tensorparallel] fixed tp layers (#1938)", 2 years ago
File           Last commit                                                      Age
__init__.py    [model checkpoint] updated saving/loading for 1d layers (#594)   3 years ago
_operation.py  updated tp layers                                                2 years ago
_utils.py      [Tensor] 1d row embedding (#1075)                                3 years ago
layers.py      [tensorparallel] fixed tp layers (#1938)                         2 years ago