ColossalAI/colossalai/legacy/nn/layer
Latest commit: e5ce4c8ea6 by Hongxin Liu, [npu] add npu support for gemini and zero (#5067), 1 year ago
colossalai_layer/
parallel_1d/
parallel_2d/
parallel_2p5d/
parallel_3d/
parallel_sequence/
utils/
vanilla/
wrapper/
__init__.py
base_layer.py
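The subpackage names suggest the legacy layers are organized by tensor-parallel mode (1D, 2D, 2.5D, 3D, and sequence parallelism), alongside a non-parallel vanilla variant. As an illustration only, the following minimal sketch assumes the 1D column- and row-parallel linear layers are exported as Linear1D_Col and Linear1D_Row from parallel_1d (an assumption that may not hold across versions) and shows how they would typically be combined into a Megatron-style parallel MLP.

```python
# Hypothetical sketch: composing a tensor-parallel MLP from the legacy 1D layers.
# The class names and module path below are assumptions about the legacy API and
# may differ between ColossalAI versions; constructing these layers also requires
# a distributed process group launched via colossalai with 1D tensor parallelism.
import torch.nn as nn

from colossalai.legacy.nn.layer.parallel_1d import Linear1D_Col, Linear1D_Row


class ParallelMLP(nn.Module):
    """Two-layer MLP: column-parallel projection up, row-parallel projection down."""

    def __init__(self, hidden_size: int, ffn_size: int):
        super().__init__()
        self.dense_in = Linear1D_Col(hidden_size, ffn_size)   # splits the output dim across ranks
        self.act = nn.GELU()
        self.dense_out = Linear1D_Row(ffn_size, hidden_size)  # splits the input dim, reduces partial sums

    def forward(self, x):
        return self.dense_out(self.act(self.dense_in(x)))
```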