ColossalAI/colossalai/legacy/nn/layer

Latest commit: e5ce4c8ea6 by Hongxin Liu, "[npu] add npu support for gemini and zero (#5067)", 1 year ago
Name               Last commit                                                          Age
colossalai_layer   [misc] update pre-commit and run all files (#4752)                   1 year ago
parallel_1d        [npu] add npu support for gemini and zero (#5067)                    1 year ago
parallel_2d        [npu] add npu support for gemini and zero (#5067)                    1 year ago
parallel_2p5d      [npu] add npu support for gemini and zero (#5067)                    1 year ago
parallel_3d        [npu] add npu support for gemini and zero (#5067)                    1 year ago
parallel_sequence  [misc] update pre-commit and run all files (#4752)                   1 year ago
utils              [misc] update pre-commit and run all files (#4752)                   1 year ago
vanilla            [npu] add npu support for gemini and zero (#5067)                    1 year ago
wrapper            [misc] update pre-commit and run all files (#4752)                   1 year ago
__init__.py        [legacy] move communication and nn to legacy and refactor logger (#4671)  1 year ago
base_layer.py      [misc] update pre-commit and run all files (#4752)                   1 year ago