ColossalAI/colossalai
Latest commit: dd2c28a323 "[npu] use extension for op builder" (#5172) by Xuanlei Zhao, 11 months ago
Name             Last commit message                                           Last commit date
_C
_analyzer
accelerator      [accelerator] init the accelerator module (#5129)             1 year ago
amp
auto_parallel
autochunk
booster          [npu] add npu support for hybrid plugin and llama (#5090)     1 year ago
checkpoint_io
cli
cluster
context
device           [npu] add npu support for hybrid plugin and llama (#5090)     1 year ago
fx
inference
interface
kernel           [npu] use extension for op builder (#5172)                    11 months ago
lazy
legacy           [npu] add npu support for hybrid plugin and llama (#5090)     1 year ago
logging
moe
nn               [npu] use extension for op builder (#5172)                    11 months ago
pipeline
shardformer      [npu] use extension for op builder (#5172)                    11 months ago
tensor
testing          [npu] add npu support for hybrid plugin and llama (#5090)     1 year ago
utils            [npu] add npu support for hybrid plugin and llama (#5090)     1 year ago
zero
__init__.py      [accelerator] init the accelerator module (#5129)             1 year ago
initialize.py
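
The layout above maps onto the package's public entry points: __init__.py re-exports the launchers defined in initialize.py, and the booster/ directory provides the Booster API that wraps model, optimizer and dataloader for a chosen parallel plugin. The following is a minimal usage sketch, not taken from the repository itself; it assumes a release from around the time of these commits, where colossalai.launch_from_torch still accepts a config argument (newer releases drop it), and picks TorchDDPPlugin purely as an illustrative plugin choice.

    import torch
    import colossalai                                     # top-level package listed above
    from colossalai.booster import Booster                # from the booster/ directory
    from colossalai.booster.plugin import TorchDDPPlugin

    # Initialize the distributed environment from torchrun-provided env vars
    # (RANK, WORLD_SIZE, MASTER_ADDR, ...). The config argument is empty here;
    # newer ColossalAI releases remove this parameter entirely.
    colossalai.launch_from_torch(config={})

    model = torch.nn.Linear(16, 16)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

    # Booster wraps the training objects according to the chosen plugin.
    booster = Booster(plugin=TorchDDPPlugin())
    model, optimizer, *_ = booster.boost(model, optimizer)

Run with, for example, torchrun --nproc_per_node=2 train.py so that the launcher can read the process-group environment variables.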