ColossalAI/colossalai
Latest commit: abe4f971e0 by 梁爽 · [NFC] polish colossalai/booster/plugin/low_level_zero_plugin.py code style (#4256) · 1 year ago
_C/
_analyzer/
amp/
auto_parallel/       [NFC] polish colossalai/auto_parallel/offload/amp_optimizer.py code style (#4255)    1 year ago
autochunk/
booster/             [NFC] polish colossalai/booster/plugin/low_level_zero_plugin.py code style (#4256)   1 year ago
builder/
checkpoint_io/       [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)                1 year ago
cli/                 [NFC] polish colossalai/cli/benchmark/utils.py code style (#4254)                    1 year ago
cluster/
communication/       Fix/format (#4261)                                                                   1 year ago
context/
device/
engine/
fx/
interface/
kernel/              [Kernels] added triton-implemented of self attention for colossal-ai (#4241)        1 year ago
lazy/                [lazy] support init on cuda (#4269)                                                  1 year ago
logging/
nn/
pipeline/
registry/
shardformer/
tensor/
testing/
trainer/
utils/
zero/                [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)                1 year ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py
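
The directories above map one-to-one onto importable subpackages of colossalai; for example, booster/ (touched by the latest commit) holds the Booster API and its plugins, including the low_level_zero_plugin.py module named in the commit message. Below is a minimal sketch of how these subpackages are typically consumed; it assumes the Booster/LowLevelZeroPlugin interfaces as of this release and a torchrun-style distributed launch, so treat it as illustrative rather than definitive.

    # Minimal sketch, assuming the Booster API of this release.
    # Run under torchrun (e.g. `torchrun --nproc_per_node=1 demo.py`):
    # the low-level ZeRO plugin expects an initialized distributed environment.
    import torch
    import torch.nn as nn

    import colossalai
    from colossalai.booster import Booster
    from colossalai.booster.plugin import LowLevelZeroPlugin

    colossalai.launch_from_torch(config={})  # set up distributed state from torchrun env vars

    plugin = LowLevelZeroPlugin()            # ZeRO-style optimizer-state sharding
    booster = Booster(plugin=plugin)

    model = nn.Linear(8, 8).cuda()
    optimizer = torch.optim.Adam(model.parameters())

    # boost() wraps the model and optimizer so the plugin can shard optimizer states
    model, optimizer, *_ = booster.boost(model, optimizer=optimizer)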