ColossalAI/colossalai
Latest commit by ziyuhuang123 (d344313533): [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725), 2 years ago
_C
amp
auto_parallel [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725) 2 years ago
autochunk
builder
cli [NFC] polish colossalai/cli/launcher/__init__.py code style (#2709) 2 years ago
communication
context [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py code style (#2712) 2 years ago
device
engine [NFC] polish colossalai/engine/gradient_handler/utils.py code style (#2708) 2 years ago
fx
gemini [NFC] polish colossalai/gemini/gemini_context.py code style (#2690) 2 years ago
kernel
logging
nn [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671) 2 years ago
pipeline
registry
tensor
testing
trainer
utils
zero
__init__.py
constants.py
core.py
global_variables.py
initialize.py