ColossalAI/colossalai
Latest commit a2b43e393d by Boyuan Yao: [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760), 2 years ago
_C                    [setup] support pre-build and jit-build of cuda kernels (#2374)  2 years ago
amp                   [test] fixed the triton version for testing (#2608)  2 years ago
auto_parallel         [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760)  2 years ago
autochunk             [autochunk] support diffusion for autochunk (#2621)  2 years ago
builder
cli                   [NFC] polish colossalai/cli/cli.py code style (#2734)  2 years ago
communication         [NFC] polish communication/p2p_v2.py code style (#2303)  2 years ago
context               [NFC] polish colossalai/context/process_group_initializer/initializer_2d.py code style (#2726)  2 years ago
device                [autoparallel] accelerate gpt2 training (#2495)  2 years ago
engine                [NFC] polish colossalai/engine/gradient_handler/utils.py code style (#2708)  2 years ago
fx                    [autoparallel] Patch meta information of `torch.matmul` (#2584)  2 years ago
gemini                [NFC] polish colossalai/gemini/gemini_context.py code style (#2690)  2 years ago
kernel                [kernel] fixed repeated loading of kernels (#2549)  2 years ago
logging               [logger] hotfix, missing _FORMAT (#2231)  2 years ago
nn                    [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671)  2 years ago
pipeline              polish pp middleware (#2476)  2 years ago
registry
tensor                [polish] polish ColoTensor and its submodules (#2537)  2 years ago
testing               [amp] add gradient clipping for unit tests (#2283)  2 years ago
trainer               [polish] remove useless file _mem_tracer_hook.py (#1963)  2 years ago
utils                 Don't use `torch._six` (#2775)  2 years ago
zero                  [zero] fix wrong import (#2777)  2 years ago
__init__.py           [setup] supported conda-installed torch (#2048)  2 years ago
constants.py          updated tp layers  2 years ago
core.py
global_variables.py   updated tp layers  2 years ago
initialize.py         Fix False warning in initialize.py (#2456)  2 years ago