ColossalAI/colossalai
Latest commit: 61e687831d by zbian, "fixed using zero with tp cannot access weight correctly" (2 years ago)
_C/                   [setup] support pre-build and jit-build of cuda kernels (#2374)  2 years ago
amp/                  [test] fixed the triton version for testing (#2608)  2 years ago
auto_parallel/        [hotfix] fix autoparallel compatibility test issues (#2754)  2 years ago
autochunk/            [autochunk] support diffusion for autochunk (#2621)  2 years ago
builder/
cli/                  [cli] handled version check exceptions (#2848)  2 years ago
communication/
context/              [NFC] polish colossalai/context/process_group_initializer/initializer_2d.py code style (#2726)  2 years ago
device/               [hotfix] add copyright for solver and device mesh (#2803)  2 years ago
engine/               [NFC] polish colossalai/engine/schedule/_pipeline_schedule.py code style (#2744)  2 years ago
fx/                   [autoparallel] Patch meta information of `torch.matmul` (#2584)  2 years ago
gemini/               [hotfix] fix chunk size can not be divided (#2867)  2 years ago
kernel/               [triton] added copyright information for flash attention (#2835)  2 years ago
logging/
nn/                   fixed using zero with tp cannot access weight correctly  2 years ago
pipeline/             polish pp middleware (#2476)  2 years ago
registry/
tensor/               [hotfix]: Remove math.prod dependency (#2837)  2 years ago
testing/
trainer/
utils/                Don't use `torch._six` (#2775)  2 years ago
zero/                 [zero] trivial zero optimizer refactoring (#2869)  2 years ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py         Fix False warning in initialize.py (#2456)  2 years ago