ColossalAI/colossalai/legacy
Latest commit: cbe34c557c by Rocky Duan, "Fix ColoTensorSpec for py11 (#5440)", 8 months ago
Name                  Last commit                                           Age
amp                   [feat] refactored extension module (#5298)            10 months ago
builder               [misc] update pre-commit and run all files (#4752)    1 year ago
communication         [npu] change device to accelerator api (#5239)        11 months ago
context               [hotfix] fix torch 2.0 compatibility (#4936)          1 year ago
engine                [npu] change device to accelerator api (#5239)        11 months ago
inference             [hotfix] fix typo s/keywrods/keywords etc. (#5429)    9 months ago
nn                    [feat] refactored extension module (#5298)            10 months ago
pipeline              [misc] update pre-commit and run all files (#4752)    1 year ago
registry              [misc] update pre-commit and run all files (#4752)    1 year ago
tensor                Fix ColoTensorSpec for py11 (#5440)                   8 months ago
trainer               [npu] change device to accelerator api (#5239)        11 months ago
utils                 [feat] refactored extension module (#5298)            10 months ago
zero                  Merge branch 'main' into sync/npu                     10 months ago
__init__.py           [bug] fix get_default_parser in examples (#4764)      1 year ago
constants.py          [misc] update pre-commit and run all files (#4752)    1 year ago
core.py               [misc] update pre-commit and run all files (#4752)    1 year ago
global_variables.py   [misc] update pre-commit and run all files (#4752)    1 year ago
initialize.py         [npu] change device to accelerator api (#5239)        11 months ago