mirror of https://github.com/hpcaitech/ColossalAI
Latest commit: update flash-context-attention

* update flash-context-attention
* add kernels
* fix
* reset
* add build script
* add building process
* add llama2 example
* add colossal-llama2 test
* clean
* fall back test setting
* fix test file

---------

Co-authored-by: cuiqing.li <lixx336@gmail.com>
Files:

* __init__.py
* _utils.py
* bloom.py
* chatglm2.py
* llama.py