ColossalAI/colossalai
Latest commit: 4f68b3f10c by Hongxin Liu, 2023-10-16 21:56:53 +08:00
[kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921)
* [kernel] support pure fp16 for cpu adam (#4896)
* [kernel] fix cpu adam kernel for pure fp16 and update tests (#4919)
* [kernel] fix cpu adam
* [test] update gemini optim test
Name             Last commit message                                                                Last commit date
_C
_analyzer
amp              [feature] Add clip_grad_norm for hybrid_parallel_plugin (#4837)                    2023-10-12 11:32:37 +08:00
auto_parallel
autochunk
booster          [feature] support no master weights option for low level zero plugin (#4816)       2023-10-13 07:57:45 +00:00
checkpoint_io    [checkpointio] hotfix torch 2.0 compatibility (#4824)                              2023-10-07 10:45:52 +08:00
cli
cluster
context
device
fx
inference        [inference] Add smmoothquant for llama (#4904)                                     2023-10-16 11:28:44 +08:00
interface        [lazy] support from_pretrained (#4801)                                             2023-09-26 11:04:11 +08:00
kernel           [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921)      2023-10-16 21:56:53 +08:00
lazy             [doc] add lazy init docs (#4808)                                                   2023-09-27 10:24:04 +08:00
legacy
logging
nn               [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921)      2023-10-16 21:56:53 +08:00
pipeline         [Pipeline Inference] Sync pipeline inference branch to main (#4820)                2023-10-11 11:40:06 +08:00
shardformer      [infer] fix test bug (#4838)                                                       2023-10-04 10:01:03 +08:00
tensor
testing          [gemini] support amp o3 for gemini (#4872)                                         2023-10-12 10:39:08 +08:00
utils
zero             [feature] support no master weights option for low level zero plugin (#4816)       2023-10-13 07:57:45 +00:00
__init__.py
initialize.py