Pretraining LLaMA: best practices for building LLaMA-like base models

  • 65-billion-parameter large model pretraining accelerated by 38% [code] [blog]

Because the main branch is under active development, this example is temporarily maintained on an independent branch to keep its code stable.
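
The pretraining scripts themselves live on that branch. As a rough orientation only, the sketch below shows how a LLaMA-style model is typically pretrained with Colossal-AI's Booster API; it is not taken from this repository. The tiny model size, random token data, and hyperparameters are placeholders, and API details (e.g. the `launch_from_torch` signature) may differ between Colossal-AI versions.

```python
# Minimal sketch (illustrative, not this example's actual script):
# pretraining a small LLaMA-style model with Colossal-AI's Booster + Gemini plugin.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam
from transformers import LlamaConfig, LlamaForCausalLM


def main():
    # Newer Colossal-AI releases drop the `config` argument; adjust to your version.
    colossalai.launch_from_torch(config={})

    # A deliberately tiny LLaMA-like config so the sketch fits on a single GPU.
    config = LlamaConfig(
        hidden_size=512,
        intermediate_size=1376,
        num_hidden_layers=4,
        num_attention_heads=8,
        vocab_size=32000,
    )
    model = LlamaForCausalLM(config)
    optimizer = HybridAdam(model.parameters(), lr=3e-4)

    # GeminiPlugin shards parameters and optimizer states to reduce per-GPU memory.
    booster = Booster(plugin=GeminiPlugin())
    model, optimizer, *_ = booster.boost(model, optimizer)

    model.train()
    for step in range(10):
        # Random token ids stand in for a real pretraining corpus.
        input_ids = torch.randint(0, config.vocab_size, (2, 256), device="cuda")
        outputs = model(input_ids=input_ids, labels=input_ids)
        booster.backward(outputs.loss, optimizer)
        optimizer.step()
        optimizer.zero_grad()


if __name__ == "__main__":
    main()
```

A script like this would be launched with a distributed launcher, e.g. `colossalai run --nproc_per_node 1 pretrain_sketch.py` or `torchrun --nproc_per_node 1 pretrain_sketch.py` (the file name is illustrative).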