Multilingual

Llama 2

Llama 2 is actually a collection of four LLMs, each with a different number of parameters: the smallest has seven billion parameters and the largest has 70 billion. About 90% of the training data was in English, and roughly 8% fell into an "unknown" category, much of it programming code. The remaining share, under 2%, was spread across a wide range of languages, including German (0.17%), French (0.16%), and Chinese (0.13%).
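As a quick sanity check on these proportions, the share left over for all other languages can be computed from the figures quoted above. The exact numbers here (89.70% English, 8.38% unknown) are approximations taken from the per-language breakdown reported for Llama 2; the long tail of remaining languages follows by subtraction:

```python
# Approximate language shares of the Llama 2 pretraining corpus
# (percentages, as quoted above; values are approximate).
shares = {
    "en": 89.70,       # English
    "unknown": 8.38,   # mostly programming code
    "de": 0.17,        # German
    "fr": 0.16,        # French
    "zh": 0.13,        # Chinese
}

# Everything not listed is a long tail of many other languages.
other = 100.0 - sum(shares.values())
print(f"remaining share for other languages: {other:.2f}%")
```

This makes the imbalance concrete: no single non-English language exceeds a fraction of a percent of the corpus.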