491.7K Downloads Updated 1 year ago
ollama run yi
curl http://localhost:11434/api/chat \
  -d '{
    "model": "yi",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
from ollama import chat

response = chat(
    model='yi',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'yi',
  messages: [{role: 'user', content: 'Hello!'}],
})
console.log(response.message.content)
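The chat endpoint is stateless: the model only sees the messages you send, so a multi-turn conversation is built by resending the growing message list each turn. A minimal sketch of that request payload, using only the standard library so it runs without an Ollama server (the helper name `build_request` is illustrative, not part of any Ollama client):

import json

# Conversation state is just a list of {"role", "content"} messages;
# each turn appends the user's prompt (and, after a real call, the
# assistant's reply) before the next request.
history = []

def build_request(model, history, prompt):
    """Return the JSON body that a POST to /api/chat expects (sketch)."""
    messages = history + [{"role": "user", "content": prompt}]
    return json.dumps({"model": model, "messages": messages})

body = build_request("yi", history, "Hello!")
print(body)

In a real client you would POST `body` to `http://localhost:11434/api/chat` (as in the curl example above) and append the returned assistant message to `history`.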
a7f031bb846f · 3.5GB
Yi is a series of large language models trained on a high-quality corpus of 3 trillion tokens that support both the English and Chinese languages.