368.3K Downloads · Updated 1 year ago
ollama run codeqwen
curl http://localhost:11434/api/chat \
  -d '{
    "model": "codeqwen",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
from ollama import chat

response = chat(
    model='codeqwen',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'codeqwen',
  messages: [{role: 'user', content: 'Hello!'}],
})
console.log(response.message.content)
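The same chat endpoint can also stream the reply as it is generated. A minimal Python sketch, assuming the ollama package is installed and the codeqwen model has already been pulled locally:

from ollama import chat

# Request a streamed response; chat() then yields partial message chunks.
stream = chat(
    model='codeqwen',
    messages=[{'role': 'user', 'content': 'Hello!'}],
    stream=True,
)

for chunk in stream:
    # Print each partial piece of the reply as soon as it arrives.
    print(chunk.message.content, end='', flush=True)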
df352abf55b1 · 4.2GB · 1 year ago
CodeQwen1.5 is based on Qwen1.5 and is trained on 3 trillion tokens of code data. Its major features include:

- Strong code generation capabilities and competitive performance across a series of benchmarks
- Long-context understanding and generation with a context length of 64K tokens
- Support for 92 coding languages
- Excellent performance in text-to-SQL, bug fixing, and similar tasks
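Beyond chat, the model can be prompted for code or SQL generation through the generate endpoint. A minimal Python sketch; the prompt and table schema below are made up purely for illustration:

from ollama import generate

# Illustrative prompt: ask CodeQwen to write SQL against a hypothetical table.
response = generate(
    model='codeqwen',
    prompt='Write a SQL query that returns the 10 most recent rows from a table orders(id, created_at, total).',
)
print(response.response)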
Blog Post
GitHub
HuggingFace