183K Downloads Updated 1 year ago
```shell
ollama run llava-phi3
```
```shell
curl http://localhost:11434/api/chat \
  -d '{
    "model": "llava-phi3",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
```python
from ollama import chat

response = chat(
    model='llava-phi3',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
```
```javascript
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'llava-phi3',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.message.content)
```
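Since llava-phi3 is a vision model, a chat message can also carry images. A minimal sketch of the JSON body for the same `/api/chat` endpoint with an image attached: messages accept an optional `images` field holding base64-encoded image data. The image bytes and prompt below are placeholders, not from this page.

```python
import base64
import json

# Placeholder for the raw contents of a real image file,
# e.g. open("photo.png", "rb").read()
image_bytes = b"\x89PNG placeholder"

# Build the request body for POST http://localhost:11434/api/chat.
# The "images" field takes a list of base64-encoded strings.
payload = {
    "model": "llava-phi3",
    "messages": [
        {
            "role": "user",
            "content": "What is in this picture?",
            "images": [base64.b64encode(image_bytes).decode("ascii")],
        }
    ],
}

print(json.dumps(payload, indent=2))
```

The official Python and JavaScript libraries accept the same `images` field on a message, taking a file path or raw bytes and handling the encoding for you.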
c7edd7b87593 · 2.9GB
llava-phi3 is a LLaVA model fine-tuned from Phi 3 Mini 4k, with performance on par with the original LLaVA model across several benchmarks.
llava-phi3
Hugging Face
GitHub