Public Ollama Models 2025-07-05

How to chat with Ollama models

Select an IP and model from the table below, then use them in this command:

# Start a conversation with a model
# Replace <IP> with an IP from the table below
# Replace <MODEL> with one of the models listed for that IP
curl -X POST http://<IP>:11434/api/chat -d '{
    "model": "<MODEL>",
    "messages": [{
        "role": "user",
        "content": "Hello, how are you?"
    }]
}'
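
By default the chat endpoint streams its reply back as a sequence of JSON objects. The Ollama chat API also accepts a "stream": false option if you would rather get one complete JSON response; a minimal variant of the request above (same <IP> and <MODEL> placeholders) looks like this:

# Same request, but ask for a single complete JSON response instead of a stream
curl -X POST http://<IP>:11434/api/chat \
    -H "Content-Type: application/json" \
    -d '{
    "model": "<MODEL>",
    "messages": [{
        "role": "user",
        "content": "Hello, how are you?"
    }],
    "stream": false
}'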

Available Models

IP               Models
222.70.88.44     qwen3:8b, deepseek-r1:8b, deepseek-r1:7b, qwen3:30b-a3b, nomic-embed-text:latest, bge-m3:latest
117.50.179.196   smollm2:135m, hf.co/IlyaGusev/saiga_nemo_12b_gguf:Q5_K_M
117.50.197.100   qwen2.5:7b, bge-m3:567m, qwen2.5vl:7b
117.50.164.136   qwen3:1.7b, nomic-embed-text:v1.5, qwen3:4b, llama3.2:3b-instruct-q5_K_M
106.14.202.11    mxbai-embed-large:latest
163.228.156.198  qwen3:8b-nothink, qwen3:8b, deepseek-r1:8b, qwen2.5:32b, qwen2.5-coder:7b, Qwen2.5-7B-Instruct-Distill-ds-r1-110k:latest, Qwen2.5-7B-Instruct:7b, Qwen2.5-7B-Distill-ds-r1-110k:7b, qwq:latest, smollm2:135m, qwen2.5:3B-Trained, deepseek-r1:32b, deepseek-r1:14b, llava:latest, nomic-embed-text:latest, qwen2.5:latest, deepseek-r1:7b
218.1.151.175    smollm2:135m, nomic-embed-text:latest, qwen2.5:latest
218.78.108.171   deepseek-r1:14b, deepseek-r1:7b, deepseek-r1:1.5b
117.50.245.70    smollm2:135m, qwen2.5:32b, qwen2.5:7b, gemma2:27b, gemma2:2b, qwen2.5:14b, deepseek-r1:14b, deepseek-r1:7b, gemma3:4b, nomic-embed-text:latest, deepseek-r1:1.5b, gemma3:12b, gemma3:27b, qwen2.5-coder:latest, unsloth.F16.gguf:latest, unsloth.Q8_0.gguf:latest
117.50.194.3     dengcao/Qwen3-Embedding-8B:Q5_K_M, dengcao/Qwen3-Embedding-4B:Q5_K_M
61.165.183.106   huihui_ai/deepseek-r1-abliterated:70b-llama-distill-q8_0
124.71.154.35    llama3.2:3b-instruct-q5_K_M, deepseek-r1:1.5b, nomic-embed-text:latest
117.50.174.178   smollm2:135m, qwen2.5:7b, deepseek-r1:8b
117.50.175.121   changji_medical_deepseek_r1:14b, changji_medical_deepseek_r1:32b
101.132.102.117  smollm2:135m, bge-m3:567m, deepseek-r1:1.5b
117.50.250.245   qwen3:8b_nothink, qwen3:8b
143.64.160.92    llama3.2:3b-instruct-q5_K_M, MartinRizzo/Ayla-Light-v2:12b-q4_K_M
58.246.1.174     llama3.2:3b-instruct-q5_K_M, qwen2.5:32b
61.172.167.153   deepseek-r1:7b
61.172.167.211   deepseek-r1:7b
61.169.115.204   nomic-embed-text:latest, deepseek-r1:32b
47.116.202.9     qwen3-no-think:latest, qwen3:latest, qwen3:8b, qwen:7b, llava:latest, mistral:7b-instruct, nomic-embed-text:latest, qllama/bge-reranker-v2-m3:latest, bge-large:latest, deepseek-r1:7b, bge-m3:latest, deepseek-r1:latest, deepseek-r1:1.5b, qwen2:latest
223.166.95.229   deepseek-r1:7b, deepseek-r1:14b, deepseek-r1:8b, qwen3:latest, qwen3:14b, qwen2.5vl:32b, qwen2.5vl:latest, qwen3:8b, gemma3:12b, gemma3:27b, llava:34b, llava:13b, mxbai-embed-large:latest, nomic-embed-text:latest, qwq:latest, codellama:13b, llama3.2-vision:latest, qwen2.5-coder:latest, qwen2.5-coder:14b, phi4:latest, phi3:14b, mistral:latest, llama3.3:latest, llama3.2:latest, llama3.1:latest, llama3:latest, llama3:70b, gemma2:latest, gemma2:27b
180.158.174.61   qwq:32b-q8_0, qwq:32b-16384context, qwq:32b, nomic-embed-text:latest, deepseek-r1:32b, deepseek-r1:14b, llama3.2-vision:11b, qwen2.5:32b, llama3.2:latest
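
The model lists on public endpoints change often, so an IP in the table may no longer serve the models shown. Before picking one, you can ask the endpoint itself what it currently hosts via Ollama's /api/tags endpoint; a quick check might look like this (jq is optional and only used to pull out the model names):

# List the models an endpoint currently serves
# Replace <IP> with an IP from the table above
curl -s http://<IP>:11434/api/tags

# With jq installed, print just the model names
curl -s http://<IP>:11434/api/tags | jq -r '.models[].name'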

Disclaimer

These Ollama model endpoints are publicly exposed interfaces found on the internet. They are listed here for informational purposes only. Please be aware that:

  • These endpoints are not maintained or controlled by us
  • The availability and stability of these services cannot be guaranteed
  • Use these services at your own risk
  • We take no responsibility for any issues or damages that may arise from using these endpoints
