How to connect Ozeki AI Gateway to Ollama
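Before pointing the gateway at Ollama, it can help to confirm that the local Ollama server is reachable and to see which models it currently serves. The sketch below is only a pre-check on the Ollama side, assuming Ollama's default address (http://localhost:11434) and its /api/tags endpoint, which lists pulled models; the gateway-side connection itself is configured in the Ozeki AI Gateway interface and is not shown here.

# Hypothetical pre-check: verify that a local Ollama instance is reachable
# before connecting Ozeki AI Gateway to it. Ollama listens on port 11434
# by default and exposes GET /api/tags, which lists the models it has pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama address; change if the server is remote

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of the models the Ollama server reports."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

if __name__ == "__main__":
    print("Ollama is reachable. Available models:", list_ollama_models())

If this prints the models you expect, the address and port shown can be used as the endpoint when the gateway connection is set up.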
More information
Local model execution commands for vLLM on dual NVIDIA RTX 6000 Pro
How to connect Ozeki AI Gateway to Ollama
How to run local AI models on 24 GB VRAM (RTX 3090 or RTX 4090)
Next: 24 GB VRAM