# Quickstart

Get CoreCube running with Docker Compose and connect your first AI client.
## Prerequisites

- Docker Desktop or Docker Engine with Compose
- An LLM API key (Anthropic or OpenAI), or a self-hosted model endpoint
## 1. Start CoreCube

Create a `docker-compose.yml`:
```yaml
services:
  corecube:
    image: registry.arantic.cloud/corecube/corecube:latest
    ports:
      - '7400:7400'
    environment:
      CUBE_ADMIN_EMAIL: admin@example.com
      CUBE_ADMIN_PASSWORD: changeme123
      PGVECTOR_URL: postgresql://corecube:changeme123@pgvector:5432/corecube
    volumes:
      - corecube-data:/data
    depends_on:
      - pgvector
  pgvector:
    image: pgvector/pgvector:pg16
    environment:
      POSTGRES_USER: corecube
      POSTGRES_PASSWORD: changeme123
      POSTGRES_DB: corecube
    volumes:
      - pgvector-data:/var/lib/postgresql/data
volumes:
  corecube-data:
  pgvector-data:
```
Start the stack:

```bash
docker compose up -d
```
CoreCube will be available at http://localhost:7400/admin.
:::warning Change default credentials

Change the default email and password immediately after first login. Go to your profile icon → Profile → update your password.

:::
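The compose file above hard-codes `changeme123` in several places. Docker Compose interpolates variables from a `.env` file in the project directory, so a sketch like the following keeps real secrets out of the committed file (variable names match the compose example; values are placeholders):

```
# .env (same directory as docker-compose.yml; exclude from version control)
CUBE_ADMIN_PASSWORD=a-strong-unique-password
POSTGRES_PASSWORD=another-strong-password
```

Then reference them in `docker-compose.yml` as `${CUBE_ADMIN_PASSWORD}` and `${POSTGRES_PASSWORD}`, and update the password embedded in `PGVECTOR_URL` to match.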
## 2. Configure an LLM provider
- Open the Admin Console at http://localhost:7400/admin
- Navigate to LLM Providers
- Add a provider — for example, Anthropic Claude:
  - Provider: Anthropic
  - API Key: your Anthropic API key
  - Model: `claude-sonnet-4-5` (or your preferred model)
- Save and test the connection
## 3. Add your first connection
- Navigate to Connections
- Click New Connection
- Select a source (e.g., Confluence, Jira, or File Upload)
- Configure authentication and source filtering
- Assign a compartment and sensitivity level
- Save and click Sync Now
CoreCube will begin ingesting, chunking, and embedding your documents in the background.
## 4. Create an API key
- Navigate to API Keys
- Click New API Key
- Give it a name (e.g., "OpenWebUI")
- Assign a scope that includes the connections you want the client to access
- Copy the key — it is shown only once
## 5. Connect an AI client

### OpenWebUI
In OpenWebUI settings, configure a new OpenAI-compatible connection:
- API Base URL: http://localhost:7400/v1
- API Key: your CoreCube API key
Select any model from the list — CoreCube serves its configured LLM providers through a unified model listing.
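To check the same model listing from outside the UI, the endpoint should also be reachable over HTTP. A minimal sketch using Python's standard library; note that the `GET /v1/models` path is an assumption based on the OpenAI-compatible surface, not something this guide confirms:

```python
import urllib.request

# Assumption: CoreCube mirrors the OpenAI-compatible GET /v1/models endpoint.
req = urllib.request.Request(
    "http://localhost:7400/v1/models",
    headers={"Authorization": "Bearer cc_YOUR_API_KEY"},  # your CoreCube API key
)

# Uncomment once the stack is running to print the available models:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```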
### Claude Desktop

Add CoreCube as an MCP server in your Claude Desktop configuration (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "corecube": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-client-sse", "http://localhost:7400/mcp"],
      "env": {
        "AUTHORIZATION": "Bearer cc_YOUR_API_KEY"
      }
    }
  }
}
```
### Direct API
```bash
curl http://localhost:7400/v1/chat/completions \
  -H "Authorization: Bearer cc_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{"role": "user", "content": "What do our deployment runbooks say about rollbacks?"}],
    "stream": false
  }'
```
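The same call can be made from Python with no extra dependencies. A sketch that builds the request with the standard library, taking the endpoint and payload from the curl example above (the response-parsing line assumes the usual OpenAI-style `choices` shape):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, question: str) -> urllib.request.Request:
    """Build a chat completion request against CoreCube's OpenAI-compatible API."""
    payload = {
        "messages": [{"role": "user", "content": question}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "http://localhost:7400",
    "cc_YOUR_API_KEY",
    "What do our deployment runbooks say about rollbacks?",
)

# With the stack running, send it and print the assistant's reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```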
## 6. Explore query results
Use the Query Explorer in the Admin Console to inspect how CoreCube retrieves and ranks context for any query:
- Score breakdown per chunk (vector, FTS, freshness, combined)
- Which connections were included or excluded
- Context assembly trace (which chunks were selected, filtered, deduplicated)
## Next steps
- Configure connectors to automate knowledge ingestion
- Set up access control with compartments and scopes
- Configure LLM routing with fallback providers
- Review the API reference for all available endpoints