Quickstart

Get CoreCube running with Docker Compose and connect your first AI client.

Prerequisites

  • Docker Desktop or Docker Engine with Compose
  • An LLM API key (Anthropic or OpenAI), or a self-hosted model endpoint

1. Start CoreCube

Create a docker-compose.yml:

```yaml
services:
  corecube:
    image: registry.arantic.cloud/corecube/corecube:latest
    ports:
      - '7400:7400'
    environment:
      CUBE_ADMIN_EMAIL: admin@example.com
      CUBE_ADMIN_PASSWORD: changeme123
      PGVECTOR_URL: postgresql://corecube:changeme123@pgvector:5432/corecube
    volumes:
      - corecube-data:/data
    depends_on:
      - pgvector

  pgvector:
    image: pgvector/pgvector:pg16
    environment:
      POSTGRES_USER: corecube
      POSTGRES_PASSWORD: changeme123
      POSTGRES_DB: corecube
    volumes:
      - pgvector-data:/var/lib/postgresql/data

volumes:
  corecube-data:
  pgvector-data:
```

Start the stack:

```shell
docker compose up -d
```

CoreCube will be available at http://localhost:7400/admin.

:::warning Change default credentials
Change the default email and password immediately after first login. Go to your profile icon → Profile → update your password.
:::

2. Configure an LLM provider

  1. Open the Admin Console at http://localhost:7400/admin
  2. Navigate to LLM Providers
  3. Add a provider — for example, Anthropic Claude:
    • Provider: Anthropic
    • API Key: your Anthropic API key
    • Model: claude-sonnet-4-5 (or your preferred model)
  4. Save and test the connection

3. Add your first connection

  1. Navigate to Connections
  2. Click New Connection
  3. Select a source (e.g., Confluence, Jira, or File Upload)
  4. Configure authentication and source filtering
  5. Assign a compartment and sensitivity level
  6. Save and click Sync Now

CoreCube will begin ingesting, chunking, and embedding your documents in the background.
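Conceptually, the chunking stage splits each document into overlapping windows so that context isn't lost at chunk boundaries. A minimal, purely illustrative sketch — character-based windows, with sizes and overlap chosen for the example, not CoreCube's actual parameters:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character chunks with overlap.

    Illustrative only: a real ingestion pipeline typically chunks on
    token or sentence boundaries; these parameters are assumptions.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # each window starts this far after the last
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk is then embedded and stored in pgvector alongside its source metadata, which is what makes the per-chunk scores in the Query Explorer (step 6) possible.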

4. Create an API key

  1. Navigate to API Keys
  2. Click New API Key
  3. Give it a name (e.g., "OpenWebUI")
  4. Assign a scope that includes the connections you want the client to access
  5. Copy the key — it is shown only once

5. Connect an AI client

OpenWebUI

In OpenWebUI settings, configure a new OpenAI-compatible connection:

  • API Base URL: http://localhost:7400/v1
  • API Key: your CoreCube API key

Select any model from the list — CoreCube serves its configured LLM providers through a unified model listing.
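If the model list doesn't appear in OpenWebUI, you can sanity-check the base URL and key outside the client: the listing comes from the standard OpenAI-compatible `/v1/models` endpoint. This stdlib sketch builds (but does not send) that request so you can inspect the URL and auth header it would use; send it with `urllib.request.urlopen` once the stack is running:

```python
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Prepare a GET /v1/models request for CoreCube's
    OpenAI-compatible API (not sent here)."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# Same base URL and key as the OpenWebUI connection above
req = build_models_request("http://localhost:7400/v1", "cc_YOUR_API_KEY")
```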

Claude Desktop

Add CoreCube as an MCP server in your Claude Desktop configuration (claude_desktop_config.json):

```json
{
  "mcpServers": {
    "corecube": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-client-sse", "http://localhost:7400/mcp"],
      "env": {
        "AUTHORIZATION": "Bearer cc_YOUR_API_KEY"
      }
    }
  }
}
```

Direct API

```shell
curl http://localhost:7400/v1/chat/completions \
  -H "Authorization: Bearer cc_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{"role": "user", "content": "What do our deployment runbooks say about rollbacks?"}],
    "stream": false
  }'
```
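The same call can be made from Python with only the standard library. This sketch builds a request matching the curl example (it is not sent here; pass it to `urllib.request.urlopen` when the stack is running):

```python
import json
import urllib.request

def chat_request(base_url: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat completion request against
    CoreCube's OpenAI-compatible endpoint."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = chat_request(
    "http://localhost:7400/v1",
    "cc_YOUR_API_KEY",
    "What do our deployment runbooks say about rollbacks?",
)
```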

6. Explore query results

Use the Query Explorer in the Admin Console to inspect how CoreCube retrieves and ranks context for any query:

  • Score breakdown per chunk (vector, FTS, freshness, combined)
  • Which connections were included or excluded
  • Context assembly trace (which chunks were selected, filtered, deduplicated)
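The combined score blends the individual signals into a single ranking value. As a purely illustrative sketch — the weights and the weighted-sum formula here are assumptions, not CoreCube's actual ranking logic:

```python
def combined_score(vector: float, fts: float, freshness: float,
                   weights: tuple[float, float, float] = (0.6, 0.3, 0.1)) -> float:
    """Blend per-chunk signals into one ranking score.

    Assumes each input is already normalized to [0, 1]; the weights
    are illustrative, not CoreCube's real ones.
    """
    wv, wf, wr = weights
    return wv * vector + wf * fts + wr * freshness

# Rank two hypothetical chunks by blended score
ranked = sorted(
    [("chunk-a", combined_score(0.9, 0.2, 0.5)),
     ("chunk-b", combined_score(0.4, 0.9, 0.9))],
    key=lambda kv: kv[1],
    reverse=True,
)
```

Reading the per-signal columns in the Query Explorer against the combined score tells you which signal drove a chunk's rank — here, chunk-a wins on vector similarity despite weaker full-text and freshness scores.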

Next steps