Database
OntoKit uses PostgreSQL as its primary database with SQLAlchemy 2.0 (async) as the ORM and Alembic for migrations.
The stack uses a single PostgreSQL instance shared between Zitadel and OntoKit:
```
┌─────────────────────────────────────────────────────────────────┐
│                     PostgreSQL (port 5432)                      │
├─────────────────────────────┬───────────────────────────────────┤
│  Database: zitadel          │  Database: ontokit                │
│  User: zitadel              │  User: ontokit                    │
│  Purpose: Identity data     │  Purpose: Application data        │
└─────────────────────────────┴───────────────────────────────────┘
                                    │
                                    ▼
                  ┌─────────────────┐     ┌─────────────────┐
                  │     FastAPI     │────>│   SQLAlchemy    │
                  │     (async)     │     │    (asyncpg)    │
                  └─────────────────┘     └─────────────────┘
```
The databases are created automatically by scripts/init-db.sh when the PostgreSQL container first starts.
The database connection is configured in ontokit/core/database.py:
```python
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession

engine = create_async_engine(
    str(settings.database_url),
    echo=settings.debug,   # Log SQL queries in debug mode
    pool_pre_ping=True,    # Verify connections before use
)
```

| Mode | Connection URL |
|---|---|
| Full Docker | postgresql+asyncpg://ontokit:ontokit@postgres:5432/ontokit |
| Hybrid | postgresql+asyncpg://ontokit:ontokit@localhost:5432/ontokit |
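Later sections on this page use Base, async_session_maker, and get_db from the same module. A minimal sketch of how ontokit/core/database.py likely provides them, assuming the standard SQLAlchemy 2.0 async setup; the actual file may differ in detail:

```python
# Sketch only -- assumes the usual SQLAlchemy 2.0 async boilerplate
from collections.abc import AsyncGenerator

from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker
from sqlalchemy.orm import DeclarativeBase


class Base(DeclarativeBase):
    """Declarative base that all models inherit from."""


# Session factory bound to the engine created above
async_session_maker = async_sessionmaker(engine, expire_on_commit=False)


async def get_db() -> AsyncGenerator[AsyncSession, None]:
    """FastAPI dependency that yields one session per request."""
    async with async_session_maker() as session:
        yield session
```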
Models are defined in ontokit/models/ using SQLAlchemy 2.0's declarative style:
```python
# ontokit/models/project.py
import uuid
from datetime import datetime

from sqlalchemy import Boolean, DateTime, String, Text, func
from sqlalchemy.orm import Mapped, mapped_column

from ontokit.core.database import Base


class Project(Base):
    __tablename__ = "projects"

    id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    name: Mapped[str] = mapped_column(String(255), nullable=False)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    is_public: Mapped[bool] = mapped_column(Boolean, default=False)
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now()
    )
```

| Model | Table | Description |
|---|---|---|
| Project | projects | Project container for ontologies |
| ProjectMember | project_members | Project membership and roles |
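For comparison, ProjectMember likely follows the same declarative pattern. This is purely an illustrative sketch; the real column names and types live in ontokit/models/ and may differ:

```python
# Hypothetical sketch -- column names here are illustrative, not verbatim
import uuid

from sqlalchemy import ForeignKey, String
from sqlalchemy.orm import Mapped, mapped_column

from ontokit.core.database import Base


class ProjectMember(Base):
    __tablename__ = "project_members"

    id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    project_id: Mapped[uuid.UUID] = mapped_column(ForeignKey("projects.id"))
    user_id: Mapped[str] = mapped_column(String(255))  # e.g. the Zitadel user ID (assumed)
    role: Mapped[str] = mapped_column(String(50))       # e.g. "owner" or "editor" (assumed)
```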
After modifying models, generate a migration:
```bash
alembic revision --autogenerate -m "Description of changes"
```

This creates a new file in alembic/versions/.
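The generated file follows Alembic's standard layout: a module docstring, revision identifiers, and upgrade()/downgrade() functions. Roughly like this (the revision IDs and the operation shown are illustrative, not taken from the actual project):

```python
"""Description of changes

Revision ID: 3f2a1c9d8b7e
Revises: 1a2b3c4d5e6f
"""
import sqlalchemy as sa
from alembic import op

# Revision identifiers used by Alembic (values here are illustrative)
revision = "3f2a1c9d8b7e"
down_revision = "1a2b3c4d5e6f"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Whatever autogenerate detected, e.g. a new column
    op.add_column("projects", sa.Column("archived", sa.Boolean(), nullable=True))


def downgrade() -> None:
    # The reverse operation, so the migration can be rolled back
    op.drop_column("projects", "archived")
```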
Always review auto-generated migrations before applying:
```bash
# View the generated SQL without applying
alembic upgrade head --sql
```

To apply migrations:

```bash
# Apply all pending migrations
alembic upgrade head

# Apply specific migration
alembic upgrade <revision_id>
```

If using Full Docker mode:
```bash
# Run migrations inside the API container
docker compose exec api alembic upgrade head

# Or generate a new migration
docker compose exec api alembic revision --autogenerate -m "Add new table"
```

To roll back migrations:

```bash
# Rollback one migration
alembic downgrade -1

# Rollback to specific revision
alembic downgrade <revision_id>

# Rollback all migrations
alembic downgrade base
```

To check the current revision and migration history:

```bash
# Show current revision
alembic current

# Show all revisions
alembic history

# Show pending migrations
alembic history --indicate-current
```

Auto-generate doesn't catch everything. Review for:
- Dropped columns (may need data migration)
- Index changes
- Constraint modifications
For example, to add an index by hand:

```python
# In your migration file
def upgrade():
    op.create_index('ix_projects_owner_id', 'projects', ['owner_id'])
```

For data transformations, use op.execute():
```python
def upgrade():
    # Add column
    op.add_column('projects', sa.Column('slug', sa.String(255)))

    # Populate existing rows
    op.execute("UPDATE projects SET slug = lower(replace(name, ' ', '-'))")

    # Make non-nullable
    op.alter_column('projects', 'slug', nullable=False)
```

Test the migration in both directions before committing it:

```bash
# Apply
alembic upgrade head

# Verify
alembic current

# Rollback
alembic downgrade -1

# Re-apply
alembic upgrade head
```

To use the database from scripts or other code outside FastAPI, open a session with async_session_maker:

```python
from sqlalchemy import select
from ontokit.core.database import async_session_maker
from ontokit.models.project import Project

# (run inside an async function / event loop)
async with async_session_maker() as session:
    # Query
    result = await session.execute(
        select(Project).where(Project.is_public == True)
    )
    projects = result.scalars().all()

    # Insert
    project = Project(name="New Project", is_public=True)
    session.add(project)
    await session.commit()
```

In API endpoints, use dependency injection:
```python
from fastapi import Depends
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from ontokit.core.database import get_db
from ontokit.models.project import Project


@router.get("/projects")
async def list_projects(db: AsyncSession = Depends(get_db)):
    result = await db.execute(select(Project))
    return result.scalars().all()
```
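The same get_db dependency works for writes. A sketch of a hypothetical create endpoint; the request schema and its fields are illustrative, not part of the documented API:

```python
from fastapi import Depends
from pydantic import BaseModel
from sqlalchemy.ext.asyncio import AsyncSession

from ontokit.core.database import get_db
from ontokit.models.project import Project


class ProjectCreate(BaseModel):
    # Illustrative request body
    name: str
    is_public: bool = False


@router.post("/projects")
async def create_project(data: ProjectCreate, db: AsyncSession = Depends(get_db)):
    project = Project(name=data.name, is_public=data.is_public)
    db.add(project)
    await db.commit()
    await db.refresh(project)  # reload server-generated fields such as created_at
    return project
```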
To inspect the database directly, connect with psql:

```bash
# Connect to the Docker container
docker exec -it ontokit-postgres psql -U ontokit -d ontokit

# Or connect directly (if psql is installed locally)
psql -h localhost -p 5432 -U ontokit -d ontokit

# For debugging Zitadel issues
docker exec -it ontokit-postgres psql -U zitadel -d zitadel
```

Useful commands inside psql:

```sql
-- List tables
\dt
-- Describe table
\d projects
-- List indexes
\di
-- Query data
SELECT * FROM projects LIMIT 10;
-- List all databases
\l
-- Switch database
\c ontokit
```

Check that PostgreSQL is running:

```bash
docker compose ps postgres
docker compose logs postgres
```

Verify the database was initialized correctly:
```bash
# Check if both databases exist
docker exec ontokit-postgres psql -U postgres -c "\l"
```

If the databases aren't created, the init script may not have run. Check:
```bash
# View init script logs
docker compose logs postgres | grep init

# Manually run the init script (if needed)
docker exec ontokit-postgres bash /docker-entrypoint-initdb.d/init-db.sh
```

If migrations are out of sync:
```bash
# Check current state
alembic current

# Stamp the database with a specific revision
alembic stamp <revision_id>
```

To completely reset the database:
```bash
# Stop and remove the container with its volume
docker compose down -v

# Restart
docker compose up -d postgres

# Wait for it to be ready
docker compose logs -f postgres

# Re-run migrations (Hybrid mode)
alembic upgrade head

# Or in Full Docker mode
docker compose exec api alembic upgrade head
```

Warning: This will also reset Zitadel's database. You'll need to reconfigure Zitadel applications.
To reset only the OntoKit database while keeping Zitadel:
```bash
# Connect and drop/recreate the database
docker exec ontokit-postgres psql -U postgres -c "DROP DATABASE ontokit;"
docker exec ontokit-postgres psql -U postgres -c "CREATE DATABASE ontokit OWNER ontokit;"

# Re-run migrations
alembic upgrade head
```

- Development - Development workflow
- API Reference - API endpoints