Your data. Your database.
Full SQL access.

Every Deepline workspace includes a dedicated Neon PostgreSQL database. Your enrichment data lives in your schema, queryable with any SQL client, exportable anytime. No vendor lock-in.

$0
export fees, forever
5+1
normalized schemas + custom
psql
direct connection, any client

Hard facts

What's included

Every workspace gets a real PostgreSQL database. Not a shared multi-tenant store. Not a proprietary data format.

| Attribute | Value |
| --- | --- |
| Database engine | Neon PostgreSQL (serverless) |
| Included | Free with every workspace |
| Isolation | Dedicated Neon project per tenant |
| Schemas | 5 managed + 1 custom (dl_cache, dl_override, dl_graph, dl_resolved, dl_meta, tenant_custom) |
| SQL access | Direct psql / DBeaver / Metabase / any Postgres client |
| Export fees | $0 (pg_dump, COPY TO, direct query) |
| Data residency | Your Neon project, your data |
| Retention | Unlimited (your database, no auto-deletion) |
| vs Clay | Data locked in Clay platform, no SQL, no database |
| vs Apollo | Data locked in Apollo, limited export credits |

Why it matters

What an included database means for your GTM data

Enrichment data ownership

Query your enrichment data with any SQL tool. Connect Metabase, Looker, Grafana, or write raw SQL. No vendor SDK required, no API pagination, no row limits. Your data is a pg_dump away from full portability.

GTM data warehouse

Join enrichment data with CRM exports, analytics tables, or custom datasets. The tenant_custom schema gives you full DDL privileges: create your own tables, indexes, and views alongside your enrichment data.
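
For instance, a custom segmentation table might look like this (the table name and columns are illustrative, not part of the managed schemas):

sql
-- Illustrative only: a custom table you might create in tenant_custom
CREATE TABLE tenant_custom.account_segments (
  company_entity_id uuid PRIMARY KEY,
  segment           text NOT NULL,
  score             numeric
);
CREATE INDEX ON tenant_custom.account_segments (segment);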

Compliance and data sovereignty

Your data stays in your dedicated Neon project. Physically isolated compute and storage, no shared multi-tenant database. Delete anytime with standard PostgreSQL tools. No "please contact sales" to export your records.

Not ideal for ephemeral lookups

If you only need one-off lookups piped straight into a CRM or spreadsheet, you may never touch the database directly. The CLI and API return results inline. The database becomes valuable when you build on top of enrichment data over time.

Architecture

Schema architecture

Raw enrichments land in dl_cache, the identity graph links them to resolved entities, manual corrections go in dl_override, and coalesced views surface in dl_resolved.
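
That precedence can be sketched as a simplified query (illustrative only; the real dl_resolved views are platform-managed, and the entity_id join column here is an assumption):

sql
-- Sketch: an override row, when present, wins over the cached row
SELECT c.entity_id,
       COALESCE(o.doc, c.doc) AS doc
FROM dl_cache.enrichment_event c
LEFT JOIN dl_override.custom_enrichment_event o
  ON o.entity_id = c.entity_id;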

dl_resolved: Coalesced views

Merged, deduplicated records from all providers. Query this 90% of the time.

Tables: resolved_people, resolved_companies, coalesced_enrichment_event

dl_cache: Enrichment cache

Raw responses from every provider call. Every enrichment you have ever run is stored here.

Tables: enrichment_event

dl_graph: Identity graph

Cross-provider deduplication. Links enrichment events to resolved person/company entities.

Tables: entities, adoptions, identifier_memberships

dl_override: Manual overrides

Human corrections that take precedence over cached data in dl_resolved views.

Tables: custom_enrichment_event

dl_meta: Migrations & settings

Schema versioning and tenant configuration. Platform-managed, no manual intervention.

Tables: schema_migrations, tenant_settings
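
You can confirm this layout from any client with a standard information_schema query:

sql
-- List the managed schemas plus your custom schema
SELECT schema_name
FROM information_schema.schemata
WHERE schema_name LIKE 'dl\_%' OR schema_name = 'tenant_custom'
ORDER BY schema_name;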

Examples

What you can do with SQL access

Queries the vendor UI never anticipated. No API pagination, no row limits, no export queue.

Find all enriched people at a company

sql
SELECT entity_id, doc->>'full_name' AS name,
       doc->>'email' AS email, doc->>'title' AS title
FROM dl_resolved.resolved_people
WHERE doc->>'company_domain' = 'stripe.com'
ORDER BY updated_at DESC;

Email verification rate by provider

sql
SELECT doc->>'source_provider' AS provider,
       COUNT(*) AS total,
       COUNT(*) FILTER (WHERE doc->>'email_status' = 'valid') AS verified,
       ROUND(100.0 * COUNT(*) FILTER (WHERE doc->>'email_status' = 'valid') / COUNT(*), 1) AS pct
FROM dl_cache.enrichment_event
GROUP BY 1 ORDER BY pct DESC;

Export all resolved contacts to CSV

sql
-- \copy is a psql meta-command and must fit on a single line
\copy (SELECT doc->>'full_name', doc->>'email', doc->>'company_name', doc->>'title' FROM dl_resolved.resolved_people) TO 'contacts.csv' WITH CSV HEADER

Join enrichment data with your own segments

sql
SELECT p.doc->>'full_name' AS name,
       p.doc->>'email' AS email,
       s.segment, s.score
FROM dl_resolved.resolved_people p
JOIN tenant_custom.account_segments s
  ON s.company_entity_id = (p.doc->>'company_entity_id')::uuid
WHERE s.segment = 'enterprise'
ORDER BY s.score DESC;

Side-by-side

Data ownership comparison

Where does your enrichment data actually live?

| Feature | Deepline | Clay | Apollo | ZoomInfo |
| --- | --- | --- | --- | --- |
| Included database | Neon PostgreSQL (dedicated project) | None | None | None |
| Direct SQL access | Any Postgres client (psql, DBeaver, Metabase) | No, UI-only | No, API with rate limits | No, UI + limited API |
| Export cost | $0 (pg_dump, COPY TO) | CSV download only | Export credits (limited per tier) | Export credits (limited per contract) |
| Data isolation | Dedicated Neon project per tenant | Shared platform | Shared platform | Shared platform |
| Custom tables | Full DDL in tenant_custom schema | No | No | No |
| BI tool connection | Direct Postgres connection | No direct connection | No direct connection | No direct connection |

Common questions

FAQ

Is the database really free?

Yes. Every Deepline workspace includes a dedicated Neon PostgreSQL project at no additional cost. There are no storage fees, no export fees, and no per-query charges from Deepline. Neon's serverless architecture scales storage automatically.

Can I connect Metabase, Looker, or Grafana directly?

Yes. Request a read connection URI via the API and configure your BI tool with the host, port, database, username, and password. SSL is required. The dl_resolved views are designed as the primary query surface for reporting.
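
The same URI works from the command line; for example, with psql (host, user, and password below are placeholders):

bash
psql "postgresql://readonly_user:PASSWORD@HOST:5432/DBNAME?sslmode=require"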

Is my data shared with other tenants?

No. Each workspace gets a dedicated Neon project with separate compute, separate storage, and separate connection endpoints. There is no row-level security or shared-schema multi-tenancy.

Can I create my own tables?

Yes, in the tenant_custom schema. The override role has full DDL and DML privileges there. Create tables, indexes, functions, and views alongside your enrichment data. The 5 managed schemas are platform-managed and should not be modified directly.
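
A common pattern is a reporting view in tenant_custom layered over the managed views (a sketch; the email_status field on resolved_people is assumed here for illustration):

sql
-- Illustrative: a custom view over the managed dl_resolved schema
CREATE VIEW tenant_custom.verified_people AS
SELECT entity_id,
       doc->>'full_name' AS name,
       doc->>'email'     AS email
FROM dl_resolved.resolved_people
WHERE doc->>'email_status' = 'valid';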

What happens if I leave Deepline?

Run pg_dump against your Neon project and take everything with you. There are no export restrictions, no lock-in period, and no fees. Your data is standard PostgreSQL; it works anywhere Postgres runs.
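
A minimal sketch (connection string values are placeholders):

bash
# Dump the entire workspace database into a portable archive
pg_dump "postgresql://USER:PASSWORD@HOST:5432/DBNAME?sslmode=require" \
  --format=custom --file=deepline_backup.dump
# Restore anywhere Postgres runs
pg_restore --dbname=postgresql://localhost/restored_db deepline_backup.dump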

Try Deepline in 30 seconds

Install the CLI, enrich your first contact, and query the results in your own PostgreSQL database.

bash
curl -s "https://code.deepline.com/api/v2/cli/install" | bash
Learn more about Deepline →