🧱 Databricks Connection
Connect your Databricks lakehouse to Meza AI.
Overview
Connect your Databricks workspace to Meza AI to query data from your lakehouse. This integration uses Databricks SQL warehouses for efficient query execution against Delta Lake tables.
Prerequisites
- Databricks workspace (AWS, Azure, or GCP)
- SQL warehouse (serverless or pro) enabled
- Personal access token with SQL permissions
- Catalog and schema access configured
Creating an Access Token
Go to User Settings
In Databricks, click your username in the top-right → User Settings.
Open Access Tokens
Go to the Developer → Access Tokens tab.
Generate Token
Click Generate New Token, enter a comment like "Meza AI", and set an expiration.
Copy Token
Copy the token immediately — you won't be able to see it again!
Finding Connection Details
To find your SQL warehouse connection details:
- Go to SQL Warehouses in your Databricks workspace
- Select your warehouse
- Click Connection details tab
- Copy the Server Hostname and HTTP Path
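Before pasting these values into Meza AI, it can help to sanity-check their shape. The sketch below (the function name and exact messages are our own, not part of Meza AI) checks the three rules implied by the table and note later on this page: the hostname carries no `https://` prefix, the HTTP path starts with `/sql/1.0/warehouses/`, and personal access tokens begin with `dapi`.

```python
import re

def validate_connection_details(server_hostname: str, http_path: str, token: str) -> list:
    """Return a list of problems found in the pasted connection details (empty = looks OK)."""
    problems = []
    # Server Hostname should be a bare hostname, not a full URL.
    if server_hostname.startswith(("https://", "http://")):
        problems.append("Server Hostname must not include the https:// prefix")
    elif not re.fullmatch(r"[A-Za-z0-9.-]+", server_hostname):
        problems.append("Server Hostname should contain only the domain, no path or port")
    # SQL warehouse endpoints follow the /sql/1.0/warehouses/<id> pattern.
    if not http_path.startswith("/sql/1.0/warehouses/"):
        problems.append("HTTP Path should start with /sql/1.0/warehouses/")
    # Databricks personal access tokens begin with "dapi".
    if not token.startswith("dapi"):
        problems.append('Access Token should start with "dapi"')
    return problems

# With the placeholder values from the table below, no problems are reported:
print(validate_connection_details(
    "dbc-abc123.cloud.databricks.com",
    "/sql/1.0/warehouses/abc123def",
    "dapi0000",
))  # → []
```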
Connection Steps
Navigate to Databases
Go to Configuration → Databases in the left sidebar.
Select Databricks
Click Add Connection and select Databricks.
Enter Connection Details
Enter server hostname, HTTP path, and your access token.
Configure Catalog
Enter your Unity Catalog name and default schema.
Test & Save
Click Test Connection then Save.
Connection Parameters
| Parameter | Description | Example |
|---|---|---|
| Server Hostname | Workspace URL without https:// | dbc-abc123.cloud.databricks.com |
| HTTP Path | SQL warehouse endpoint path | /sql/1.0/warehouses/abc123def |
| Access Token | Personal access token | dapi... |
| Catalog | Unity Catalog name | main |
| Schema | Default schema | default |
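If Test Connection fails and you want to rule out Meza AI itself, you can exercise the same five parameters directly with the `databricks-sql-connector` package. This is a sketch under the assumption that your connector version accepts `catalog` and `schema` keyword arguments (recent releases do); the helper names are ours.

```python
def meza_databricks_params(hostname: str, http_path: str, token: str,
                           catalog: str = "main", schema: str = "default") -> dict:
    """Collect the five values the connection form asks for, keyed as the connector expects."""
    return {
        "server_hostname": hostname,
        "http_path": http_path,
        "access_token": token,
        "catalog": catalog,
        "schema": schema,
    }

def smoke_test(params: dict):
    """Run SELECT 1 against the warehouse. Requires: pip install databricks-sql-connector"""
    from databricks import sql  # imported here so meza_databricks_params has no dependency
    with sql.connect(**params) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            return cur.fetchall()
```

If `smoke_test` succeeds from your machine but Meza AI's test fails, the problem is likely network access from Meza AI to your workspace rather than the credentials.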
💡 Note
The HTTP Path for a SQL warehouse always begins with /sql/1.0/warehouses/.
Required Permissions
The access token user needs these permissions:
- Can Use permission on the SQL warehouse
- USE CATALOG on the Unity Catalog
- USE SCHEMA on target schemas
- SELECT on tables you want to query
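A workspace admin can grant the catalog, schema, and table privileges with Unity Catalog GRANT statements. The sketch below generates one plausible set of statements for the list above; it grants SELECT at the schema level for brevity (you can grant per table instead), and the Can Use permission on the warehouse is set in the warehouse UI, not via SQL. The function name is ours.

```python
def grant_statements(principal: str, catalog: str = "main", schema: str = "default") -> list:
    """Unity Catalog GRANTs matching the permission list above (Can Use is granted in the UI)."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`;",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{principal}`;",
        # Granting SELECT on the schema covers all its tables; grant per table to narrow access.
        f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{principal}`;",
    ]

for stmt in grant_statements("meza-service@example.com"):
    print(stmt)
```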
Troubleshooting
Connection Timeout
- Check if the SQL warehouse is running (it may be stopped)
- Verify the server hostname is correct
- Ensure network connectivity to Databricks
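To check whether the warehouse is running without opening the Databricks UI, you can query the SQL Warehouses REST API (`GET /api/2.0/sql/warehouses/{id}`), which reports a `state` such as `RUNNING` or `STOPPED`. A minimal sketch, assuming your token can read the warehouse; the helper names are ours, and the warehouse ID is the last segment of the HTTP Path:

```python
import json
import urllib.request

def warehouse_id_from_http_path(http_path: str) -> str:
    """Extract the warehouse ID, e.g. '/sql/1.0/warehouses/abc123def' -> 'abc123def'."""
    return http_path.rstrip("/").rsplit("/", 1)[-1]

def warehouse_state(hostname: str, token: str, warehouse_id: str) -> str:
    """Ask the Databricks REST API for the warehouse's current state."""
    req = urllib.request.Request(
        f"https://{hostname}/api/2.0/sql/warehouses/{warehouse_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["state"]  # e.g. "RUNNING", "STARTING", "STOPPED"
```

A stopped serverless or pro warehouse starts automatically on the first query, so a timeout on the very first request may simply mean the warehouse is still starting.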
Authentication Error
- Verify the access token hasn't expired
- Check the token has required permissions
- Generate a new token if needed