- Works with cloud and on-prem DBs
- Public and private network patterns
- Firewall and port checklist included
- Create a read-only DB user for DataTwin.
- Allow inbound access from your DataTwin egress IP or security group.
- Open only the DB port needed for your engine.
- Enable TLS/SSL if your DB supports it.
- Add the connection in Dashboard -> Databases and run a test query.
- Keep the DB private, with no public internet exposure.
- Use one of: VPN, VPC peering, PrivateLink, SSH tunnel, or bastion relay.
- Restrict traffic by source CIDR, destination port, and protocol.
- Apply secret rotation and least-privilege role policies.
- Track access with DB audit logs and network flow logs.
Open only the ports you use, and keep the source IP restricted to your DataTwin path.
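Before touching firewall rules, it helps to confirm basic reachability. A minimal sketch of a TCP probe using only the Python standard library (the hostname below is a placeholder, not a real DataTwin endpoint):

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe a PostgreSQL endpoint (placeholder hostname)
# tcp_reachable("db.example.internal", 5432)
```

If this returns False from the DataTwin network path but True from inside your VPC, the block is almost certainly a firewall or security-group rule rather than the database itself.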
| Engine | Default Port | Protocol | Recommended Security |
|---|---|---|---|
| PostgreSQL | 5432 | TCP | TLS + IP allowlist + read-only user |
| MySQL / MariaDB | 3306 | TCP | TLS + security group rules |
| SQL Server | 1433 | TCP | TLS + AD/SQL auth hardening |
| Oracle | 1521 | TCP | TLS + wallet/cert validation |
| MongoDB | 27017 | TCP | TLS + auth enabled + IP scope |
| Redis (managed) | 6379 | TCP | TLS + auth token + private network |
| Snowflake | 443 | HTTPS | Network policy + role-based access |
| BigQuery | 443 | HTTPS | Service account least privilege |
Inbound
- Allow source: DataTwin egress CIDR (or private tunnel CIDR).
- Allow destination: DB host only (not the full subnet, where avoidable).
- Allow destination port: exact DB port only.
- Allow protocol: TCP only.
Outbound
- Allow DataTwin service to reach DB endpoint.
- Allow DNS and certificate validation endpoints if TLS is used.
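The outbound DNS rule can be sanity-checked before the TCP rule is opened: if the DB endpoint does not resolve from the DataTwin path, no port rule will help. A small sketch (the hostname is again a placeholder):

```python
import socket

def resolve_endpoint(host: str) -> list[str]:
    """Resolve a DB endpoint to its IP addresses, verifying outbound DNS works."""
    addresses = []
    for info in socket.getaddrinfo(host, None):
        ip = info[4][0]
        if ip not in addresses:  # deduplicate while preserving order
            addresses.append(ip)
    return addresses

# Example: resolve a placeholder DB hostname before opening the TCP rule
# resolve_endpoint("db.example.internal")
```

The resolved IPs are also what you should scope the destination side of your firewall rule to, rather than a whole subnet.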
Use this flow when a connection fails.
1) Confirm hostname and port
2) Confirm username, password, database name
3) Test network reachability from DataTwin path
4) Check firewall/security group deny logs
5) Verify TLS mode (require/verify-ca/verify-full)
6) Run a simple query: SELECT 1
7) Check DB audit logs for reject reason
Every API call from your backend requires two headers. Both are generated by DataTwin — you store them securely and send them with every request.
- Log in to the DataTwin dashboard.
- Go to API Keys in the sidebar.
- Click Generate API Key.
- Copy the key immediately — it is shown only once.
Authorization: Bearer dt_org_xxxxxxxxxxxxxxxxxxxx
This identifies your organization. Store it in your backend environment as DATATWIN_API_KEY.
- Go to API Keys → Registered Applications.
- Click Register Application.
- Fill in your app name and optionally set rate limits.
- Copy the App Token shown in the modal — it is shown only once.
X-App-Token: app_xxxxxxxxxxxxxxxxxxxx
This identifies which of your applications is calling the API. Store it as DATATWIN_APP_TOKEN.
Use both headers on every request. Never expose these values in client-side code.
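One way to enforce this in a backend is a small fail-fast helper that builds both headers from environment variables and refuses to start if either is missing (the helper name is illustrative):

```python
import os

def auth_headers() -> dict:
    """Build the two required DataTwin headers, failing fast if either is unset."""
    api_key = os.getenv("DATATWIN_API_KEY")
    app_token = os.getenv("DATATWIN_APP_TOKEN")
    if not api_key or not app_token:
        raise RuntimeError(
            "Set DATATWIN_API_KEY and DATATWIN_APP_TOKEN in the backend environment"
        )
    return {
        "Authorization": f"Bearer {api_key}",
        "X-App-Token": app_token,
        "Content-Type": "application/json",
    }
```

Failing at startup is preferable to discovering a missing token as a 403 in production.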
Python
import os

import requests

API_BASE = "https://datatwinai.com"

headers = {
    "Authorization": f"Bearer {os.getenv('DATATWIN_API_KEY')}",
    "X-App-Token": os.getenv('DATATWIN_APP_TOKEN'),
    "Content-Type": "application/json",
}

# Connect a database (capture the response so failures are visible)
resp = requests.post(f"{API_BASE}/api/v1/database/connect", headers=headers, json={
    "connection_string": "postgresql://user:pass@host/dbname"
})
resp.raise_for_status()

# Run a SQL query
resp = requests.post(f"{API_BASE}/api/v1/query/", headers=headers, json={
    "query": "SELECT * FROM users LIMIT 10"
})
print(resp.json())

# Ask a natural language question
resp = requests.post(f"{API_BASE}/api/v1/query/natural", headers=headers, json={
    "question": "How many users signed up this month?"
})
print(resp.json())
Node.js
const axios = require('axios');

const headers = {
  'Authorization': `Bearer ${process.env.DATATWIN_API_KEY}`,
  'X-App-Token': process.env.DATATWIN_APP_TOKEN,
  'Content-Type': 'application/json'
};

// Natural language query. `await` is not valid at the top level of a
// CommonJS module, so the call is wrapped in an async IIFE.
(async () => {
  const res = await axios.post(
    'https://datatwinai.com/api/v1/query/natural',
    { question: 'Show me total revenue by month' },
    { headers }
  );
  console.log(res.data);
})();
cURL
curl -X POST https://datatwinai.com/api/v1/query/natural \
-H "Authorization: Bearer $DATATWIN_API_KEY" \
-H "X-App-Token: $DATATWIN_APP_TOKEN" \
-H "Content-Type: application/json" \
-d '{"question": "How many active users do we have?"}'
Who generates the API Key and App Token?
DataTwin generates both. You copy them once from the dashboard and store them in your own backend environment variables. They are never stored in plain text on DataTwin's servers.
What happens if I don't send X-App-Token?
You will get a 403 Forbidden response. Org API keys only work when paired with a registered application token.
What happens if I exceed my rate limits?
You will get a 429 Too Many Requests response. Default limits are 120 requests/minute and 10,000 requests/day per application. You can adjust these in the Registered Applications settings.
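A backend that may hit these limits should retry 429 responses with exponential backoff rather than failing outright. A minimal sketch, where `send` is any zero-argument callable returning an object with a `status_code` attribute (for example a lambda wrapping `requests.post`):

```python
import time

def post_with_retry(send, max_retries: int = 3, base_delay: float = 1.0):
    """Retry a request on HTTP 429 with exponential backoff.

    Retries up to `max_retries` times, sleeping base_delay * 2**attempt
    between attempts, then returns the final response either way.
    """
    for attempt in range(max_retries):
        resp = send()
        if resp.status_code != 429:
            return resp
        time.sleep(base_delay * (2 ** attempt))
    return send()
```

If the API returns a `Retry-After` header, honoring it is better than a fixed schedule; the sketch above is the fallback when no such hint is available.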
Can I have multiple applications under one org?
Yes. Register a separate application for each backend service (e.g. one for your API, one for your analytics worker). Each gets its own token and rate limits.
I lost my App Token. What do I do?
Go to API Keys → Registered Applications, find your app, and click Rotate Token. A new token will be generated. Update your backend environment variable with the new token immediately.
Should I put the API key or App Token in frontend JavaScript?
No. These are backend credentials only. Never expose them in browser code. Use your own backend as a proxy.
What is the base URL?
https://datatwinai.com
Can DataTwin connect to databases worldwide?
Yes, as long as network reachability, credentials, and policy controls are configured.
Do I need to expose my database publicly?
No. Private connectivity is preferred for production.
Should I use admin credentials?
No. Use a dedicated read-only account with only required schema access.
What if my security team blocks direct access?
Use a bastion, VPN, or PrivateLink pattern and keep the DB private.
How do I reduce risk quickly?
Limit source IP, enforce TLS, rotate secrets, and enable query/audit logging.
- Intermittent connection drops after load: check NAT, idle timeouts, and connection pool limits.
- TLS handshake failures: validate certificate chain and hostname verification mode.
- High latency on private links: check cross-region routing and peering path.
- Authentication loops: verify auth plugin compatibility and user host restrictions.
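For the first bullet, intermittent drops behind NAT are often an idle-timeout problem, and enabling TCP keepalives on the client socket is a common mitigation. A sketch using only the standard library (the tuning constants are Linux-specific, so they are feature-guarded; exact values depend on your NAT timeout):

```python
import socket

def enable_keepalive(sock: socket.socket, idle: int = 60, interval: int = 10) -> None:
    """Turn on TCP keepalives so NAT/firewall idle timeouts do not silently
    drop long-lived DB connections. Tuning option names vary by platform."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # Linux-specific knobs; guarded because they are not portable
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, "TCP_KEEPINTVL"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
```

Most DB drivers and poolers expose equivalent settings (for example keepalive parameters in the connection options), which is usually the more practical place to set them.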