Migrating the Domestic Abuse Services Mapped database to PostgreSQL
Recently, I migrated the database for Domestic Abuse Services Mapped from Airtable to a PostgreSQL database hosted on Render. The primary motivation was that I was consistently exceeding the API call limits of Airtable's free plan. Switching to PostgreSQL also offered additional benefits, such as reducing reliance on external services, since I already use Render to host the application. As the site continues to grow and needs to handle higher traffic, PostgreSQL provides greater power, flexibility, and scalability for future development.
Setting Up a PostgreSQL Database
My first step was to set up a local PostgreSQL database. I installed PostgreSQL using Homebrew:
brew install postgresql
Next, I started the PostgreSQL service:
brew services start postgresql
To access the service, I used:
psql postgres
Inside the PostgreSQL shell, I created a new database called services with:
CREATE DATABASE services;
I then defined the table, columns, and data types to mirror the structure of my original Airtable database.
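Mirroring that structure looked roughly like the schema below. This is a sketch rather than the exact DDL: the column names are inferred from how the application code later in this post reads each row, and the `id` primary key, the array types, and storing `lat`/`lng` as text are assumptions.

```sql
CREATE TABLE services (
  id SERIAL PRIMARY KEY,     -- assumed surrogate key
  name TEXT,
  description TEXT,
  address TEXT,
  postcode TEXT,
  email TEXT,
  website TEXT,
  phone TEXT,
  donate TEXT,
  service_type TEXT[],       -- the app reads these two as arrays
  service_specialism TEXT[],
  approved BOOLEAN,
  local_authority TEXT,
  lat TEXT,                  -- parsed with parseFloat in the app,
  lng TEXT                   -- so text columns are assumed here
);
```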
Hosting on Render
The next step was to create a new PostgreSQL instance on Render and connect to it remotely from my Mac's terminal using the provided connection string. I then created the same table, columns, and data types to replicate my local database.
After this I:
- Exported the data from Airtable as a CSV file using their export feature
- Converted the CSV file into SQL INSERT statements
- Checked the data for duplicates and verified that all data types matched my new database schema
- Used PostgreSQL’s \copy command to bulk import the data into both my local and remote databases
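The CSV-to-INSERT conversion step can be sketched as a small helper like the one below. The `services` table and column names match the rest of this post, but `toInsert` itself is a hypothetical function written for illustration, not part of any library:

```typescript
type ServiceRow = Record<string, string>;

// Escape single quotes so a value is safe inside a SQL string literal.
function sqlLiteral(value: string): string {
  return `'${value.replace(/'/g, "''")}'`;
}

// Build one INSERT statement from a parsed CSV row.
function toInsert(table: string, row: ServiceRow): string {
  const columns = Object.keys(row);
  const values = columns.map((c) => sqlLiteral(row[c]));
  return `INSERT INTO ${table} (${columns.join(', ')}) VALUES (${values.join(', ')});`;
}

// Example:
// toInsert('services', { name: "St Mary's Refuge", postcode: 'AB1 2CD' })
// → INSERT INTO services (name, postcode) VALUES ('St Mary''s Refuge', 'AB1 2CD');
```

For a one-off migration like this, generated INSERT statements are easy to eyeball for duplicates and type mismatches before running them, while \copy handles the actual bulk load.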
With the data now in PostgreSQL, I updated my API logic to use the node-postgres package and fetch data directly from the new database. For example, I created this utility function:
import { Client } from 'pg';

// Fetches approved services and maps each row to a GeoJSON Feature.
export async function getServicesFromDb() {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  try {
    // Only services that have been approved are shown publicly.
    const result = await client.query(
      'SELECT * FROM services WHERE approved = true'
    );
    return result.rows.map((row) => ({
      type: 'Feature' as const,
      properties: {
        name: row.name || '',
        description: row.description || '',
        address: row.address || '',
        postcode: row.postcode || '',
        email: row.email || '',
        website: row.website || '',
        phone: row.phone || '',
        donate: row.donate || '',
        serviceType: row.service_type || [],
        serviceSpecialism: row.service_specialism || [],
        approved: row.approved,
        localAuthority: row.local_authority || '',
      },
      geometry: {
        type: 'Point' as const,
        // GeoJSON coordinates are [longitude, latitude].
        coordinates: [parseFloat(row.lng || '0'), parseFloat(row.lat || '0')],
      },
    }));
  } finally {
    // Always close the connection, even if the query throws.
    await client.end();
  }
}
This allowed me to fetch and serve data from PostgreSQL, replacing the old Airtable logic. I connected my application code to the database using environment variables, for example:
const client = new Client({ connectionString: process.env.DATABASE_URL });
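For the remote database, one detail worth noting: Render's managed PostgreSQL requires TLS for external connections, which node-postgres expresses via the `ssl` option. The config shape below is a sketch; check it against your own Render connection settings:

```typescript
// Hypothetical connection config for node-postgres.
const clientConfig = {
  connectionString: process.env.DATABASE_URL,
  // rejectUnauthorized: false accepts the provider's certificate chain
  // without a local CA bundle; tighten this if you pin certificates.
  ssl: { rejectUnauthorized: false },
};
```

Keeping the connection string in an environment variable means the same code runs unchanged against the local, staging, and production databases.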
Finishing Steps
As the API is public, I updated my Swagger documentation to reflect the new PostgreSQL data source. This helps me keep track of the API structure and endpoints, and makes it easier for future collaborators to understand and use the API.
One of the major advantages of Airtable was its user interface. To replicate this, I set up connections to both my local and remote databases in the GUI application DataGrip, which makes it easier to inspect records and run queries.
Since this was a significant refactor, I spun up a staging environment on Render to test the changes. Once I was confident that everything was working as expected, I pushed the updates to the production environment.
I’m satisfied that the site now benefits from a more robust, scalable, and maintainable backend powered by PostgreSQL.