Automating Daily Supabase Database Backups with Node.js
Posted by Nuno Marques on 16 Feb 2025
Why Regular Database Backups Matter
Database backups are critical to maintaining data integrity and ensuring business continuity. Whether you're running a personal project or a large-scale application, data loss can occur due to accidental deletions, corruption, or security breaches. Having an automated backup system means you can recover data quickly without downtime or panic.
Benefits of Automated Backups:
- Protection Against Data Loss – Recover from accidental deletions or failures.
- Disaster Recovery – Ensure data integrity in case of system failures.
- Compliance and Security – Meet industry standards for data protection.
- Peace of Mind – Focus on development without worrying about losing valuable data.
In this guide, we'll implement a daily automated backup system for a Supabase database using Node.js and its built-in filesystem module (fs). Our script will:
- Export multiple tables from the database.
- Convert the data into CSV format.
- Save backups in a structured date-wise directory.
- Automate the process for long-term reliability.
I am implementing this backup system for my React app, Bokkah.com, to ensure data integrity and seamless recovery when needed.
Let's get started!
Setting Up the Backup Script
Prerequisites
Before running the script, make sure you have:
- A Supabase project set up with tables.
- Node.js installed on your system.
- Your Supabase credentials configured in a separate file (supabase.js); a minimal example follows below.
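The supabase.js config file only needs to create and export the client. Here's a minimal sketch, assuming the @supabase/supabase-js package and credentials supplied via environment variables (the variable names are placeholders):

// config/supabase.js — create and export the Supabase client
import { createClient } from "@supabase/supabase-js";

// SUPABASE_URL and SUPABASE_SERVICE_ROLE_KEY are assumed environment variables;
// use a key that has read access to every table you plan to back up.
export const supabase = createClient(
  process.env.SUPABASE_URL,
  process.env.SUPABASE_SERVICE_ROLE_KEY
);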
The Code Breakdown
Here’s the complete script for automating Supabase database backups:
import { supabase } from "./config/supabase.js";
import fs from "fs";
import path from "path";

const currentDate = new Date().toLocaleDateString("en-GB").replace(/\//g, "-");

// Create backup directory if it doesn’t exist
const backupDir = `./backups/supabase/${currentDate}`;
if (!fs.existsSync(backupDir)) {
  fs.mkdirSync(backupDir, { recursive: true });
}

async function exportTableToCSV(tableName) {
  const { data, error } = await supabase
    .from(tableName)
    .select('*');

  if (error) {
    console.error(`❌ Error fetching ${tableName}:`, error);
    return;
  }

  if (!data || data.length === 0) {
    console.log(`⚠️ No data found in ${tableName}`);
    return;
  }

  // Convert data to CSV format
  const csvContent = [
    Object.keys(data[0]).join(','), // Headers
    ...data.map(row => Object.values(row).map(value => `"${value}"`).join(',')) // Rows
  ].join('\n');

  // Save file in ./backups/supabase/{currentDate}/{tableName}.csv
  const filePath = path.join(backupDir, `${tableName}.csv`);
  fs.writeFileSync(filePath, csvContent);

  console.log(`✅ Exported ${tableName}.csv`);
}

// List of tables to export
const tables = [
  'recipes',
  'user_invites',
  'follows',
  'users',
  'notifications',
  'user_collections',
  'categories',
  'collection_recipes',
  'friends',
  'collections',
  'collection_invited_users'
];

tables.forEach(table => exportTableToCSV(table));
How the Script Works (Step by Step)
1. Create a Backup Directory by Date
We start by generating a folder for today’s backup using the date as the folder name:
const currentDate = new Date().toLocaleDateString("en-GB").replace(/\//g, "-");
const backupDir = `./backups/supabase/${currentDate}`;
This ensures each backup is stored in its own folder (e.g., ./backups/supabase/16-02-2025).
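A small side note: the en-GB format gives DD-MM-YYYY, so backup folders won't sort chronologically in a plain directory listing. If that matters to you, an ISO-style date is a drop-in alternative (purely optional):

// Optional alternative: YYYY-MM-DD folder names sort chronologically
const currentDate = new Date().toISOString().slice(0, 10); // e.g. "2025-02-16"
const backupDir = `./backups/supabase/${currentDate}`;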
2. Check If the Directory Exists
Before creating the backup folder, we check if it already exists. If not, we create it recursively:
if (!fs.existsSync(backupDir)) {
  fs.mkdirSync(backupDir, { recursive: true });
}
3. Export Data from Tables
We define an exportTableToCSV function that:
- Fetches all rows from a given table.
- Converts the data into CSV format.
- Saves it as a .csv file inside the backup folder.
const { data, error } = await supabase.from(tableName).select('*');
If no data exists, we log a warning:
if (!data || data.length === 0) {
  console.log(`⚠️ No data found in ${tableName}`);
  return;
}
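One caveat worth flagging: by default, a single select() against the Supabase API returns at most 1,000 rows, so very large tables can be silently truncated. If your tables might exceed that, paging through the rows with .range() is one way around it. A rough sketch (the page size is an assumption, adjust it to your project's settings):

// Fetch every row in pages to avoid the default per-request row limit (a sketch).
async function fetchAllRows(tableName, pageSize = 1000) {
  let allRows = [];
  let from = 0;

  while (true) {
    const { data, error } = await supabase
      .from(tableName)
      .select('*')
      .range(from, from + pageSize - 1); // range() bounds are inclusive

    if (error) throw error;
    allRows = allRows.concat(data);

    if (data.length < pageSize) break; // last page reached
    from += pageSize;
  }

  return allRows;
}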
4. Convert Data to CSV Format
We extract the column names for headers and map the rows:
const csvContent = [
  Object.keys(data[0]).join(','), // Headers
  ...data.map(row => Object.values(row).map(value => `"${value}"`).join(',')) // Rows
].join('\n');
This ensures the exported file follows the CSV format with comma-separated values.
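Be aware that this simple quoting breaks down if a value contains a double quote or a newline, and null values end up as the literal string "null". If your data might include those, a slightly more defensive field formatter helps; here is a sketch (not a full RFC 4180 implementation):

// Format one CSV field: empty for null/undefined, JSON for objects, escaped quotes otherwise
function toCsvField(value) {
  if (value === null || value === undefined) return '';
  const str = typeof value === 'object' ? JSON.stringify(value) : String(value);
  return `"${str.replace(/"/g, '""')}"`; // double any embedded quotes
}

// Then, in the row mapping:
// ...data.map(row => Object.values(row).map(toCsvField).join(','))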
5. Save the Backup File
The script writes the CSV data to a file inside the backupDir:
const filePath = path.join(backupDir, `${tableName}.csv`);
fs.writeFileSync(filePath, csvContent);
Once completed, the script logs a success message:
console.log(`✅ Exported ${tableName}.csv`);
6. Loop Through All Tables
The script loops through an array of table names and calls exportTableToCSV for each:
tables.forEach(table => exportTableToCSV(table));
This ensures every table gets backed up daily without manual intervention.
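One subtlety: forEach kicks off all the async exports without awaiting them, so they run concurrently and there is no single point where you know the whole backup has finished (useful if you later want to upload or prune afterwards). A small, optional variation runs them one at a time and logs a final summary:

// Run exports sequentially so failures are easier to trace, then log a summary.
async function runBackup() {
  for (const table of tables) {
    await exportTableToCSV(table);
  }
  console.log(`✅ Backup complete: ${tables.length} tables written to ${backupDir}`);
}

runBackup().catch(err => {
  console.error("❌ Backup failed:", err);
  process.exit(1);
});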
Best Practices for Backup Storage
Now that we have automated backups, here are some best practices to follow:
✅ Store Backups in a Secure Location
- Use cloud storage like AWS S3, Google Drive, or Dropbox for off-site backups (a small S3 upload sketch follows below).
- Restrict access to backup files to prevent unauthorized modifications.
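For example, pushing the day's folder to S3 could look roughly like this. The bucket name, region, and key prefix are placeholders, and it assumes the AWS SDK v3 (@aws-sdk/client-s3) is installed:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import fs from "fs";
import path from "path";

const s3 = new S3Client({ region: "eu-west-1" }); // placeholder region

async function uploadBackupFolder(localDir, bucket = "my-backups-bucket") {
  for (const file of fs.readdirSync(localDir)) {
    await s3.send(new PutObjectCommand({
      Bucket: bucket,
      Key: `supabase/${path.basename(localDir)}/${file}`, // e.g. supabase/16-02-2025/users.csv
      Body: fs.readFileSync(path.join(localDir, file)),
    }));
    console.log(`☁️ Uploaded ${file}`);
  }
}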
✅ Automate with a Scheduler
- Use cron jobs or Node.js schedulers (e.g., node-cron, sketched below) to run the script daily.
- Example (Linux/macOS cron job):
0 2 * * * node /path/to/backup-script.js
This runs the script daily at 2 AM.
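If you'd rather keep the scheduling inside Node itself, node-cron does the same job. A minimal sketch, assuming you wrap the export logic in a runBackup() function (as in the sequential variation above) and export it from the script:

import cron from "node-cron";
import { runBackup } from "./backup-script.js"; // hypothetical export of the backup logic

// Run the backup every day at 02:00, server local time.
cron.schedule("0 2 * * *", () => {
  runBackup().catch(err => console.error("❌ Scheduled backup failed:", err));
});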
✅ Encrypt Sensitive Data
- If your backups contain user data, encrypt them before storing them.
- Consider using GPG encryption:
gpg --encrypt --recipient your-email@example.com backup-file.csv
✅ Rotate Backups Regularly
- Retain backups for a limited time (e.g., keep the last 30 days of backups).
- Automate old backup deletion to save storage; a small pruning sketch follows below.
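A pruning step can be as simple as deleting dated folders past the retention window. Here is a sketch that assumes the DD-MM-YYYY folder names produced by the script above and a 30-day retention:

import fs from "fs";
import path from "path";

// Delete dated backup folders older than keepDays (assumes DD-MM-YYYY folder names).
function pruneOldBackups(rootDir = "./backups/supabase", keepDays = 30) {
  const cutoff = Date.now() - keepDays * 24 * 60 * 60 * 1000;

  for (const entry of fs.readdirSync(rootDir)) {
    const [day, month, year] = entry.split("-").map(Number);
    const folderDate = new Date(year, month - 1, day);

    if (!Number.isNaN(folderDate.getTime()) && folderDate.getTime() < cutoff) {
      fs.rmSync(path.join(rootDir, entry), { recursive: true, force: true });
      console.log(`🗑️ Removed old backup ${entry}`);
    }
  }
}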
Conclusion
Automating Supabase database backups with Node.js ensures data safety without manual effort. By using this script, you:
✅ Protect your data from loss.
✅ Automate exports efficiently.
✅ Organize backups by date for easy retrieval.
✅ Follow best practices for secure and reliable storage.
Set up a cron job, sync to the cloud, and you’ve got a bulletproof backup system!