$ cat /posts/supabase-cron-jobs-scheduled-tasks-and-automation.md
[tags]Supabase

Supabase Cron Jobs: Scheduled Tasks and Automation

2026-01-26 · 5 min read · 0 views

Implementing scheduled tasks with pg_cron enables automated database operations, recurring maintenance, periodic data aggregation, scheduled reports, and time-based workflows to run directly in PostgreSQL without an external scheduler. Unlike external cron services, which require separate infrastructure and API authentication, pg_cron runs inside the database, giving jobs direct data access, simpler deployment, and centralized task management. This guide covers installing and configuring the pg_cron extension, creating scheduled jobs with cron syntax, implementing cleanup tasks that remove old data, scheduling aggregate computations, sending scheduled notifications and reports, monitoring job execution and handling failures, and managing jobs over their lifecycle. Scheduled tasks become essential when cleaning expired sessions, generating daily reports, archiving old records, calculating analytics metrics, sending periodic emails, or automating recurring business processes. Before proceeding, you should be comfortable with database basics, triggers and functions, and migrations.

Installing pg_cron Extension

pg_cron_setup.sql
-- Enable pg_cron extension
-- Run in SQL Editor (Database > SQL Editor)
create extension if not exists pg_cron;

-- Grant permissions to schedule jobs
grant usage on schema cron to postgres;
grant all privileges on all tables in schema cron to postgres;

-- Verify installation
select * from cron.job;

-- Check pg_cron version
select extversion from pg_extension where extname = 'pg_cron';

-- View cron configuration
show cron.database_name;
show cron.log_run;

-- Enable job logging
alter system set cron.log_run = on;

-- Set timezone for cron jobs
alter database postgres set timezone to 'UTC';

-- Example: Schedule a simple job
select cron.schedule(
  'test-job',
  '* * * * *',  -- Every minute
  $$select now()$$
);

-- View scheduled jobs
select 
  jobid,
  schedule,
  command,
  nodename,
  nodeport,
  database,
  username
from cron.job
order by jobid;
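
By default, jobs run in the database named by `cron.database_name`. On pg_cron 1.4 and newer, `cron.schedule_in_database` lets a job target another database; the job name and the `analytics` database below are illustrative:

```sql
-- pg_cron 1.4+: schedule a job to run in a different database
select cron.schedule_in_database(
  'nightly-vacuum-analytics',   -- job name (illustrative)
  '0 1 * * *',                  -- daily at 1 AM
  $$vacuum analyze$$,
  'analytics'                   -- target database (assumption)
);
```

If this function is missing, your pg_cron version predates 1.4 and jobs can only run in the configured database.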

Cron Syntax Reference

Field         Allowed Values     Special Characters
Minute        0-59               * , - /
Hour          0-23               * , - /
Day of Month  1-31               * , - /
Month         1-12               * , - /
Day of Week   0-6 (0 = Sunday)   * , - /
cron_patterns.sql
-- Common Cron Patterns

-- Every minute
'* * * * *'

-- Every 5 minutes
'*/5 * * * *'

-- Every hour at minute 0
'0 * * * *'

-- Every day at 2:30 AM
'30 2 * * *'

-- Every Monday at 9:00 AM
'0 9 * * 1'

-- First day of every month at midnight
'0 0 1 * *'

-- Every weekday at 6:00 PM
'0 18 * * 1-5'

-- Every 15 minutes between 9 AM and 5 PM
'*/15 9-17 * * *'

-- Twice a day (6 AM and 6 PM)
'0 6,18 * * *'

-- Last day of every month (no native cron syntax; workaround)
-- Run daily and only proceed when tomorrow falls in a new month
'0 0 * * *' -- Guard: SELECT date_trunc('month', now() + interval '1 day') <> date_trunc('month', now())
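
Putting that workaround together, a daily job can carry the guard in its own WHERE clause, so the body only fires on the month's last day. The `monthly_snapshots` table and its columns here are illustrative, not part of any schema above:

```sql
-- Runs daily at midnight; the SELECT returns a row (and inserts)
-- only when tomorrow belongs to a different month
select cron.schedule(
  'month-end-snapshot',
  '0 0 * * *',
  $$
  insert into monthly_snapshots (snapshot_date, total_orders)
  select current_date, (select count(*) from orders)
  where date_trunc('month', now() + interval '1 day') <> date_trunc('month', now());
  $$
);
```

Because the guard lives in the WHERE clause of a non-aggregating SELECT, nothing is inserted on other days.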

Database Cleanup Tasks

cleanup_jobs.sql
-- Delete expired sessions daily at 2 AM
select cron.schedule(
  'cleanup-expired-sessions',
  '0 2 * * *',
  $$
  delete from auth.sessions
  where expires_at < now();
  $$
);

-- Archive old logs weekly on Sunday at 3 AM
select cron.schedule(
  'archive-old-logs',
  '0 3 * * 0',
  $$
  insert into logs_archive
  select * from logs
  where created_at < now() - interval '90 days';
  
  delete from logs
  where created_at < now() - interval '90 days';
  $$
);

-- Remove unverified users after 7 days
select cron.schedule(
  'cleanup-unverified-users',
  '0 4 * * *',
  $$
  delete from auth.users
  where email_confirmed_at is null
    and created_at < now() - interval '7 days';
  $$
);

-- Vacuum database weekly
select cron.schedule(
  'vacuum-database',
  '0 1 * * 0',
  $$vacuum analyze$$
);

-- Clean up expired file upload tokens
-- Note: deleting rows from storage.objects removes only the metadata;
-- use the Storage API if the underlying files must be removed too.
select cron.schedule(
  'cleanup-upload-tokens',
  '*/30 * * * *',  -- Every 30 minutes
  $$
  delete from storage.objects
  where bucket_id = 'temp-uploads'
    and created_at < now() - interval '1 hour';
  $$
);

-- Function for comprehensive cleanup
create or replace function cleanup_old_data()
returns void as $$
declare
  deleted_count int;
begin
  -- Delete old notifications
  delete from notifications
  where created_at < now() - interval '30 days'
    and read = true;
  get diagnostics deleted_count = row_count;
  raise notice 'Deleted % old notifications', deleted_count;
  
  -- Delete abandoned carts
  delete from shopping_carts
  where updated_at < now() - interval '7 days'
    and status = 'abandoned';
  get diagnostics deleted_count = row_count;
  raise notice 'Deleted % abandoned carts', deleted_count;
  
  -- Archive completed orders
  insert into orders_archive
  select * from orders
  where status = 'completed'
    and completed_at < now() - interval '1 year';
  
  delete from orders
  where status = 'completed'
    and completed_at < now() - interval '1 year';
  get diagnostics deleted_count = row_count;
  raise notice 'Archived % old orders', deleted_count;
end;
$$ language plpgsql;

-- Schedule comprehensive cleanup
select cron.schedule(
  'daily-cleanup',
  '0 3 * * *',
  $$select cleanup_old_data()$$
);
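
On very large tables, the single DELETE statements above can hold locks and generate heavy WAL traffic in one transaction-sized bite. A batched variant (a sketch, reusing the same `notifications` table and retention rule) deletes in chunks until nothing remains:

```sql
-- Batched cleanup: delete old, read notifications in chunks
create or replace function cleanup_notifications_batched(batch_size int default 5000)
returns void as $$
declare
  rows_deleted int;
begin
  loop
    -- Delete one batch, selecting ids to bound the work per iteration
    delete from notifications
    where id in (
      select id from notifications
      where created_at < now() - interval '30 days'
        and read = true
      limit batch_size
    );
    get diagnostics rows_deleted = row_count;
    exit when rows_deleted = 0;  -- Stop once no matching rows are left
  end loop;
end;
$$ language plpgsql;
```

Schedule it like any other function; tune `batch_size` to what your table and indexes handle comfortably.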

Data Aggregation and Reports

aggregation_jobs.sql
-- Calculate yesterday's metrics at midnight
select cron.schedule(
  'calculate-daily-metrics',
  '0 0 * * *',
  $$
  insert into daily_metrics (date, total_users, active_users, revenue)
  select
    current_date - 1,
    (select count(*) from users),
    (select count(distinct user_id) from user_activity where created_at::date = current_date - 1),
    (select coalesce(sum(amount), 0) from orders where created_at::date = current_date - 1);
  $$
);

-- Update user engagement scores hourly
select cron.schedule(
  'update-engagement-scores',
  '0 * * * *',
  $$
  update users u
  set engagement_score = (
    select
      count(distinct date(created_at)) * 10 +  -- Daily active days
      count(*) * 2 +                             -- Total actions
      coalesce(sum(case when action_type = 'purchase' then 50 else 0 end), 0)
    from user_activity
    where user_id = u.id
      and created_at > now() - interval '30 days'
  )
  where last_active_at > now() - interval '30 days';
  $$
);

-- Generate monthly reports on first day of month
select cron.schedule(
  'generate-monthly-report',
  '0 1 1 * *',
  $$
  insert into monthly_reports (month, data)
  select
    date_trunc('month', current_date - interval '1 month'),
    jsonb_build_object(
      'total_revenue', (select sum(amount) from orders where date_trunc('month', created_at) = date_trunc('month', current_date - interval '1 month')),
      'new_users', (select count(*) from users where date_trunc('month', created_at) = date_trunc('month', current_date - interval '1 month')),
      'active_users', (select count(distinct user_id) from user_activity where date_trunc('month', created_at) = date_trunc('month', current_date - interval '1 month'))
    );
  $$
);

-- Materialized view refresh
create materialized view top_products as
select
  p.id,
  p.name,
  count(oi.id) as order_count,
  sum(oi.quantity) as total_quantity,
  sum(oi.price * oi.quantity) as total_revenue
from products p
join order_items oi on oi.product_id = p.id
where oi.created_at > now() - interval '30 days'
group by p.id, p.name
order by total_revenue desc
limit 100;

-- Refresh every 6 hours
select cron.schedule(
  'refresh-top-products',
  '0 */6 * * *',
  $$refresh materialized view top_products$$
);
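
Plain `refresh materialized view` takes an exclusive lock, blocking readers for the duration. With a unique index in place, `refresh ... concurrently` lets reads continue during the refresh; a sketch against the `top_products` view defined above:

```sql
-- CONCURRENTLY requires a unique index on the materialized view
create unique index top_products_id_idx on top_products (id);

-- Refresh without blocking concurrent reads
select cron.schedule(
  'refresh-top-products-concurrent',
  '0 */6 * * *',
  $$refresh materialized view concurrently top_products$$
);
```

The trade-off: a concurrent refresh is slower and needs the unique index, but readers never see the view locked.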

-- Update search rankings
select cron.schedule(
  'update-search-rankings',
  '0 2 * * *',
  $$
  update products
  set search_rank = (
    coalesce((select count(*) from order_items where product_id = products.id and created_at > now() - interval '7 days'), 0) * 10 +
    coalesce((select count(*) from product_views where product_id = products.id and created_at > now() - interval '7 days'), 0) * 1 +
    coalesce((select avg(rating) from reviews where product_id = products.id), 0) * 20
  );
  $$
);

Scheduled Notifications

notification_jobs.sql
-- Send daily digest emails at 9 AM
select cron.schedule(
  'send-daily-digest',
  '0 9 * * *',
  $$
  select net.http_post(
    url := 'https://your-project.supabase.co/functions/v1/send-digest',
    headers := jsonb_build_object('Authorization', 'Bearer ' || current_setting('app.service_role_key')),
    body := jsonb_build_object('type', 'daily_digest')
  );
  $$
);

-- Remind users about incomplete actions
select cron.schedule(
  'send-cart-reminders',
  '0 10 * * *',
  $$
  insert into notifications (user_id, type, message)
  select
    user_id,
    'cart_reminder',
    'You have items waiting in your cart!'
  from shopping_carts
  where status = 'active'
    and updated_at < now() - interval '24 hours'
    and updated_at > now() - interval '48 hours';
  $$
);

-- Send subscription expiry warnings
select cron.schedule(
  'subscription-expiry-warnings',
  '0 8 * * *',
  $$
  with expiring_soon as (
    select user_id, expires_at
    from subscriptions
    where expires_at between now() and now() + interval '7 days'
      and status = 'active'
  )
  insert into notifications (user_id, type, message, data)
  select
    user_id,
    'subscription_expiring',
    'Your subscription expires in ' || extract(day from expires_at - now()) || ' days',
    jsonb_build_object('expires_at', expires_at)
  from expiring_soon;
  $$
);

-- Weekly engagement email
select cron.schedule(
  'weekly-engagement-email',
  '0 10 * * 1',  -- Monday at 10 AM
  $$
  select net.http_post(
    url := 'https://your-project.supabase.co/functions/v1/send-weekly-summary',
    headers := jsonb_build_object(
      'Authorization', 'Bearer ' || current_setting('app.service_role_key'),
      'Content-Type', 'application/json'
    ),
    body := jsonb_build_object(
      'users', (
        select jsonb_agg(jsonb_build_object('id', id, 'email', email))
        from users
        where email_preferences->>'weekly_summary' = 'true'
      )
    )
  );
  $$
);

-- Birthday wishes
select cron.schedule(
  'send-birthday-wishes',
  '0 9 * * *',
  $$
  insert into notifications (user_id, type, message)
  select
    id,
    'birthday',
    'Happy Birthday! 🎉 Here''s a special gift for you!'
  from users
  where date_part('month', birth_date) = date_part('month', current_date)
    and date_part('day', birth_date) = date_part('day', current_date);
  $$
);

Job Monitoring and Logging

job_monitoring.sql
-- View job execution history
select
  job.jobid,
  job.schedule,
  job.command,
  run.status,
  run.return_message,
  run.start_time,
  run.end_time,
  extract(epoch from (run.end_time - run.start_time)) as duration_seconds
from cron.job job
left join cron.job_run_details run on run.jobid = job.jobid
order by run.start_time desc
limit 50;

-- Check failed jobs
select
  job.jobid,
  job.schedule,
  job.command,
  run.return_message,
  run.start_time
from cron.job job
join cron.job_run_details run on run.jobid = job.jobid
where run.status = 'failed'
order by run.start_time desc;

-- Create custom logging table
create table cron_job_logs (
  id uuid default gen_random_uuid() primary key,
  job_name text not null,
  status text not null,
  message text,
  duration_ms int,
  created_at timestamp with time zone default now()
);

-- Function with logging
create or replace function cleanup_with_logging()
returns void as $$
declare
  start_time timestamp;
  deleted_count int;
  duration_ms int;
begin
  start_time := clock_timestamp();
  
  begin
    delete from expired_data where expires_at < now();
    get diagnostics deleted_count = row_count;
    
    duration_ms := extract(epoch from (clock_timestamp() - start_time)) * 1000;
    
    insert into cron_job_logs (job_name, status, message, duration_ms)
    values ('cleanup_with_logging', 'success', format('Deleted %s records', deleted_count), duration_ms);
  exception when others then
    insert into cron_job_logs (job_name, status, message)
    values ('cleanup_with_logging', 'failed', sqlerrm);
    raise;
  end;
end;
$$ language plpgsql;

-- Alert on job failures
create or replace function check_cron_failures()
returns void as $$
declare
  failure_count int;
begin
  select count(*)
  into failure_count
  from cron.job_run_details
  where status = 'failed'
    and start_time > now() - interval '1 hour';
  
  if failure_count > 5 then
    perform net.http_post(
      url := 'https://hooks.slack.com/services/YOUR/WEBHOOK/URL',
      headers := jsonb_build_object('Content-Type', 'application/json'),
      body := jsonb_build_object(
        'text', format('⚠️ %s cron jobs failed in the last hour', failure_count)
      )
    );
  end if;
end;
$$ language plpgsql;

-- Schedule failure monitoring
select cron.schedule(
  'monitor-cron-failures',
  '*/30 * * * *',
  $$select check_cron_failures()$$
);

-- View job statistics
select
  job.command,
  count(*) as total_runs,
  count(*) filter (where run.status = 'succeeded') as successful,
  count(*) filter (where run.status = 'failed') as failed,
  avg(extract(epoch from (run.end_time - run.start_time))) as avg_duration_seconds
from cron.job job
left join cron.job_run_details run on run.jobid = job.jobid
where run.start_time > now() - interval '7 days'
group by job.jobid, job.command;

Managing Cron Jobs

job_management.sql
-- List all scheduled jobs
select * from cron.job;

-- Unschedule a job by name
select cron.unschedule('job-name');

-- Unschedule a job by ID
select cron.unschedule(123);

-- Update job schedule
select cron.alter_job(
  job_id := 123,
  schedule := '0 3 * * *'  -- Change to 3 AM daily
);

-- Disable a job temporarily
update cron.job
set active = false
where jobname = 'expensive-job';

-- Re-enable a job
update cron.job
set active = true
where jobname = 'expensive-job';

-- Run a job's logic immediately (testing)
-- pg_cron has no "run now" command; either execute the job's command
-- directly in the SQL editor, or schedule it every minute and
-- unschedule it after the first run:
select cron.schedule(
  'test-run',
  '* * * * *',
  $$select now()$$
);
-- Wait for execution, then unschedule
select cron.unschedule('test-run');

-- Clear old job run history
delete from cron.job_run_details
where start_time < now() - interval '30 days';
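
To keep `cron.job_run_details` from growing without bound, the history purge above can itself be scheduled as a cron job:

```sql
-- Weekly purge of run history older than 30 days
select cron.schedule(
  'purge-cron-history',
  '0 5 * * 0',  -- Sunday at 5 AM
  $$
  delete from cron.job_run_details
  where start_time < now() - interval '30 days';
  $$
);
```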

-- Export job definitions
select
  jobname,
  schedule,
  command
from cron.job
order by jobname;

-- Backup and restore jobs
-- Note: server-side COPY TO writes to the database server's filesystem
-- and needs elevated privileges; on hosted Supabase, run the SELECT and
-- save the output manually, or use psql's \copy instead.
copy (
  select format(
    'select cron.schedule(%L, %L, %L);',
    jobname,
    schedule,
    command
  )
  from cron.job
) to '/tmp/cron_jobs_backup.sql';

-- Import from backup (psql)
\i /tmp/cron_jobs_backup.sql

Cron Job Best Practices

  • Use Descriptive Names: name jobs so their purpose and frequency are obvious at a glance
  • Implement Logging: track execution time, records processed, and errors in a custom log table
  • Handle Errors Gracefully: wrap job logic in exception handlers so a failure is recorded, not lost
  • Avoid Overlapping Runs: use advisory locks to prevent concurrent execution of the same job
  • Optimize Long Jobs: break large operations into incremental batches
  • Monitor Performance: track job duration and alert on unusually long executions
  • Test Thoroughly: run jobs manually before scheduling them in production
Critical: Cron jobs run with elevated privileges. Validate data and handle errors carefully, avoid scheduling resource-intensive jobs during peak hours, use advisory locks to prevent concurrent runs, and monitor execution time to catch runaway jobs. Review security practices.
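
One way to enforce the "avoid overlapping runs" rule is a session advisory lock: if the previous run still holds the lock, the new run simply skips. A sketch, reusing the `expired_data` table from the logging example; the lock key 42 is an arbitrary application-chosen constant:

```sql
create or replace function cleanup_with_lock()
returns void as $$
begin
  -- Try to take the lock; bail out if a previous run still holds it
  if not pg_try_advisory_lock(42) then
    raise notice 'Previous run still active, skipping';
    return;
  end if;
  begin
    delete from expired_data where expires_at < now();
  exception when others then
    perform pg_advisory_unlock(42);  -- Always release before re-raising
    raise;
  end;
  perform pg_advisory_unlock(42);
end;
$$ language plpgsql;
```

Session advisory locks are released automatically if the backend dies, so a crashed run cannot wedge the job permanently.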

Common Cron Issues

  • Job Not Running: confirm the job exists in cron.job, verify the schedule syntax, and check the database timezone
  • Job Failing Silently: enable cron logging, check cron.job_run_details for error messages, and add custom logging
  • Performance Issues: add indexes, batch large operations, and run heavy jobs during off-peak hours
  • Overlapping Executions: the previous run is still executing; add advisory locks or lengthen the interval

Next Steps

  1. Build Functions: Create database functions for complex logic
  2. Implement Webhooks: Trigger external services from jobs
  3. Monitor Performance: Track with analytics
  4. Optimize Queries: Apply performance techniques

Conclusion

Implementing scheduled tasks with pg_cron brings automation directly into PostgreSQL: recurring maintenance, data aggregation, scheduled reports, and time-based workflows run without an external scheduler. Install the pg_cron extension, create jobs with standard cron syntax, implement cleanup tasks that remove expired sessions, old logs, and unverified users, automate aggregation that calculates daily metrics, updates engagement scores, and refreshes materialized views, schedule notifications such as daily digests, cart reminders, and subscription warnings, monitor execution by tracking success rates, duration, and failures, and manage jobs by updating schedules, disabling them temporarily, or unscheduling them when no longer needed. Running inside the database brings direct data access without API overhead, simplified deployment, centralized management through SQL, and precise timing with standard cron syntax. Use descriptive job names, log execution and errors, handle errors gracefully, prevent overlapping runs with advisory locks, batch long-running operations, monitor duration and set alerts, and test jobs manually before scheduling them in production. Scheduled tasks become essential when cleaning expired sessions, generating daily reports, archiving old records, calculating analytics metrics, sending periodic emails, or automating recurring business processes. Continue with functions, webhooks, and query optimization.
