
Django on AWS: Deploying to Elastic Beanstalk and RDS

2026-01-23 · 5 min read

Amazon Web Services provides comprehensive cloud infrastructure for Django applications: Elastic Beanstalk simplifies deployment management, while RDS offers managed PostgreSQL databases that eliminate server maintenance overhead, letting developers focus on application code. Elastic Beanstalk automates capacity provisioning, load balancing, auto-scaling, and health monitoring, deploying Django applications through a simple command-line interface without managing the underlying EC2 instances directly. RDS handles backups, software patching, automatic failure detection, and recovery, and its multi-AZ deployments provide high availability. Without managed services, developers must manually configure EC2 instances, set up load balancers, manage database servers, implement backup strategies, and monitor infrastructure, consuming time better spent building features. AWS deployment also integrates with S3 for static file storage, CloudFront for content delivery, and Route 53 for DNS management, forming a complete production stack.

This guide covers Elastic Beanstalk architecture and benefits, installing and configuring the EB CLI, preparing Django applications for Elastic Beanstalk deployment, setting up RDS PostgreSQL with proper security groups, configuring environment variables and Django settings for AWS, serving static files through S3 and CloudFront, implementing auto-scaling and load balancing, monitoring application health and logs through CloudWatch, and best practices for production AWS deployments, from initial launch through scaling to millions of users.

Elastic Beanstalk Fundamentals

Elastic Beanstalk provides a Platform as a Service (PaaS) layer over AWS infrastructure, automating deployment, scaling, and monitoring while leaving full control over the underlying resources. Applications deploy as environments: EC2 instances running web servers, an Elastic Load Balancer distributing traffic, and Auto Scaling adjusting capacity based on demand. Environments support multiple platforms, including Python for Django, and instances can be customized through configuration files.

eb_setup.sh
# Install EB CLI (Elastic Beanstalk Command Line Interface)
pip install awsebcli

# Verify installation
eb --version

# Configure AWS credentials
# Option 1: AWS CLI configuration
aws configure
# Enter: AWS Access Key ID, Secret Access Key, Region (e.g., us-east-1)

# Option 2: Set environment variables
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key
export AWS_DEFAULT_REGION=us-east-1

# Initialize Elastic Beanstalk application
cd /path/to/django/project
eb init

# Select region
# Choose platform: Python 3.11
# Choose application name
# Set up SSH for instances (optional)

# Create .ebignore file (similar to .gitignore)
# .ebignore
*.pyc
__pycache__/
.env
*.sqlite3
db.sqlite3
venv/
.git/
.DS_Store
staticfiles/
media/

# Create requirements.txt
pip freeze > requirements.txt

# Ensure required packages
# requirements.txt
Django==6.0.0
psycopg2-binary==2.9.9
gunicorn==21.2.0
django-storages==1.14.2
boto3==1.34.0
python-decouple==3.8

# Create Elastic Beanstalk configuration
# .ebextensions/01_django.config
option_settings:
  aws:elasticbeanstalk:application:environment:
    DJANGO_SETTINGS_MODULE: "myproject.settings.production"
    PYTHONPATH: "/var/app/current:$PYTHONPATH"
  aws:elasticbeanstalk:container:python:
    WSGIPath: "myproject.wsgi:application"

# Django collectstatic during deployment
# .ebextensions/02_python.config
container_commands:
  01_migrate:
    command: "source /var/app/venv/*/bin/activate && python manage.py migrate --noinput"
    leader_only: true
  02_collectstatic:
    command: "source /var/app/venv/*/bin/activate && python manage.py collectstatic --noinput"
    leader_only: true
  03_createsu:
    # 'createsu' is a custom management command (not built into Django);
    # define it in the project, e.g. to create a superuser from env vars idempotently
    command: "source /var/app/venv/*/bin/activate && python manage.py createsu"
    leader_only: true

# Create environment
eb create production-env
# Choose load balancer type: application
# Choose instance type: t3.small or larger

# Deploy application
eb deploy

# Open application in browser
eb open

# Check status
eb status

# View logs
eb logs

# SSH into instance
eb ssh
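
On the Amazon Linux 2 and 2023 Python platforms, Elastic Beanstalk looks for a Procfile at the project root to define the web process; without one it falls back to a default Gunicorn invocation. An explicit Procfile (the worker/thread counts below are illustrative, assuming the myproject module from the configuration above) keeps the server command under version control:

```
# Procfile (project root)
web: gunicorn --bind :8000 --workers 3 --threads 2 myproject.wsgi:application
```

The nginx proxy on the instance forwards traffic to port 8000, so the bind address should stay as shown.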

RDS PostgreSQL Configuration

RDS provides managed PostgreSQL with automated backups, software patching, and multi-AZ deployments for high availability. Creating an instance through the AWS Console or CLI yields a database endpoint for Django to connect to, with security groups controlling which resources may reach it.

rds_configuration.py
# Create RDS PostgreSQL instance (AWS Console or CLI)

# Using AWS CLI
aws rds create-db-instance \
    --db-instance-identifier myproject-db \
    --db-instance-class db.t3.micro \
    --engine postgres \
    --engine-version 15.4 \
    --master-username dbadmin \
    --master-user-password SecurePassword123! \
    --allocated-storage 20 \
    --vpc-security-group-ids sg-xxxxxxxxx \
    --db-name myproject \
    --backup-retention-period 7 \
    --multi-az \
    --no-publicly-accessible

# Get RDS endpoint
aws rds describe-db-instances \
    --db-instance-identifier myproject-db \
    --query 'DBInstances[0].Endpoint.Address'

# Configure security group to allow EB instances
# Add inbound rule: PostgreSQL (5432) from EB security group

# Set environment variables in Elastic Beanstalk
eb setenv \
    RDS_HOSTNAME=myproject-db.xxxxxx.us-east-1.rds.amazonaws.com \
    RDS_PORT=5432 \
    RDS_DB_NAME=myproject \
    RDS_USERNAME=dbadmin \
    RDS_PASSWORD=SecurePassword123! \
    SECRET_KEY=your-secret-key-here \
    DEBUG=False

# Django settings for RDS
# settings/production.py

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('RDS_DB_NAME', 'myproject'),
        'USER': os.environ.get('RDS_USERNAME', 'dbadmin'),
        'PASSWORD': os.environ.get('RDS_PASSWORD'),
        'HOST': os.environ.get('RDS_HOSTNAME', 'localhost'),
        'PORT': os.environ.get('RDS_PORT', '5432'),
        'CONN_MAX_AGE': 600,
        'OPTIONS': {
            'connect_timeout': 10,
        },
    }
}

# Security settings
SECRET_KEY = os.environ.get('SECRET_KEY')
DEBUG = os.environ.get('DEBUG', 'False') == 'True'
ALLOWED_HOSTS = [
    os.environ.get('EB_HOSTNAME', ''),
    '.elasticbeanstalk.com',
    '.amazonaws.com',
]

# RDS connection test
# management/commands/test_db.py
from django.core.management.base import BaseCommand
from django.db import connection

class Command(BaseCommand):
    help = 'Test database connection'
    
    def handle(self, *args, **options):
        try:
            with connection.cursor() as cursor:
                cursor.execute("SELECT version();")
                version = cursor.fetchone()[0]
                self.stdout.write(
                    self.style.SUCCESS(f'Database connected: {version}')
                )
        except Exception as e:
            self.stdout.write(
                self.style.ERROR(f'Database error: {e}')
            )
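
The ALLOWED_HOSTS list in the settings above hardcodes AWS domains. A small helper (hypothetical, reading a comma-separated ALLOWED_HOSTS environment variable set via eb setenv) keeps host configuration per environment:

```python
import os

def get_allowed_hosts(default='.elasticbeanstalk.com'):
    """Split a comma-separated ALLOWED_HOSTS env var into a clean list."""
    raw = os.environ.get('ALLOWED_HOSTS', default)
    return [host.strip() for host in raw.split(',') if host.strip()]

# settings/production.py
# ALLOWED_HOSTS = get_allowed_hosts()
```

Set it once per environment, e.g. `eb setenv ALLOWED_HOSTS=www.example.com,.elasticbeanstalk.com`.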

S3 Static Files and Media Storage

S3 stores static files and user uploads separately from the application servers, enabling CDN distribution through CloudFront for better global performance. django-storages integrates S3 with Django's static file system, uploading collected static files and media during deployment, which keeps asset delivery fast across geographic regions.

s3_static_files.py
# Install django-storages and boto3
pip install django-storages boto3

# Create S3 buckets
# Bucket 1: Static files (CSS, JS, images)
aws s3 mb s3://myproject-static

# Bucket 2: Media files (user uploads)
aws s3 mb s3://myproject-media

# Set bucket policy for public read access (static bucket only;
# this requires disabling S3 Block Public Access for that bucket)
# static-bucket-policy.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::myproject-static/*"
    }
  ]
}

aws s3api put-bucket-policy \
    --bucket myproject-static \
    --policy file://static-bucket-policy.json

# Django settings for S3
# settings/production.py

INSTALLED_APPS = [
    # ...
    'storages',
]

# AWS S3 settings
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = 'myproject-static'
AWS_S3_REGION_NAME = 'us-east-1'
AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
AWS_DEFAULT_ACL = None  # new S3 buckets disable ACLs; the bucket policy above grants public read
AWS_LOCATION = 'static'

# Static and media file storage
# (Django 4.2+ configures backends through STORAGES;
# STATICFILES_STORAGE and DEFAULT_FILE_STORAGE were removed in Django 5.1)
STORAGES = {
    'default': {'BACKEND': 'myproject.storage_backends.MediaStorage'},
    'staticfiles': {'BACKEND': 'storages.backends.s3boto3.S3Boto3Storage'},
}
STATIC_URL = f'https://{AWS_S3_CUSTOM_DOMAIN}/{AWS_LOCATION}/'
MEDIA_URL = 'https://myproject-media.s3.amazonaws.com/media/'

# Custom storage backend for media
# myproject/storage_backends.py
from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    bucket_name = 'myproject-media'
    location = 'media'
    file_overwrite = False
    default_acl = 'private'

# Collect static files to S3
python manage.py collectstatic --noinput

# CloudFront CDN setup (optional but recommended)
# Create CloudFront distribution pointing to S3 bucket
# Update STATIC_URL to use CloudFront domain
STATIC_URL = 'https://d1234567890.cloudfront.net/static/'

# Set environment variables in EB
eb setenv \
    AWS_STORAGE_BUCKET_NAME=myproject-static \
    AWS_S3_REGION_NAME=us-east-1

AWS Service       | Purpose              | Django Integration           | Cost Consideration
Elastic Beanstalk | Application hosting  | Deploys Django with Gunicorn | EC2 instance hours
RDS PostgreSQL    | Managed database     | Django DATABASES setting     | Instance type + storage
S3                | Static/media storage | django-storages integration  | Storage + requests
CloudFront        | CDN delivery         | Serves S3 files globally     | Data transfer out
Route 53          | DNS management       | Custom domain routing        | Hosted zones + queries

Elastic Beanstalk itself adds no charge beyond the resources it provisions; the AWS free tier includes 750 hours/month of a t3.micro (or t2.micro) instance, enough for testing. Production applications typically require t3.small or larger instances to handle real traffic.

Production Deployment Workflow

The production deployment workflow covers testing locally, committing code to the repository, deploying to a staging environment, running migrations, and promoting to production, so every change is validated before reaching users.

deployment_workflow.sh
# Complete deployment workflow

# 1. Local development and testing
python manage.py test
python manage.py check --deploy

# 2. Update requirements
pip freeze > requirements.txt

# 3. Commit changes
git add .
git commit -m "Feature: Add new functionality"
git push origin main

# 4. Deploy to staging
eb use staging-env
eb deploy

# 5. Run migrations on staging
eb ssh
source /var/app/venv/*/bin/activate
cd /var/app/current
python manage.py migrate
exit

# 6. Test staging environment
curl https://staging.example.com/health/

# 7. Deploy to production
eb use production-env
eb deploy

# 8. Monitor deployment
eb health
eb logs --all

# Zero-downtime deployment configuration
# .ebextensions/03_deployment.config
option_settings:
  aws:elasticbeanstalk:command:
    DeploymentPolicy: Rolling
    BatchSizeType: Percentage
    BatchSize: 50
  aws:autoscaling:updatepolicy:rollingupdate:
    RollingUpdateEnabled: true
    MaxBatchSize: 2
    MinInstancesInService: 1
    PauseTime: PT3M

# Health check configuration
# .ebextensions/04_healthcheck.config
option_settings:
  aws:elasticbeanstalk:application:
    Application Healthcheck URL: /health/

# Health check view
# views.py
from django.http import JsonResponse
from django.db import connection

def health_check(request):
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT 1")
        return JsonResponse({'status': 'healthy', 'database': 'connected'})
    except Exception as e:
        return JsonResponse(
            {'status': 'unhealthy', 'error': str(e)},
            status=500
        )

# Auto-scaling configuration
# .ebextensions/05_autoscaling.config
option_settings:
  aws:autoscaling:asg:
    MinSize: 2
    MaxSize: 10
  aws:autoscaling:trigger:
    MeasureName: CPUUtilization
    Statistic: Average
    Unit: Percent
    UpperThreshold: 75
    LowerThreshold: 25
    BreachDuration: 5
    Period: 5
    EvaluationPeriods: 2

# Environment configuration
eb printenv
eb setenv KEY=VALUE

# Rollback deployment if issues
eb deploy --version previous-version
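
A post-deploy smoke test can poll the /health/ endpoint shown above and fail the pipeline on an unhealthy response. A stdlib-only sketch (the default URL is a placeholder):

```python
import json
import sys
import urllib.request

def is_healthy(body: bytes) -> bool:
    """True if the health payload reports status == 'healthy'."""
    try:
        return json.loads(body).get('status') == 'healthy'
    except (ValueError, AttributeError):
        return False

def check(url: str, timeout: float = 5.0) -> bool:
    """GET the health endpoint; False on any network error or bad payload."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200 and is_healthy(resp.read())
    except OSError:
        return False

if __name__ == '__main__':
    url = sys.argv[1] if len(sys.argv) > 1 else 'https://staging.example.com/health/'
    sys.exit(0 if check(url) else 1)
```

A non-zero exit status stops a shell-based workflow before the production deploy step.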

Monitoring and Logging

CloudWatch collects application metrics, logs, and alarms, enabling issues to be detected before users experience problems. Elastic Beanstalk can stream logs to CloudWatch automatically, and custom metrics can track business-specific events alongside health checks to keep production reliable.

monitoring.py
# View application logs
eb logs
eb logs --all  # All log files
eb logs -z     # Download logs

# CloudWatch Logs configuration
# .ebextensions/06_cloudwatch.config
option_settings:
  aws:elasticbeanstalk:cloudwatch:logs:
    StreamLogs: true
    DeleteOnTerminate: false
    RetentionInDays: 7

# Custom CloudWatch metrics
# utils/cloudwatch.py
import boto3
from datetime import datetime, timezone

cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')

def send_metric(metric_name, value, unit='Count'):
    cloudwatch.put_metric_data(
        Namespace='MyApp',
        MetricData=[
            {
                'MetricName': metric_name,
                'Value': value,
                'Unit': unit,
                # datetime.utcnow() is deprecated; use an aware UTC timestamp
                'Timestamp': datetime.now(timezone.utc)
            }
        ]
    )

# Track custom events (e.g., from views after signup or checkout)
send_metric('UserSignup', 1)
send_metric('OrderPlaced', float(order.total), unit='None')  # 'order' from surrounding view code

# Django logging to CloudWatch
# settings.py
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'INFO',
    },
    'loggers': {
        'django': {
            'handlers': ['console'],
            'level': 'INFO',
            'propagate': False,
        },
    },
}

# CloudWatch Alarms (AWS CLI)
aws cloudwatch put-metric-alarm \
    --alarm-name high-cpu \
    --alarm-description "Alert when CPU exceeds 80%" \
    --metric-name CPUUtilization \
    --namespace AWS/EC2 \
    --statistic Average \
    --period 300 \
    --threshold 80 \
    --comparison-operator GreaterThanThreshold \
    --evaluation-periods 2
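
CloudWatch Logs Insights queries are far easier against structured logs. A minimal JSON formatter (a sketch; the field names and the dotted path in the comment are arbitrary) can be plugged into the console handler from the LOGGING dict above:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each record as single-line JSON for CloudWatch Logs Insights."""

    def format(self, record):
        payload = {
            'level': record.levelname,
            'logger': record.name,
            'message': record.getMessage(),
        }
        if record.exc_info:
            payload['exc_info'] = self.formatException(record.exc_info)
        return json.dumps(payload)

# Hook it into the LOGGING dict:
# LOGGING['formatters'] = {'json': {'()': 'myproject.logging.JsonFormatter'}}
# LOGGING['handlers']['console']['formatter'] = 'json'
```

Each console line then parses as one JSON document, so Logs Insights can filter on fields like level and logger directly.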

AWS Deployment Best Practices

  • Use separate environments: Maintain staging and production environments testing changes before production deployment
  • Enable RDS Multi-AZ: Deploy databases across availability zones ensuring automatic failover maintaining availability
  • Configure auto-scaling: Set appropriate min/max instances handling traffic spikes without manual intervention
  • Use S3 for static files: Offload static file serving to S3 and CloudFront reducing application server load
  • Implement health checks: Create health endpoints monitoring database connectivity and application status
  • Enable CloudWatch logging: Stream application logs to CloudWatch enabling centralized log analysis
  • Use environment variables: Store secrets in EB environment configuration keeping sensitive data outside code
  • Configure rolling deployments: Deploy updates gradually maintaining availability during releases
  • Set up CloudWatch alarms: Monitor critical metrics receiving alerts before issues impact users
  • Regular RDS backups: Configure automated backups with appropriate retention enabling disaster recovery

AWS Elastic Beanstalk simplifies Django deployment by providing managed infrastructure with automatic scaling. Combined with RDS and S3, AWS offers a complete production environment, from database through CDN delivery, built on security best practices.

Conclusion

AWS provides comprehensive cloud infrastructure for Django: Elastic Beanstalk automates deployment, scaling, and monitoring, while RDS offers managed PostgreSQL that eliminates server maintenance. Through the EB CLI, Elastic Beanstalk creates environments with EC2 instances, load balancers, and auto-scaling groups, deploying applications with simple commands and no direct infrastructure management. RDS PostgreSQL delivers production-grade databases with automated backups, software patching, multi-AZ high availability, and security groups controlling access.

S3 and CloudFront handle static file storage and global content delivery, offloading asset serving from the application servers, with django-storages collecting static files straight to S3 during deployment. Configuration through .ebextensions customizes environments: running migrations, collecting static files, and setting Django-specific options such as the WSGI path and environment variables.

The deployment workflow runs from local testing through version control and staging validation to production, with rolling deployments maintaining availability during updates. CloudWatch streams application logs, tracks custom metrics, and raises alarms when thresholds breach, enabling proactive issue resolution.

Understanding these services, from Elastic Beanstalk through RDS and S3, enables deploying production Django applications on managed infrastructure that stays scalable, reliable, and performant, from initial launch through serving millions of users.
