
Developing a Ruby on Rails app with Docker Compose

May 1, 2019

Docker Compose is a great tool to develop your Ruby on Rails application locally. It allows you to easily isolate your ruby environment, database, and even Redis if you use something like Sidekiq. In this guide we’re going to cover:

  • Setting up a new Rails application in Docker Compose

  • Running tasks like gem installation and migrations

  • Deploying to Heroku’s container platform

Pre-reqs

This tutorial was built using Docker Compose 1.20.1. It also uses version 3.6 of the Compose file syntax. Basic knowledge of Docker is helpful as well.

Getting Started

In this tutorial we’re going to run nearly all commands inside of the Docker containers that Docker Compose spins up for us. To create our initial Rails project with rails new, we need to set up a simple Dockerfile with our basic environment in it.

I like creating my projects in my ~/Sites folder:

$ mkdir -p ~/Sites/rails-compose
$ cd ~/Sites/rails-compose

Once in this directory, create a file called Dockerfile with this content:

FROM ruby:alpine

RUN apk add --update build-base postgresql-dev tzdata

Breakdown

  • FROM ruby:alpine tells Docker to derive our image from the latest Ruby image built on the Alpine distribution. Alpine Linux is a much smaller base image than the Debian-based default, which keeps things compact.

  • RUN apk add --update ...

  • The build-base package contains the toolchain needed to compile things from source. Gems like nokogiri include native extensions that are compiled during gem install nokogiri, so this is necessary.

  • The postgresql-dev package is used when installing the pg gem.

  • tzdata provides the timezone data that Rails uses when available, so we install it here.

Now we need to reference this Dockerfile from a docker-compose.yml file:

version: '3.6'

services:
  web:
    build: .

When using . as the value for the build config, Compose assumes there's a Dockerfile present in that directory. It also uses the current directory as the build context when building the image.

Once you’ve created this file, let’s try running:

$ docker-compose build web

This should pull down the Ruby alpine image and install our runtime dependencies on top of it.
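Because the entire current directory is sent to Docker as the build context, an optional .dockerignore file can keep bulky or irrelevant paths out of it. This is a sketch, not part of the tutorial's required setup; adjust the paths to your project:

```
# .dockerignore (optional): exclude paths from the Docker build context
.git
log/
tmp/
```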

Next we need to initialize our Rails project using rails new. To do this inside of our container, we need to mount our local file system inside of the web container we've set up. This allows us to generate the files from inside of web and have them appear on our local file system.

Let’s modify our Dockerfile first to install Rails inside of the container. It now looks like:

FROM ruby:alpine

RUN apk add --update build-base postgresql-dev tzdata
RUN gem install rails -v '5.1.6'

After you’ve added this modification to the file, run:

$ docker-compose build web

This will re-build our image with the Rails gem inside of it so we can easily run rails new here in a second.

Next, we need to modify our docker-compose.yml to add volumes and a working directory:

version: '3.6'

services:
  web:
    build: .
    volumes:
      - ./:/app
    working_dir: /app

Once we have our modifications in place, we can scaffold our Rails application using rails new. Let's generate our application configured to use Postgres as its database. We're also going to skip the JavaScript-related configuration.

$ docker-compose run web rails new --database=postgresql -J --skip-coffee .

After this has completed successfully, our directory should be filled with all of the necessary files and folders for a simple Rails application, including a generated Gemfile and Gemfile.lock.

One of the caveats of developing with Docker Compose and Rails is how gems work. With a traditional development setup, Bundler installs gems onto your local filesystem, so when you run commands such as rails server they load from there. When using Docker + Docker Compose, however, gems must be installed inside of the image, since the container has a separate filesystem. This is complicated by the fact that when we ran rails new just now, it installed a bunch of gems and generated files inside a throwaway container's filesystem. That was fine, because all we needed to accomplish was getting the Gemfile and Gemfile.lock onto our local filesystem using the volume mount we set up.
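You can see where gems live from inside the container with a one-liner. This is just a sketch that prints RubyGems' install directory; on the stock ruby:alpine image the path is on the image's filesystem, not your host's:

```ruby
require 'rubygems'

# Print the directory RubyGems installs gems into. Inside the container this
# path lives on the image's filesystem, not on your host machine.
puts Gem.dir
```

You can run this via docker-compose run --rm web ruby -e 'puts Gem.dir' to confirm the separation.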

So to fix this, we need to make sure our base image has these gems installed. Let’s modify our Dockerfile.

FROM ruby:alpine

RUN apk add --update build-base postgresql-dev tzdata
RUN gem install rails -v '5.1.6'

WORKDIR /app
ADD Gemfile Gemfile.lock /app/
RUN bundle install

What we’re doing here is adding only our Gemfile and Gemfile.lock files to /app and then running bundle install.

The reason we don’t just add the entire directory contents to the image in this step is that doing so would make Docker consider the layer out-of-date anytime we change our application files. Docker checksums the files you add to determine whether it can use its cache or needs to re-run subsequent commands. Since the Gemfile isn't modified very often, it makes sense to add it as a separate Docker image layer to speed up builds: Docker will reuse the cached bundle install layer if no gems have been added or modified. We'll add the rest of the code to the image in a later step.

Let’s build our image with the gems now baked inside of it:

$ docker-compose build web

Next up, let’s get our application into a bootable state by modifying our docker-compose.yml file to add a command and ports.

version: '3.6'

services:
  web:
    build: .
    volumes:
      - ./:/app
    working_dir: /app
    command: puma
    ports:
      - 3000:3000

Here we’ve added our command to start a Puma server on port 3000 and mapped container port 3000 to our local port 3000 as well. The port comes from the config/puma.rb file that was created when we initially generated our Rails application.
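For reference, the relevant line in the generated config/puma.rb looks roughly like this (from the Rails 5.1 template; your generated file may differ slightly):

```ruby
# config/puma.rb (excerpt): Puma listens on $PORT, defaulting to 3000.
port ENV.fetch('PORT') { 3000 }
```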

And for our first major milestone, let’s start our Rails web server:

$ docker-compose up web

Now you should be able to travel to http://localhost:3000 and see the congratulatory “Yay! You’re on Rails!”.

Models and Migrations

Now that we have our initial application setup, let’s create a simple model and migrate our database.

We’re going to create a simple Joke model that holds a silly joke that we will display on our homepage randomly.

$ docker-compose run --rm web rails g model Joke body:text

This will generate our Joke model and migration inside of our container. Because of the volumes declaration in our docker-compose.yml file, they will also be available on your local filesystem.
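The generated migration (under db/migrate/) should look roughly like this on Rails 5.1 (the timestamp prefix in the filename varies):

```ruby
# db/migrate/XXXXXXXXXXXXXX_create_jokes.rb (generated)
class CreateJokes < ActiveRecord::Migration[5.1]
  def change
    create_table :jokes do |t|
      t.text :body

      t.timestamps
    end
  end
end
```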

If we try to run a migration right now, however, we’ll see it blow up:

$ docker-compose run --rm web rails db:migrate
rails aborted!
PG::ConnectionBad: could not connect to server: No such file or directory
    Is the server running locally and accepting
    connections on Unix domain socket "/tmp/.s.PGSQL.5432"?

Looks like we need to start a Postgres database for our Rails application to communicate with. To do this, let’s go ahead and add a postgres image to our docker-compose.yml file.

version: '3.6'

services:
  web:
    build: .
    volumes:
      - ./:/app
    working_dir: /app
    command: puma
    ports:
      - 3000:3000
    depends_on:
      - db
    environment:
      DATABASE_URL: postgres://postgres@db
  db:
    image: postgres:10.3-alpine

Breakdown

We’re adding another service called db that uses the postgres image. We've also added two pieces to our web service: depends_on ensures the db container is started before web, while the environment variable tells our Rails application how to connect to it. We still need to modify our config/database.yml file, however, to add some additional config.

default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  url: <%= ENV['DATABASE_URL'] %>

development:
  database: app_development

test:
  database: app_test

This Rails trickery allows us to not specify a database name in our DATABASE_URL environment variable, and instead define it in the config file. The reason I like doing this is that it allows us to run tests against the same DB server without having to swizzle around DATABASE_URL values.
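To see why this works, note that postgres://postgres@db carries a user and host but an empty path, so there's no database name baked into the URL. A quick sketch with Ruby's standard URI library:

```ruby
require 'uri'

# DATABASE_URL as set in docker-compose.yml: user "postgres", host "db",
# and an empty path -- i.e. no database name in the URL itself.
uri = URI.parse('postgres://postgres@db')

puts uri.user         # "postgres"
puts uri.host         # "db"
puts uri.path.inspect # "" -- Rails falls back to the database: key in database.yml
```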

From here, let’s create our database and run our migrations:

$ docker-compose run --rm web rails db:create db:migrate

After this has run, our database should have our migration applied to it. The --rm flag ensures the container used to run this command is removed afterwards, which saves space on your local machine.
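One caveat worth knowing: depends_on only controls startup order; it does not wait for Postgres to actually accept connections, so a cold start can occasionally race the database. A minimal wait loop looks like the sketch below; wait_for is a hypothetical helper, not part of the generated app:

```ruby
require 'socket'

# Block until a TCP connection to host:port succeeds, or raise after `timeout`
# seconds. Useful before running migrations against a freshly started db service.
def wait_for(host, port, timeout: 30)
  deadline = Time.now + timeout
  begin
    TCPSocket.new(host, port).close
  rescue Errno::ECONNREFUSED, SocketError
    raise "timed out waiting for #{host}:#{port}" if Time.now > deadline
    sleep 1
    retry
  end
end
```

You could call something like wait_for('db', 5432) from a small script before rails db:migrate; the host and port match the db service defined above.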

Now let’s add a silly joke to our database so we can see it on the homepage we’re going to build.

$ docker-compose run --rm web rails runner 'Joke.create(body: "Knock! Knock! Whos there? Owls say. Owls say who? Yes, they do.")'

Let’s modify our config/routes.rb file to add a root route to a joke controller:

Rails.application.routes.draw do
  root to: 'jokes#index'
end

And then let’s create an app/controllers/jokes_controller.rb file with:

class JokesController < ApplicationController
  def index
    joke = Joke.order('RANDOM()').first

    render html: joke.body
  end
end

If we start our application server again with docker-compose up web and travel to localhost:3000, we should see our joke appear on the screen!

Deploying To Heroku

Developing Rails applications with Docker Compose is a great way to develop in isolation. It’s even better if you’re planning on deploying your application as a container to something like Heroku’s container platform.

Let’s modify our Dockerfile to add the last piece of building a fully packaged application that runs:

FROM ruby:alpine

RUN apk add --update build-base postgresql-dev tzdata
RUN gem install rails -v '5.1.6'

WORKDIR /app
ADD Gemfile Gemfile.lock /app/
RUN bundle install

ADD . .
CMD ["puma"]

This adds our local files (all of our Rails app) into the image and sets the default command to puma. So when running docker run, it will automatically use Puma.

After we’ve modified our Dockerfile, let's deploy our application's Docker image to Heroku!

Since Heroku’s CLI uses git remotes to determine which application you're working with, let's initialize an empty git repo in our application directory:

$ git init

With this, we can create a Heroku application with:

$ heroku create

Afterwards, let’s login to the Heroku container registry and push our Docker image. Heroku’s CLI will automatically build our image using our Dockerfile.

$ heroku container:login
$ heroku container:push web
$ heroku container:release web

Next, because Heroku doesn’t assume our application type (it’s just a Docker image), we need to set up a Postgres database manually using:

$ heroku addons:create heroku-postgresql:hobby-dev

This will create a free tier postgres database and automatically set the DATABASE_URL environment variable for our container when it boots. Next, we need to give our Rails application a few more environment variables to make sure it starts:

$ heroku config:set RAILS_ENV=production SECRET_KEY_BASE=supersecret RAILS_LOG_TO_STDOUT=true

This makes our Rails app boot in production mode (defined in config/environments/production.rb) and forces logs to STDOUT. This allows us to see our logs when using heroku logs.
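A note on SECRET_KEY_BASE=supersecret above: that's just a placeholder. In practice, generate a long random value, for example with rails secret or Ruby's SecureRandom:

```ruby
require 'securerandom'

# Generate a 128-character hex string -- the same shape `rails secret` produces.
puts SecureRandom.hex(64)
```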

Next, we need to migrate our database to add our jokes schema. We can do this easily with:

$ heroku run rails db:migrate

Last, let’s add our Knock Knock joke to our database that is hosted by Heroku using a slightly modified command:

$ heroku run rails runner 'Joke.create(body: "Knock! Knock! Whos there? Owls say. Owls say who? Yes, they do.")'

After we’ve run all of these commands, let’s try loading our app!

$ heroku open

Boom! We have our Docker image deployed to Heroku and serving our simple Rails application that we’ve developed using Docker Compose.

Conclusion

I love developing Rails applications with Docker + Compose. It’s a super simple way to guarantee your production environment is exactly the same as your local machine, since you’re deploying the same image you run locally. My hunch is that as this technology matures, this method of development will become standard as well. I hope this was helpful and fun!

If you want to see the final result of this project, it’s hosted on GitHub: https://github.com/firehydrant-io/blog-rails-with-compose

My name is Robert Ross, but people like to call me Bobby Tables. I’m a full-time software engineer during the day and I build https://firehydrant.io at night.
