Ruby on Rails Logging to a Central Server
Tutorials · January 19, 2026 · 4 min read

Configure Rails logging for centralized log management. Learn structured logging, log tagging, and integration with remote log services.

Rails has excellent logging capabilities built in, but the defaults are designed for development. This guide shows you how to configure Rails logging for production use and ship logs to a central server for analysis.

Understanding Rails Logging

Rails uses ActiveSupport::Logger by default, which builds on Ruby's Logger class. In production, you'll want to enhance it with:

  • Structured (JSON) output for easy parsing
  • Request context (request ID, user ID)
  • Log shipping to a central service

Basic Configuration

Configure logging in config/environments/production.rb:

# config/environments/production.rb
Rails.application.configure do
  config.log_level = :info
  config.log_tags = [:request_id]
  config.logger = ActiveSupport::Logger.new(STDOUT)
end
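A common refinement (and the pattern recent Rails generators emit by default) is to read the level from an environment variable, so you can turn verbosity up or down without a deploy:

```ruby
# config/environments/production.rb
config.log_level = ENV.fetch('RAILS_LOG_LEVEL', 'info').to_sym
```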

Structured Logging with Lograge

Lograge transforms verbose Rails logs into single-line structured entries:

# Gemfile
gem 'lograge'
gem 'lograge-sql' # optional: include SQL queries

# config/initializers/lograge.rb
Rails.application.configure do
  config.lograge.enabled = true
  config.lograge.formatter = Lograge::Formatters::Json.new

  config.lograge.custom_payload do |controller|
    {
      user_id: controller.current_user&.id,
      request_id: controller.request.request_id,
      ip: controller.request.remote_ip
    }
  end

  config.lograge.custom_options = lambda do |event|
    {
      params: event.payload[:params].except('controller', 'action', 'format'),
      exception: event.payload[:exception]&.first,
      time: Time.current.iso8601
    }
  end
end

This transforms logs from verbose multi-line output to clean JSON:

{"method":"GET","path":"/users/123","format":"html","controller":"UsersController","action":"show","status":200,"duration":45.2,"user_id":456,"request_id":"abc-123"}

Adding Request Context

Include request information in all logs:

# config/application.rb
config.log_tags = [
  :request_id,
  ->(request) { request.env['HTTP_X_TENANT_ID'] },
  ->(request) { request.session[:user_id] }
]

Semantic Logger (Alternative)

For more advanced logging, consider rails_semantic_logger:

# Gemfile
gem 'rails_semantic_logger'

# config/initializers/semantic_logger.rb
SemanticLogger.add_appender(
  io: STDOUT,
  formatter: :json
)

# In your code
Rails.logger.info('Order created', order_id: order.id, total: order.total)

Shipping Logs to Remote Services

Using Syslog

# Gemfile — Syslog::Logger ships with Ruby's standard library;
# on Ruby 3.4+ it lives in the bundled 'syslog' gem
gem 'syslog'

# config/environments/production.rb
require 'syslog/logger'
config.logger = Syslog::Logger.new('rails-app', Syslog::LOG_LOCAL0)

Using HTTP (for 401 Clicks)

# Gemfile
gem 'net-http-persistent'

# lib/http_logger.rb
require 'logger'
require 'json'
require 'net/http/persistent'

class HttpLogger < Logger
  def initialize(url, api_key)
    @url = URI(url)
    @api_key = api_key
    @http = Net::HTTP::Persistent.new
    super(nil) # no local log device; we ship everything over HTTP
  end

  def add(severity, message = nil, progname = nil)
    message = yield if message.nil? && block_given?
    payload = {
      timestamp: Time.current.iso8601,
      level: format_severity(severity),
      message: message || progname
    }

    # Fire-and-forget: keep the request thread unblocked.
    # At production volume, prefer batching over one thread per line.
    Thread.new do
      request = Net::HTTP::Post.new(@url)
      request['Authorization'] = "Bearer #{@api_key}"
      request['Content-Type'] = 'application/json'
      request.body = payload.to_json
      @http.request(@url, request)
    end
    true
  end
end

# config/environments/production.rb
config.logger = HttpLogger.new(
  ENV['LOG_URL'],
  ENV['LOG_API_KEY']
)

Background Job Logging

Ensure Sidekiq/other job processors include context:

# config/initializers/sidekiq.rb (Sidekiq 7+)
# Tag every log line a job emits with its Sidekiq job ID and queue
class SidekiqLoggingMiddleware
  include Sidekiq::ServerMiddleware

  def call(job_instance, job_payload, queue)
    Rails.logger.tagged(job_payload['jid'], queue) do
      yield
    end
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    chain.add SidekiqLoggingMiddleware
  end
end

Error Tracking

# config/initializers/exception_notification.rb
Rails.application.config.middleware.use ExceptionNotification::Rack,
  webhook: {
    url: ENV['WEBHOOK_URL'],
    http_method: :post
  }

# Or log it yourself before re-raising. Note: the hash-payload
# form below requires Semantic Logger; with the stock Rails
# logger, fold the fields into the message string instead.
class ApplicationController < ActionController::Base
  rescue_from StandardError do |exception|
    Rails.logger.error(
      'Unhandled exception',
      exception: exception.class.name,
      message: exception.message,
      backtrace: exception.backtrace.first(10)
    )
    raise
  end
end

Log Rotation

# Using logrotate (recommended)
# /etc/logrotate.d/rails
/var/log/rails/*.log {
  daily
  rotate 7
  compress
  delaycompress
  missingok
  notifempty
  copytruncate
}

# Or in Rails
config.logger = Logger.new(
  Rails.root.join('log', 'production.log'),
  10,        # Keep 10 files
  10.megabytes
)
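Ruby's Logger performs this rotation itself once the active file passes the size threshold, renaming backups to `app.log.0`, `app.log.1`, and so on. A quick stdlib-only demonstration:

```ruby
require 'logger'
require 'tmpdir'

Dir.mktmpdir do |dir|
  path = File.join(dir, 'app.log')
  # Keep 3 backup files, rotate when the active file passes ~1 KB
  logger = Logger.new(path, 3, 1024)
  500.times { |i| logger.info("line #{i}") }
  logger.close

  # The directory now holds app.log plus numbered backups
  puts Dir[File.join(dir, 'app.log*')].sort.inspect
end
```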

Performance Tips

Broadcasting to Multiple Destinations

# Fan each log line out to both STDOUT and a file.
# Rails 7.1+ ships ActiveSupport::BroadcastLogger:
config.logger = ActiveSupport::BroadcastLogger.new(
  ActiveSupport::Logger.new(STDOUT),
  ActiveSupport::Logger.new(Rails.root.join('log', 'production.log'))
)

# Before Rails 7.1, use ActiveSupport::Logger.broadcast instead:
# logger = ActiveSupport::Logger.new(STDOUT)
# logger.extend(ActiveSupport::Logger.broadcast(
#   ActiveSupport::Logger.new(Rails.root.join('log', 'production.log'))
# ))
# config.logger = logger

Log Level Guards

# Avoid expensive operations for filtered logs
if Rails.logger.debug?
  Rails.logger.debug("Complex data: #{expensive_calculation}")
end

Testing Logging

# spec/support/log_helper.rb
RSpec.configure do |config|
  config.around(:each, :capture_logs) do |example|
    output = StringIO.new
    old_logger = Rails.logger
    Rails.logger = Logger.new(output)
    # Expose the buffer before the example runs, so
    # expectations inside the example can read it
    example.metadata[:log_output] = output

    example.run
  ensure
    # Restore the real logger even if the example raises
    Rails.logger = old_logger
  end
end

# In your tests
it 'logs order creation', :capture_logs do |example|
  OrderService.create(params)
  expect(example.metadata[:log_output].string).to include('Order created')
end

Conclusion

Production Rails logging requires structured output, proper context, and reliable shipping to a central service. Whether you choose Lograge, Semantic Logger, or a custom solution, the key is making your logs queryable and including enough context to debug issues quickly.

Ship your Rails logs to 401 Clicks for instant search, alerting, and team collaboration on production issues.

Admin

Published on January 19, 2026