Concurrency and Parallelism in Ruby on Rails


Robert Kuśmirek

Dec 2, 2024 • 11 min read

Ruby on Rails (RoR) is one of the most popular frameworks for developing web applications. Its flexibility and simplicity attract many developers, but in the context of complex applications that need to handle multiple requests simultaneously, understanding Ruby concurrency and parallelism is crucial. This article explains these concepts and shows their practical application in Ruby on Rails.

Definition

Concurrency

Concurrency refers to the ability of a system to manage multiple tasks in such a way that they appear to be executed simultaneously. In reality, the system may switch between tasks so quickly that they appear to be executed concurrently. This is useful when an application needs to handle multiple I/O operations that may be delayed by external factors (e.g., waiting for a response from an external server).

Parallelism

Parallelism refers to the ability of a system to execute multiple tasks simultaneously. Thanks to multi-core processors, each core can execute a separate task, leveraging multiple CPU cores for actual parallel processing. Parallelism is particularly useful in compute-intensive operations, such as processing large data sets or performing complex mathematical calculations.

Ruby on Rails Context

Ruby on Rails, based on the Ruby language, provides several tools and mechanisms for managing concurrency and parallelism. The concurrent-ruby gem, for example, offers higher-level abstractions such as futures, promises, and thread-safe data structures that help avoid data races. Although the Global Interpreter Lock (GIL) in the standard implementation of Ruby (MRI) may limit the full utilization of parallelism, there are proven methods that allow for the effective management of concurrent tasks.

Concurrency and Global Interpreter Lock in Ruby on Rails

Ruby on Rails uses various techniques to achieve concurrency:

  • Threads: Ruby allows the creation of threads that enable multiple tasks to be performed within a single process. Threads are lightweight and can be used to handle different operations simultaneously, such as database queries or I/O operations. Although threads are managed internally by Ruby, the GIL ensures that only one thread executes Ruby code at a time (a short sketch follows this list).

  • Sidekiq: Sidekiq is one of the most popular tools for background job processing in Rails applications. It allows long-running tasks, such as image processing or sending emails, to be performed in separate threads or processes. Sidekiq uses Redis as a backend for managing job queues, providing high performance and reliability. Sidekiq comes in a free open source version and commercial Pro and Enterprise versions: the open source version covers core background processing, while the paid tiers add capabilities such as batch jobs, rate limiting, periodic jobs, and enhanced reliability and monitoring.

  • Action Cable: Action Cable is a library for handling WebSockets in Ruby on Rails, enabling real-time communication between the server and the client. It allows for managing concurrent WebSocket connections and transmitting data in real-time, which is especially useful in applications like chats or live notifications.

  • Fiber: Introduced in Ruby 1.9 and improved in Ruby 3.0, fibers are lightweight concurrency primitives that allow pausing and resuming code execution at specific points. They are particularly useful for I/O-bound tasks and can manage asynchronous operations efficiently without blocking the main thread.
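As a minimal illustration of the Threads and Fiber primitives described above, the sketch below spawns two threads that simulate slow I/O (using sleep as a stand-in for a network call) and then pauses and resumes a fiber explicitly. The workload and timings are purely illustrative.

```ruby
# Two threads "performing I/O" concurrently: total wall time is roughly the
# duration of the slowest task, not the sum, because sleeping threads release the GIL.
threads = [1, 2].map do |i|
  Thread.new do
    sleep(1)                    # stand-in for a slow HTTP call or database query
    "result #{i}"
  end
end
puts threads.map(&:value).inspect   # waits for both threads => ["result 1", "result 2"]

# A fiber pauses itself with Fiber.yield and is resumed by its caller.
fiber = Fiber.new do
  puts "step 1"
  Fiber.yield                   # hand control back to the caller
  puts "step 2"
end
fiber.resume                    # prints "step 1", then pauses
fiber.resume                    # prints "step 2"
```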

Parallelism in Ruby on Rails

Although the GIL limits the full utilization of parallelism, there are ways to achieve parallel processing in Ruby on Rails:

  • Processes: Instead of relying on threads, which are limited by the GIL, multiple processes can be run. Each process has its own instance of the Ruby interpreter and can run in parallel with other processes. Web application servers, like Puma, can run multiple worker processes to handle many simultaneous requests.

  • Puma: Puma is a web application server for Ruby on Rails that supports both threads and processes. Puma can run multiple workers, each being a separate process, and each process can handle multiple threads. This allows for effectively scaling the application and handling a greater number of concurrent connections.

  • Active Job: Active Job is a framework for managing background jobs that is part of Ruby on Rails. Active Job acts as an abstraction layer that makes it easy to switch between different backends for job processing, such as Sidekiq, Resque, or Delayed Job. This allows developers to easily integrate parallel job processing into their applications.

  • Ractor: Introduced in Ruby 3.0, Ractor (Ruby Actor) is a new feature that provides a way to achieve true parallelism by running code in parallel on multiple cores without encountering thread-safety issues. Each Ractor has its own execution context, ensuring no shared state and thus avoiding race conditions.
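To make the Ractor model concrete, here is a minimal sketch (Ruby 3.0+, where Ractor is still marked experimental): each ractor receives its own chunk of numbers, computes a partial sum with no shared mutable state, and the main ractor collects the results.

```ruby
# Split a CPU-bound computation across ractors; each ractor works on its own
# isolated copy of the data and can run on a separate core.
chunks = (1..1_000_000).each_slice(250_000).to_a

ractors = chunks.map do |chunk|
  Ractor.new(chunk) do |numbers|
    numbers.sum                 # pure computation, no shared state
  end
end

total = ractors.sum(&:take)     # Ractor#take waits for each block's return value
puts total                      # => 500000500000
```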

Ruby’s Global Interpreter Lock (GIL)

Ruby’s Global Interpreter Lock (GIL) is a crucial aspect of Ruby’s concurrency model. The GIL is a mutex that ensures only one thread can execute Ruby code at any given time. This lock is essential for protecting the internal state of the Ruby virtual machine, preventing scenarios that could lead to crashes or data corruption.

While the GIL simplifies thread safety by preventing multiple threads from running Ruby code simultaneously, it also imposes a significant limitation on Ruby’s concurrency capabilities. Specifically, the GIL means that even if you have multiple threads, only one thread can execute Ruby code at a time. This restriction can hinder the performance of multi-threaded applications, especially on multi-core processors where true parallel execution could otherwise be achieved.

Despite this limitation, the GIL provides a level of thread safety that can be beneficial in many scenarios. By ensuring that only one thread executes Ruby code at a time, the GIL helps prevent data races and other concurrency-related issues, making it easier to write safe and reliable multi-threaded applications in Ruby.
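To see this limitation in practice, the rough benchmark below runs the same CPU-bound loop four times sequentially and then in four threads. Under MRI both variants usually take roughly the same wall time, because only one thread can execute Ruby code at once; exact numbers will vary by machine and Ruby version.

```ruby
require "benchmark"

def busy_work
  500_000.times { Math.sqrt(rand) }   # pure Ruby CPU work; the GIL is never released for long
end

sequential = Benchmark.realtime { 4.times { busy_work } }

threaded = Benchmark.realtime do
  threads = 4.times.map { Thread.new { busy_work } }
  threads.each(&:join)
end

puts "sequential: #{sequential.round(2)}s, threaded: #{threaded.round(2)}s"
# Under MRI the two figures tend to be close; the threads do not run this
# CPU-bound code in parallel.
```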

In Practice

Concurrency Applications

In practice, concurrency in Ruby on Rails is used to handle multiple simultaneous web requests and background job processing. Tools like Sidekiq allow for long-running tasks (e.g., sending emails and image processing) to be handled in separate threads, ensuring that the main thread of the application remains responsive.

Scenario: Image Processing with Multiple Threads

Imagine a client wants to implement an image processing feature in their application. This can be done without concurrency, but then each image processing request would block the main application thread, leading to decreased performance and delays for other users. Using Sidekiq moves this task to the background, allowing the application to continue handling other requests without delays.
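A minimal sketch of this pattern is shown below. The Image model, its generate_thumbnails! method, and the controller are hypothetical placeholders; the point is that the controller only enqueues the job and returns, while Sidekiq performs the heavy work on a background thread.

```ruby
# app/workers/image_processing_job.rb
class ImageProcessingJob
  include Sidekiq::Job              # Sidekiq::Worker in older Sidekiq versions

  def perform(image_id)
    image = Image.find(image_id)    # hypothetical ActiveRecord model
    image.generate_thumbnails!      # hypothetical long-running processing
  end
end

# In the controller: enqueue and respond immediately instead of blocking the request.
class ImagesController < ApplicationController
  def create
    image = Image.create!(image_params)
    ImageProcessingJob.perform_async(image.id)
    head :accepted
  end
end
```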

Scenario: Sending Bulk Emails

Consider an application that needs to send bulk emails to a large number of users, such as a newsletter or promotional campaign. Without concurrency, sending emails one by one would take a considerable amount of time and could block the application from handling other requests. By using a background job processor like Sidekiq, the application can queue the emails to be sent in the background, ensuring that the main application thread remains available for other tasks and improving overall performance.
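A hedged sketch of this approach: with Active Job configured to use Sidekiq, each deliver_later call enqueues the email instead of sending it during the request. NewsletterMailer and the subscribed_to_newsletter scope are hypothetical names.

```ruby
# config/application.rb
# config.active_job.queue_adapter = :sidekiq

# Enqueue one background email per subscriber; the web request returns
# immediately while Sidekiq workers drain the queue.
User.subscribed_to_newsletter.find_each do |user|      # hypothetical scope
  NewsletterMailer.weekly_digest(user).deliver_later   # hypothetical mailer
end
```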

In this case, it may also be worth considering external services specialized in sending bulk emails, which offer better scalability and additional features. Some popular external solutions include:

  • SendGrid: Provides reliable email delivery with extensive analytics and support for high-volume email sending.

  • Mailgun: Offers powerful APIs for sending, receiving, and tracking emails, with robust scalability.

  • Amazon SES (Simple Email Service): A cost-effective solution for sending bulk emails with high deliverability and integration with AWS services.

Scenario: Data Aggregation

Imagine an application that needs to aggregate data from multiple external APIs to provide a consolidated view to the user. Without concurrency, the application would wait for each API response sequentially, leading to slow, inefficient performance. By making the API requests concurrently in threads, the application can aggregate the data as responses arrive, significantly reducing the time required to gather and process the information.
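A minimal sketch of this pattern, assuming three illustrative endpoint URLs: each request runs in its own thread, and the main thread collects the parsed responses once all of them have finished.

```ruby
require "net/http"
require "json"
require "uri"

# Hypothetical endpoints to aggregate; replace with the real services.
urls = [
  "https://api.example.com/orders",
  "https://api.example.com/customers",
  "https://api.example.com/inventory"
]

# Fire all requests concurrently; while one thread waits on the network,
# the others proceed, so total time approaches that of the slowest request.
threads = urls.map do |url|
  Thread.new { JSON.parse(Net::HTTP.get(URI(url))) }
end

aggregated = threads.map(&:value)   # wait for every thread and collect the results
```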

Parallelism Applications

Parallelism is useful in situations where actual simultaneous task processing is needed. In Ruby on Rails, this can be achieved by running multiple processes using an application server like Puma, which can handle multiple workers.
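For reference, a typical config/puma.rb combining worker processes and per-worker threads might look like the sketch below; the worker and thread counts are illustrative and should be tuned to the available CPU cores and memory.

```ruby
# config/puma.rb
max_threads = ENV.fetch("RAILS_MAX_THREADS", 5).to_i
threads max_threads, max_threads                 # min and max threads per worker

workers ENV.fetch("WEB_CONCURRENCY", 2).to_i     # separate OS processes, not limited by the GIL
preload_app!                                     # load the app once, then fork the workers

port ENV.fetch("PORT", 3000)
```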

Scenario: Intensive Computations with Parallel Execution

If an application requires complex mathematical computations, using parallelism can significantly speed up this process. Instead of performing calculations sequentially, they can be divided into smaller tasks and processed in parallel in separate processes, allowing for more efficient resource utilization.
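One way to sketch this on MRI, assuming the work can be split into independent chunks: fork one child process per chunk, have each child send its partial result back through a pipe, and combine the results in the parent. The heavy_computation method is a hypothetical stand-in for the real calculation.

```ruby
def heavy_computation(numbers)
  numbers.sum { |n| Math.sqrt(n) }   # hypothetical CPU-intensive work
end

chunks = (1..1_000_000).each_slice(250_000).to_a

# One child process per chunk; each child has its own interpreter (and its own GIL),
# so the chunks are computed in parallel on separate cores.
readers = chunks.map do |chunk|
  reader, writer = IO.pipe
  fork do
    reader.close
    writer.write(Marshal.dump(heavy_computation(chunk)))
    writer.close
  end
  writer.close
  reader
end

partials = readers.map { |reader| Marshal.load(reader.read) }
Process.waitall                      # reap the child processes
puts partials.sum
```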

Scenario: Large Data Imports

Consider an application that needs to import large datasets from external sources regularly. Without parallelism, the data import process could take a long time, potentially impacting the application's performance for other users. By dividing the data import task into smaller chunks and processing them in parallel, the import time can be significantly reduced, making the application more efficient and responsive.
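A hedged sketch using Active Job: split the import into batches of rows and enqueue one job per batch, so multiple background worker processes can chew through the batches in parallel. ImportRowsJob, the CSV path, and the Product.upsert call are hypothetical.

```ruby
# app/jobs/import_rows_job.rb
class ImportRowsJob < ApplicationJob
  queue_as :imports

  def perform(rows)
    rows.each do |attributes|
      Product.upsert(attributes)     # hypothetical model and upsert logic
    end
  end
end

# In the import trigger: enqueue one job per batch of 1,000 rows.
require "csv"

CSV.foreach("tmp/products.csv", headers: true)
   .each_slice(1_000) do |batch|
  ImportRowsJob.perform_later(batch.map(&:to_h))
end
```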

Scenario: Real-Time Analytics

In applications that provide real-time analytics and dashboards, data needs to be processed and displayed almost instantaneously. For example, a financial application might need to aggregate and analyze transaction data from multiple sources in real-time. Using parallel processing, the application can handle multiple data streams simultaneously, ensuring that the analytics are up-to-date and accurate without delaying the user interface.

What to Watch Out For

Concurrency

When implementing concurrency in Ruby on Rails, be aware of several key issues:

  • Shared State: Ensure that shared data between threads is properly managed to avoid conflicts and errors (see the mutex sketch after this list).

  • Deadlock: Avoid situations where threads block each other, leading to the system being locked up.

  • Performance: Monitor the application's performance to ensure that concurrency improves its operation rather than introducing additional delays.
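As a concrete illustration of the shared state point above: incrementing a shared counter from several threads without synchronization can lose updates, so the sketch below wraps the mutation in a Mutex.

```ruby
counter = 0
lock = Mutex.new

threads = 10.times.map do
  Thread.new do
    1_000.times do
      lock.synchronize { counter += 1 }   # only one thread mutates the counter at a time
    end
  end
end

threads.each(&:join)
puts counter   # => 10000 on every run; without the mutex updates could be lost
```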

Parallelism

Implementing parallelism also comes with challenges:

  • Process Management: Ensure that processes are properly managed and scaled to efficiently use system resources.

  • GIL: Be mindful of the GIL limitations in MRI, which can affect the actual benefits of parallelism.

  • Inter-Process Communication: Ensure that processes can efficiently communicate and exchange data if necessary.

Best Practices

  • Thorough Planning: Before implementing concurrency or parallelism, carefully plan which tasks will be processed simultaneously and the benefits they will bring to the application.

  • Use Proven Tools: Utilize popular and proven tools like Sidekiq for concurrency, which have good documentation and community support.

  • Test and Monitor: Regularly test and monitor your application to ensure that implemented solutions work as expected and do not introduce new problems.

  • Error Management: Implement error management mechanisms so that if one task fails, it does not affect the entire application.
