
Multithreading and concurrency fundamentals

Cameron Wilson
Nov 20, 2023
10 min read


If you’re looking to make it as a Senior Software Engineer, you’re probably aware of how important multithreading and concurrency concepts can be. With the rapid rise of multi-core machines, engineers who can skillfully navigate the complexity of concurrent code are among the most sought-after candidates at tech companies today.

These concepts can seem more intimidating than they actually are. This article aims to dispel those fears and introduce you to the basics of multithreading and concurrency, with pointers to practices in Java, C++, and Go.


What is multithreading?#

Multithreading is a technique that allows two or more parts of a program to execute concurrently, so that the CPU is used to its fullest. As a really basic example, multithreading is what allows you to write code in one program while listening to music in another. Programs are made up of processes and threads. You can think of it like this:

  • A program is an executable file, like chrome.exe.

  • A process is an executing instance of a program. When you double-click the Google Chrome icon on your computer, you start a process that runs the Google Chrome program.

  • A thread is the smallest executable unit of a process. A process can have multiple threads, with one main thread. In the Chrome example, one thread could be rendering the tab you’re currently in while another thread renders a different tab, as the short sketch below shows.
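
To make the idea concrete, here is a minimal Java sketch (standard library only; the class name is ours) in which the main thread of a process starts a second thread:

```java
public class HelloThreads {
    public static void main(String[] args) throws InterruptedException {
        // The main thread creates a second thread that runs concurrently with it.
        Thread worker = new Thread(() ->
            System.out.println("Worker thread: " + Thread.currentThread().getName()));

        worker.start();   // begin executing the worker thread
        System.out.println("Main thread: " + Thread.currentThread().getName());
        worker.join();    // wait for the worker to finish before exiting
    }
}
```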


Example of multithreading#

Think about a single processor that is running your IDE. Say you edit one of your code files and click save. When you click save, it will initiate a workflow which will cause bytes to be written out to the underlying physical disk. However, IO is an expensive operation, and the CPU will be idle while bytes are being written out to the disk.

While the IO takes place, the idle CPU could be doing something useful, and this is where threads come in: the IO thread is switched out and the UI thread gets scheduled on the CPU, so that if you click elsewhere on the screen, your IDE remains responsive and does not appear hung or frozen.

Threads can give the illusion of multitasking even though, at any given point in time, the CPU is executing only one thread. Each thread gets a slice of time on the CPU and is then switched out, either because it initiates a task that requires waiting (and therefore does not need the CPU) or because it has used up its time slot. There are many more nuances and intricacies to how thread scheduling works, but this forms the basis of it.
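
As a rough sketch of that idea in Java (the disk write is simulated with a sleep, and the class name and timings are purely illustrative), a blocking task can run on its own thread while the main thread stays responsive:

```java
import java.util.concurrent.TimeUnit;

public class ResponsiveSave {
    public static void main(String[] args) throws InterruptedException {
        // The "IO thread": simulates writing bytes to disk (a slow, blocking operation).
        Thread ioThread = new Thread(() -> {
            try {
                TimeUnit.SECONDS.sleep(2);   // stand-in for the disk write
                System.out.println("File saved.");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        ioThread.start();

        // The "UI thread" (here, main) keeps handling events while the save runs.
        for (int i = 0; i < 4; i++) {
            System.out.println("UI still responsive...");
            TimeUnit.MILLISECONDS.sleep(500);
        }
        ioThread.join();
    }
}
```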

With advances in hardware technology, it is now common to have multi-core machines. Applications can take advantage of these and have a dedicated CPU run each thread.


Why use multithreading?#

With the introduction of multiple cores, multithreading has become extremely important for the efficiency of your application. With multiple threads and a single core, your application would have to switch back and forth between threads to give the illusion of multitasking.

With multiple cores, your application can take advantage of the underlying hardware to run each thread on a dedicated core, making your application more responsive and efficient. Multithreading lets you take full advantage of your CPU and its multiple cores, so you don’t leave processing power untapped on idle cores.

Developers should make use of multithreading for a few reasons:

  • Higher throughput.
  • Responsive applications that give the illusion of multitasking.
  • Efficient utilization of resources. Thread creation is lightweight compared to spawning a brand-new process, and web servers that use threads instead of spawning new processes to field requests consume far fewer resources.

Note that you can’t keep adding threads and expect your application to run faster. More threads mean more complexity, and you must carefully and thoughtfully design how they will work together. In some cases you may even want to avoid multithreading altogether, especially when your application performs a lot of strictly sequential operations.

An understanding of how threading works and a working knowledge of concurrent programming principles demonstrate maturity and technical depth as a developer, and they are an important differentiator when interviewing for more senior roles.


Basic Concepts of Multithreading#

Programs, processes, and threads#

Operating systems today can run multiple programs at the same time. For example, you’re reading this article in your browser (a program) but you can also listen to music on your media player (another program).

Processes are what actually execute the program. Each process is able to run concurrent subtasks called threads.

Threads are subtasks of processes and, if synchronized correctly, can give the illusion that your application is doing everything at once. Without threads, you would have to write one program per task, run each as a separate process, and synchronize them through the operating system.


Concurrency#

Concurrency is the ability of your program to deal with (not necessarily do) many things at once, and it is achieved through multithreading. Do not confuse concurrency with parallelism, which is about actually doing many things at once.


Context Switching#

Context switching is the technique where CPU time is shared across all running processes and is key for multitasking.


Thread Pools#

Thread pools allow you to decouple task submission and execution. You have the option of exposing an executor’s configuration while deploying an application or switching one executor for another seamlessly.

A thread pool consists of homogenous worker threads that are assigned to execute tasks. Once a worker thread finishes a task, it is returned to the pool. Usually, thread pools are bound to a queue from which tasks are dequeued for execution by worker threads.

A thread pool can be tuned for the number of threads it holds, and it may replace a thread that dies from an unexpected exception. Using a thread pool immediately relieves you of the pains of creating threads by hand. Important notes about thread pools:

  • There’s no thread-creation latency when a request is received, because the worker threads already exist.

  • The system will not run out of memory, because threads are not created without limit.

  • Fine-tuning the thread pool lets us control the throughput of the system: we can have enough threads to keep all processors busy, but not so many that they overwhelm the system.

  • The application degrades gracefully when the system is under load.
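
To make the pattern concrete, here is a minimal Java sketch using the standard ExecutorService; the pool size of 4 and the print-only tasks are illustrative choices, not recommendations:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // A pool of 4 homogeneous worker threads backed by a task queue.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Task submission is decoupled from execution: we just enqueue work.
        for (int i = 1; i <= 10; i++) {
            final int taskId = i;
            pool.submit(() ->
                System.out.println("Task " + taskId + " ran on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                              // stop accepting new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS);  // wait for queued tasks to finish
    }
}
```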




Locking#

Locks are an essential feature that makes safe multithreading possible. A lock is a synchronization technique used to limit access to a resource in an environment where there are many threads of execution. A good example of a lock is a mutex.


Mutex#

A mutex, as the name hints, provides mutual exclusion. A mutex is used to guard shared data such as a linked list, an array, or any simple primitive type, and it allows only a single thread to access the resource at a time.
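
In Java, a mutex can be expressed with a ReentrantLock (the synchronized keyword works similarly); the class below is a minimal illustrative sketch that guards a shared list:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.locks.ReentrantLock;

public class GuardedList {
    private final List<String> items = new ArrayList<>();
    private final ReentrantLock mutex = new ReentrantLock();

    public void add(String item) {
        mutex.lock();          // only one thread may enter at a time
        try {
            items.add(item);   // shared data is touched only while holding the lock
        } finally {
            mutex.unlock();    // always release, even if an exception is thrown
        }
    }
}
```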


Thread Safety#

Thread safety means that different threads can access the same resources without causing erroneous behavior or producing unpredictable results such as race conditions or deadlocks. Thread safety can be achieved using various synchronization techniques.
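
As a minimal sketch of one such technique in Java, the synchronized keyword makes this illustrative counter safe to call from multiple threads:

```java
public class SafeCounter {
    private int count = 0;

    // synchronized makes increment() atomic with respect to other synchronized
    // methods on the same instance, so concurrent callers cannot interleave badly.
    public synchronized void increment() {
        count++;
    }

    public synchronized int get() {
        return count;
    }
}
```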


Issues Involved with Multiple Threads#

Deadlock#

Deadlocks happen when two or more threads aren’t able to make any progress because the resource required by the first thread is held by the second and the resource required by the second thread is held by the first.
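
The classic pattern looks like the following deliberately broken Java sketch, in which two threads acquire the same two locks in opposite order (the class and lock names are ours):

```java
public class DeadlockDemo {
    private static final Object LOCK_A = new Object();
    private static final Object LOCK_B = new Object();

    public static void main(String[] args) {
        // Thread 1 takes A then B; Thread 2 takes B then A.
        // If each grabs its first lock before the other releases, both wait forever.
        new Thread(() -> {
            synchronized (LOCK_A) {
                pause();
                synchronized (LOCK_B) { System.out.println("Thread 1 done"); }
            }
        }).start();

        new Thread(() -> {
            synchronized (LOCK_B) {
                pause();
                synchronized (LOCK_A) { System.out.println("Thread 2 done"); }
            }
        }).start();
    }

    private static void pause() {
        try { Thread.sleep(100); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}
```

Running this typically hangs: neither "done" message prints, because each thread holds the lock the other needs.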


Race conditions#

A critical section is any piece of code that may be executed concurrently by more than one thread of the application and that accesses shared data or resources used by the application.

Race conditions happen when threads run through critical sections without thread synchronization. The threads “race” through the critical section to write or read shared resources and depending on the order in which threads finish the “race”, the program output changes.

In a race condition, threads access shared resources or program variables that might be worked on by other threads at the same time causing the application data to be inconsistent.
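
The following Java sketch (the class name and loop counts are illustrative) usually prints a total below 200,000, because the unsynchronized counter++ is a read-modify-write that two threads can interleave:

```java
public class RaceDemo {
    private static int counter = 0;   // shared, unguarded data

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;            // read-modify-write: threads can interleave here
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join();  t2.join();

        // Expected 200000, but the race usually loses some increments.
        System.out.println("Counter = " + counter);
    }
}
```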


Starvation#

Other than a deadlock, an application thread can also experience starvation, where it never gets CPU time or access to shared resources because other “greedy” threads hog the resources.


Livelock#

A livelock happens when two threads keep reacting to each other instead of making any progress. The best analogy is to think of two people trying to pass each other in a hallway: John moves to the left to let Arun pass, and Arun moves to his right to let John pass.

Both are still blocking each other. John sees he’s now blocking Arun and moves to his right, while Arun sees he’s blocking John and moves to his left. They never get past each other and keep blocking each other. This scenario is an example of a livelock.


How to avoid issues with multiple threads#


How to avoid deadlocks?#

  • Avoid nested locks: Nested locking is the main cause of deadlock; it occurs when a thread tries to acquire another lock while it already holds one. Avoid taking additional locks when a thread already holds one, and when nesting is unavoidable, acquire the locks in a consistent order, as the sketch after this list shows.

  • Avoid unnecessary locks: Lock only the members that actually need protection. Unnecessary locks create more opportunities for deadlock, so as a best practice, reduce the need for locking as much as you can.
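
One common discipline, when nested locks are unavoidable, is to always acquire them in a single global order. The Java sketch below (the Account class and id-based ordering are illustrative assumptions) applies that idea to transfers between two accounts:

```java
public class OrderedTransfer {
    static class Account {
        final int id;      // used only to pick a consistent lock order
        long balance;
        Account(int id, long balance) { this.id = id; this.balance = balance; }
    }

    // Always lock the account with the smaller id first, so two concurrent
    // transfers between the same pair of accounts can never deadlock.
    static void transfer(Account from, Account to, long amount) {
        Account first  = from.id < to.id ? from : to;
        Account second = from.id < to.id ? to : from;
        synchronized (first) {
            synchronized (second) {
                from.balance -= amount;
                to.balance   += amount;
            }
        }
    }
}
```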


How to avoid race conditions?#

Race conditions occur within the critical section of your code. These can be avoided with proper thread synchronization within critical sections by using techniques like locks, atomic variables, and message passing.
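
As one concrete option in Java, an atomic variable removes the race from the earlier counter sketch (the class name and loop counts are again illustrative):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class NoRaceDemo {
    private static final AtomicInteger counter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter.incrementAndGet();   // atomic read-modify-write
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join();  t2.join();

        System.out.println("Counter = " + counter.get());  // always 200000
    }
}
```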


How to avoid starvation?#

The best way to avoid starvation is to use a fair lock, such as Java’s ReentrantLock constructed with its fairness flag set, which favors granting access to the thread that has been waiting the longest. If you want multiple threads to run at once while still preventing starvation, you can use a semaphore, which can also be made fair.
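
In Java, fairness is opt-in: the sketch below (the class and method names are ours) constructs a ReentrantLock with its fairness flag set so that the longest-waiting thread is favored:

```java
import java.util.concurrent.locks.ReentrantLock;

public class FairLockDemo {
    // 'true' requests a fair ordering policy: the longest-waiting thread wins.
    private final ReentrantLock fairLock = new ReentrantLock(true);

    public void useSharedResource() {
        fairLock.lock();
        try {
            // work with the shared resource; no thread is starved indefinitely
        } finally {
            fairLock.unlock();
        }
    }
}
```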


How to avoid livelocks?#

Livelocks can be avoided by using a lock such as ReentrantLock to determine which thread has been waiting longest and granting it the lock. As a best practice, don’t block indefinitely waiting for a lock: if a thread can’t acquire a lock, it should release the locks it already holds and try again later.
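
One way to express that practice in Java is tryLock with a timeout plus a small randomized backoff; the sketch below (class, lock names, and timeouts are illustrative) releases the first lock whenever the second cannot be acquired, then retries later:

```java
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class TryLockWork {
    private final ReentrantLock lockA = new ReentrantLock();
    private final ReentrantLock lockB = new ReentrantLock();

    public void doWork() throws InterruptedException {
        while (true) {
            if (lockA.tryLock(50, TimeUnit.MILLISECONDS)) {
                try {
                    if (lockB.tryLock(50, TimeUnit.MILLISECONDS)) {
                        try {
                            // both locks held: do the combined work, then exit
                            return;
                        } finally {
                            lockB.unlock();
                        }
                    }
                } finally {
                    lockA.unlock();   // couldn't get lockB: release lockA and retry
                }
            }
            // randomized backoff reduces the chance both threads retry in lockstep
            Thread.sleep(ThreadLocalRandom.current().nextInt(10, 50));
        }
    }
}
```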


What to learn next#

This article has just scratched the surface of multithreading, and there is still much to learn and practice. Each language has its own intricacies for achieving multithreading, so make sure to learn and practice it in your chosen language.

If you’d like to further your learning on multithreading, it’s highly encouraged that you check out Multithreading and concurrency practices in Java, Python, C++, and Go. These courses give you an overview of multithreading alongside hands-on practice so you can quickly master the concepts.



Frequently Asked Questions

Where do you use multithreading?

Multithreading is great for improving an application’s performance. It’s commonly used in applications that require concurrent tasks, like web servers and interactive programs, as well as in performance-critical applications, like video editing or gaming, in order to improve responsiveness and processing speed.


  
