Introduction to Concurrency and Threads

A beginner-friendly introduction to concurrency and threads—what they are, why they matter, and how they impact application performance and safety.

Ever wondered why some applications feel lightning-fast while others seem to crawl? The secret often lies in how well they handle concurrency.

Introduction: The Good, The Bad, and The Concurrent

Picture this: you're cooking dinner for friends. You could start the rice, wait for it to finish, then chop vegetables, wait for that to finish, then start cooking the main dish. By the time you're done, your friends have either left or ordered pizza. 🍕

Or, you could be smart about it - start the rice, then while it's cooking, chop vegetables, prep the sauce, and coordinate everything to finish at the same time. That's essentially what concurrency does for your programs!

As the wise folks who wrote the book "Java Concurrency in Practice" put it:

Writing correct programs is hard, but writing correct concurrent programs is even harder.

But here's the thing - threads are everywhere in Java. Every single Java application uses threads, whether you realize it or not. When your JVM starts up, it's already juggling multiple threads for garbage collection, housekeeping tasks, and running your main method.

A Brief Journey Through Time: The History of Concurrency

The Ancient Days (Pre-Operating Systems Era)

Back in the early days of computing, computers were like that friend who can only do one thing at a time. A single program would hog all the computer's resources from start to finish. Imagine having a super powerful sports car but only being able to drive it in first gear - that's how inefficient it was!

The Revolution: Enter Operating Systems

Operating systems changed everything by introducing the concept of processes - think of them as isolated, independently executing programs that the OS manages like a skilled conductor managing an orchestra. Each process gets its own:

  • Memory space 🧠

  • File handles 📁

  • Security credentials 🔐

Processes can communicate with each other through Inter-Process Communication (IPC) mechanisms such as:

  • Sockets 🔌

  • Signal handlers 📡

  • Shared memory 💾

  • Semaphores 🚦

  • Files 📄

The Next Evolution: Why We Needed Better Solutions

But even with processes, we faced some fundamental questions:

Why do we need simultaneous execution anyway?

  1. Resource Utilization 📊: Programs often wait for I/O operations. Instead of sitting idle, why not let another program use the CPU?

  2. Fairness ⚖️: Multiple users deserve equal access to system resources

  3. Convenience 🎯: It's better to have specialized programs for different tasks rather than one massive do-everything application

Think of it like a restaurant kitchen - you wouldn't want one chef doing everything sequentially. You want specialists working in parallel!

Threads: The Lightweight Champions 💪

Most modern operating systems treat the thread, not the process, as the basic unit of scheduling. Threads are often called "lightweight processes" - and for good reason! While processes are like separate houses with their own everything, threads are more like roommates sharing an apartment.

What Makes Threads Special?

Threads allow multiple streams of program control flow to coexist within a single process. Here's what they share and what they keep separate:

Shared Resources:

  • Memory space 🏠

  • File handles 📂

  • Heap objects 📦

Individual Resources:

  • Program counter 📍

  • Stack 📚

  • Local variables 🔢
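
To make this split concrete, here's a minimal sketch (the class and variable names are just for illustration): both threads can reach the same heap field, but each thread's loop counter lives on its own stack.

public class SharedVsLocal {
    // Lives on the heap: visible to every thread that can reach it
    static int sharedCounter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            // 'i' is a local variable: each thread gets its own copy on its own stack
            for (int i = 0; i < 100_000; i++) {
                sharedCounter++;  // unsynchronized shared access - a race, as we'll see later
            }
        };

        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // May well print less than 200000, because the unsynchronized increments interleave
        System.out.println("sharedCounter = " + sharedCounter);
    }
}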

The Power of Asynchronous Execution

Here's where it gets interesting: In the absence of explicit coordination, threads execute simultaneously and asynchronously with respect to one another.

Imagine you and your roommate both trying to use the same kitchen simultaneously. Without coordination, chaos ensues. But with proper coordination, you can create culinary magic together!

Processes vs Threads 🧵

| Feature | Process | Thread |
| --- | --- | --- |
| Memory | Has its own separate memory | Shares memory with other threads |
| Isolation | Isolated from other processes | Not isolated; can affect other threads |
| Creation Cost | Slow and resource-intensive to create | Quick and lightweight to create |
| Crash Impact | If one process crashes, others keep running | If one thread crashes, it can affect the whole process |
| Communication | Via complex mechanisms (e.g., IPC) | Easily, by sharing variables and memory |

The Incredible Benefits of Threads 🌟

1. Simplicity of Modeling

One of the most underrated benefits of threads is how they simplify complex asynchronous workflows. Instead of wrestling with complicated callback chains and event handlers, you can decompose a complex workflow into simpler, sequential pieces, each running in its own thread.

Example analogy: Think of organizing a wedding. Instead of one person handling everything sequentially (book venue → order flowers → arrange catering → send invitations), you assign different threads (people) to handle different aspects simultaneously, coordinating only at key synchronization points.
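
Here's a hedged sketch of that idea (the task names are invented for the analogy): each piece of the workflow is plain sequential code handed to its own worker thread, and Future.get() acts as the coordination point.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class WeddingPlanner {
    public static void main(String[] args) throws Exception {
        ExecutorService planners = Executors.newFixedThreadPool(3);

        // Each task is simple, sequential code; the executor runs them concurrently
        Future<String> venue    = planners.submit(() -> "venue booked");
        Future<String> flowers  = planners.submit(() -> "flowers ordered");
        Future<String> catering = planners.submit(() -> "catering arranged");

        // The only coordination point: wait for all three results
        System.out.println(venue.get() + ", " + flowers.get() + ", " + catering.get());

        planners.shutdown();
    }
}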

2. Exploiting Multiple Processors

Here's a mind-blowing fact: On a two-processor system, a single-threaded program is giving up access to half the available CPU resources. On a 100-processor system, it's giving up access to 99%!

The basic unit of scheduling is the thread, not the process. This means:

  • Single-threaded programs can only use one processor at a time

  • Multi-threaded programs can utilize multiple processors simultaneously

  • Even on single-processor systems, threads can improve throughput by overlapping I/O with computation

Real-world example: While one thread waits for a database query to complete, another thread can process user input, and a third can update the UI. It's like reading a book while waiting for your coffee to brew instead of staring at the coffee machine!

3. Simplified Handling of Asynchronous Events

In single-threaded applications, when the main thread blocks on I/O, everything stops. To avoid this, developers are forced to use complex nonblocking I/O, which is like trying to juggle while riding a unicycle - technically possible, but unnecessarily complicated.

With threads, you can use simple, straightforward synchronous I/O in each thread, making your code much easier to understand and maintain.
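
As a hedged sketch of that pattern, here is a toy echo server (purely illustrative; the port number is arbitrary): each connection gets its own thread, and inside that thread the I/O is plain, blocking, easy-to-read code.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerConnectionEcho {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(9000)) {
            while (true) {
                Socket client = server.accept();           // blocks, but only this thread
                new Thread(() -> handle(client)).start();  // one thread per connection
            }
        }
    }

    private static void handle(Socket socket) {
        // Straightforward synchronous I/O: read a line, echo it back, until the client disconnects
        try (Socket s = socket;
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            String line;
            while ((line = in.readLine()) != null) {
                out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}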

4. More Responsive User Interfaces

Remember the bad old days when clicking a button in an application would freeze the entire interface? That was the curse of single-threaded UIs.

Modern GUI frameworks solve this with an event dispatch thread (EDT):

  • User interactions trigger events in a dedicated thread

  • Long-running tasks execute in background threads

  • The UI remains responsive and smooth

It's like having a receptionist (EDT) who can chat with visitors while the workers (background threads) handle the heavy lifting.
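
Here's a minimal Swing sketch of that division of labor (the 3-second sleep stands in for real work): the slow task runs on a background thread, and only the final UI update hops back onto the event dispatch thread via SwingUtilities.invokeLater.

import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class ResponsiveUi {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("EDT demo");
            JButton button = new JButton("Do slow work");
            JLabel label = new JLabel("Idle");

            button.addActionListener(e -> {
                label.setText("Working...");  // we're on the EDT here
                new Thread(() -> {
                    try {
                        Thread.sleep(3000);   // pretend this is a long computation
                    } catch (InterruptedException ex) {
                        Thread.currentThread().interrupt();
                    }
                    // Hop back onto the EDT before touching Swing components
                    SwingUtilities.invokeLater(() -> label.setText("Done"));
                }).start();
            });

            frame.add(button, java.awt.BorderLayout.NORTH);
            frame.add(label, java.awt.BorderLayout.SOUTH);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}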

The Dark Side: Risks of Threading ⚠️

With great power comes great responsibility. Threading introduces several categories of risks that can turn your elegant code into a debugging nightmare.

1. Safety Hazards: Race Conditions

Thread safety can be unexpectedly subtle. Consider this seemingly innocent code:

public class UnsafeSequence {
    private int value;
    
    /** Returns a unique value */
    public int getNext() {
        return value++;  // This looks atomic, but it's NOT!
    }
}

The problem? The innocent-looking value++ is actually three separate operations:

  1. Read the current value

  2. Add one to it

  3. Write the new value back

In a multithreaded environment, these operations can interleave:

Thread A reads value (9)
Thread B reads value (9)  ← Same value!
Thread A calculates 9 + 1 = 10
Thread B calculates 9 + 1 = 10
Thread A writes 10
Thread B writes 10  ← Both threads return 10!

The takeaway: Because threads share memory space and run concurrently, they can interfere with each other in subtle and dangerous ways.

To solve this, we can add explicit synchronization using the synchronized keyword. Here’s how the code can be updated to prevent race conditions:

public class SafeSequence {
    private int value;
    
    /** Returns a unique value in a thread-safe manner */
    public synchronized int getNext() {
        return value++;
    }
}
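
For a simple counter like this, the java.util.concurrent.atomic package offers a lock-free alternative: getAndIncrement performs the whole read-modify-write as a single atomic step.

import java.util.concurrent.atomic.AtomicInteger;

public class AtomicSequence {
    private final AtomicInteger value = new AtomicInteger();

    /** Returns a unique value without locking */
    public int getNext() {
        return value.getAndIncrement();  // atomic read-modify-write
    }
}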

2. Liveness Hazards: When Progress Stops

While safety means "nothing bad ever happens," liveness means "something good eventually happens."

Liveness failures include:

  • Deadlocks 🔒: Thread A waits for resource held by Thread B, while Thread B waits for resource held by Thread A

  • Starvation 🍽️: Some threads never get access to resources they need

  • Livelocks 🔄: Threads keep changing state in response to each other but make no progress

The scary part: Like most concurrency bugs, liveness failures depend on timing and may not show up during development or testing!
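
The classic deadlock described above takes only a few lines to reproduce. This sketch will usually hang, because each thread grabs one lock and then waits forever for the lock the other thread is holding:

public class DeadlockDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    public static void main(String[] args) {
        // Thread 1 takes lockA, then wants lockB
        new Thread(() -> {
            synchronized (lockA) {
                pause();
                synchronized (lockB) {
                    System.out.println("Thread 1 got both locks");
                }
            }
        }).start();

        // Thread 2 takes lockB, then wants lockA - the opposite order, the recipe for deadlock
        new Thread(() -> {
            synchronized (lockB) {
                pause();
                synchronized (lockA) {
                    System.out.println("Thread 2 got both locks");
                }
            }
        }).start();
    }

    private static void pause() {
        try { Thread.sleep(100); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}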

3. Performance Hazards: The Hidden Costs

Threading isn't free. Context switches - when the CPU switches between threads - involve:

  • Saving and restoring execution context 💾

  • Loss of CPU cache locality 📍

  • CPU time spent on scheduling instead of actual work ⏰

When threads share data, they need synchronization mechanisms that can:

  • Inhibit compiler optimizations 🚫

  • Flush memory caches 🔄

  • Create traffic on the shared memory bus 🚌

Threads Are Everywhere: You Can't Escape Them! 🌐

Here's the truth bomb: Every Java application uses threads. Even if you've never explicitly created a thread, your application is already multithreaded.

When the JVM starts, it creates threads for:

  • Garbage collection 🗑️: Cleaning up unused memory

  • Finalization 🏁: Running cleanup code

  • Main thread 🎯: Running your main method

  • Timer threads ⏰: Executing scheduled tasks
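
You can see this for yourself. Even this tiny program reports several live threads the JVM started on its own (the exact names vary between JVM versions and vendors):

public class WhoElseIsHere {
    public static void main(String[] args) {
        // Typically prints "main" plus JVM housekeeping threads such as "Reference Handler" and "Finalizer"
        Thread.getAllStackTraces().keySet()
              .forEach(t -> System.out.println(t.getName() + " (daemon=" + t.isDaemon() + ")"));
    }
}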

Even using a simple Timer can introduce threading complexities:

import java.util.Timer;
import java.util.TimerTask;

Timer timer = new Timer();
timer.schedule(new TimerTask() {
    @Override
    public void run() {
        // This code runs in the Timer's thread, not your main thread!
        // If it accesses shared data, that access must be thread-safe
    }
}, 1000);  // run once, after a one-second delay

The moment you introduce a Timer, your sequential program becomes concurrent, and any data shared between the TimerTask and the rest of your program must be accessed in a thread-safe way.

Questions to Ponder 🤔

As we wrap up this introduction to Java concurrency, here are some thought-provoking questions:

  1. Think about an application you use daily - How might it be using threads behind the scenes? What would happen if it were single-threaded?

  2. Consider a real-world scenario like a bank's ATM system - What are the safety hazards if multiple threads process transactions on the same account simultaneously?

  3. Resource sharing dilemma - In what situations would you prefer processes over threads, and vice versa?

  4. Performance trade-offs - If threading introduces overhead, when is it worth the cost? Can you think of scenarios where single-threading might be better?

  5. Future of concurrency - With cloud computing and distributed systems becoming the norm, how do you think concurrency concepts are evolving?

Conclusion: Embracing the Concurrent Future 🚀

Concurrency is not just a programming technique - it's a fundamental shift in how we think about problem-solving. As processor counts continue to increase and applications become more complex, mastering concurrency becomes essential for any serious software developer.

Remember:

  • Threads are powerful but require careful handling

  • Safety cannot be compromised - always coordinate access to shared data

  • Start simple and add complexity only when needed

  • Test thoroughly - concurrency bugs are notoriously elusive

The journey into concurrency can be challenging, but the rewards - in terms of performance, responsiveness, and elegant solution design - are immense. In our next posts, we'll dive deeper into the practical techniques for writing safe, efficient concurrent code in Java.
