Concurrency – Understanding the Concept

From our previous article, we learned that with the start of every JVM process, several threads are already up and running. One of them is the main thread.

On top of that, the main thread can also spawn multiple threads by using the Java API. (We will discuss how to do that later on.)

NOW we should be able to VISUALIZE many threads running in the JVM.
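If you want to see this with your own eyes, here is a minimal sketch that prints every live thread in a freshly started JVM; besides main you will typically see JVM housekeeping threads, whose exact names vary by JVM vendor and version:

```java
// Minimal sketch: list every live thread in a freshly started JVM.
// Thread.getAllStackTraces() returns a snapshot of all live threads.
public class ListJvmThreads {
    public static void main(String[] args) {
        Thread.getAllStackTraces().keySet().forEach(t ->
                System.out.println(t.getName() + " (daemon=" + t.isDaemon() + ")"));
    }
}
```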

With this knowledge, in this article we will try to understand:

  1. What do we mean by concurrency?
  2. And what do we mean by a concurrent system?

So, stay tuned!!

This article is part of our series on multithreading in Java.

Understanding Concurrency

Simply searching for “Concurrency” on Google, this is the definition I got from MIT:

Concurrency definition by MIT: “Multiple computations happening at the same time.”

You must be asking yourself: if concurrency is “multiple computations happening at the same time”, then what is parallelism? Is even the mighty MIT confused on this topic?

Alright! Let’s take another definition, this time from Wikipedia:

“Concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the final outcome. This allows for parallel execution of the concurrent units.”

Confused? Let me tear the definition apart:

  1. First, it says that a system is composed of different parts or units.
  2. Second, these different units can be executed in some order or without any order.
  3. Third, this ability of a system to execute these units is called Concurrency.
  4. Concurrency then enables parallel execution. (We will discuss this later.)

That is a bit easier to understand. Now, let’s make it even simpler.

Defining Concurrency

Concurrency is the ability of an application to handle multiple things at the same or overlapping time.

Overlapping time? Let me explain this right away, before you try to interpret the definition in terms of parallelism or something else.

What do we mean by a concurrent application?

Example of a Non-Concurrent System

Let’s say you have designed an application that allows a user to query a database (a database client). Think of SQL Developer, SQLyog, etc.

Now, by mistake, you write a terrible query that might take hours to complete, and you execute it. After realizing this, you hit the ‘x’ button to cancel the query. But it won’t work. Why?

Because you executed the query from the main thread, and now the main thread is blocked waiting for the DB I/O to return a result. The logic behind the ‘x’ button will only run after the query has returned.

This system is not concurrent, as it cannot deal with two things (query execution and cancellation) at the same time.
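Here is a minimal sketch of that single-threaded design (the runLongQuery() helper is a hypothetical stand-in for real JDBC code). Notice that the cancel handling sits after the blocking call, so it never gets a chance to run in time:

```java
// Hypothetical sketch of the non-concurrent database client.
// Everything runs on the main thread, so it blocks inside runLongQuery().
public class BlockingDbClient {

    public static void main(String[] args) {
        // Main thread blocks here, possibly for hours, waiting on DB I/O.
        String result = runLongQuery("SELECT ...");
        System.out.println(result);

        // The 'x' (cancel) handling would live here, but we only reach this
        // point after the query has already finished -- too late to cancel.
    }

    // Stand-in for real JDBC code that waits on the database.
    static String runLongQuery(String sql) {
        return "result";
    }
}
```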

Example of a Concurrent System

Now, let’s convert the above non-concurrent system to a concurrent system.

For that, in our main thread, we will have to provide logic such that (see the sketch after this list):

  1. Whenever a DB query request comes, we pass it to a new thread.
  2. Meanwhile, the main thread is able to take our ‘x’ request.
  3. Once the ‘x’ request is received, we can terminate the thread that is handling the DB query.
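Here is a minimal sketch of that design, assuming the same hypothetical runLongQuery() helper. The main thread hands the query to a worker thread and stays free to react to the ‘x’ request, which we simulate with a short sleep before interrupting the worker:

```java
// Hypothetical sketch of the concurrent database client.
public class ConcurrentDbClient {

    public static void main(String[] args) throws InterruptedException {
        // 1. Hand the DB query to a new thread.
        Thread queryThread = new Thread(() -> {
            try {
                System.out.println(runLongQuery("SELECT ..."));
            } catch (InterruptedException e) {
                System.out.println("Query cancelled");
            }
        });
        queryThread.start();

        // 2. Meanwhile the main thread is free; simulate the user pressing 'x'.
        Thread.sleep(1000);

        // 3. Terminate the thread handling the DB query.
        queryThread.interrupt();
        queryThread.join();
    }

    // Stand-in for slow DB I/O that honours interruption.
    static String runLongQuery(String sql) throws InterruptedException {
        Thread.sleep(60_000);
        return "result";
    }
}
```

In a real client you would also cancel the statement on the database side (for example via JDBC’s Statement.cancel()), but the threading structure above is what makes the system concurrent.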

Fine. It can handle multiple things. But at the same time?

Why did MIT say “at the same time”?

Why does our definition say “overlapping time”?

It’s time to answer this NOW:

Same time vs Overlapping time

We have two threads with us:

  1. Main thread
  2. DB query handler thread

If we have a multi-core system, these threads can run at exactly the same time on different CPU cores.

If we have a single CPU core to work with, then with the help of the operating system’s time slicing, each thread will be given some CPU cycles, but not at exactly the same time (of course). To the user, though, it will look like it.

But the important thing is that both threads will get a chance to execute at overlapping times.

So, MIT is not confused. They simply assumed:

  1. Either that all systems are multi-core nowadays, which is largely the case.
  2. Or they took the user’s perspective: to the user, it will look like parallel execution.

Whatever the case, just keep our definition in mind and it will serve you well in every situation.

So:

  1. If we run our non-concurrent system on a single-core or multi-core machine, it will remain non-concurrent (unable to handle multiple things at the same time).
  2. But if we run our concurrent system even on a single-core machine, it will still be concurrent: it will handle different things with the help of time slicing.

Key Points

  1. A concurrent system is able to handle multiple things at the same or overlapping time.
  2. On a single core, it does so with the help of time slicing, which is the overlapping-time case.
  3. With a multi-core CPU, it can do so at exactly the same time.
  4. A non-concurrent system will remain non-concurrent even on a multi-core machine.
  5. WE have to design the system to be concurrent; concurrency won’t be provided for free.

What to read next?

There is widespread confusion between concurrency and parallelism. With this article, you have understood what concurrency is. Next, we will target:

  1. What is Parallelism?
  2. What is the difference between Concurrency and Parallelism in detail?

Go to: Understanding Concurrency vs Parallelism on single and multi core system
