
An Introduction to Asynchronous Programming in Python

Sagar Khangan

Full-stack Development

Introduction

Asynchronous programming is a form of concurrent programming in which a unit of work runs separately from the primary application thread and notifies the main thread when it completes or fails. There are numerous benefits to using it, such as improved application performance and enhanced responsiveness.

Synchronous vs Asynchronous programming

Asynchronous programming has been gaining a lot of attention in the past few years, and for good reason. Although it can be more difficult than the traditional linear style, it is also much more efficient.

For example, instead of waiting for an HTTP request to finish before continuing execution, Python async coroutines let you submit the request and then work on other tasks waiting in a queue until the response arrives.

Asynchronicity seems to be a big reason why Node.js is so popular for server-side programming. Much of the code we write, especially in I/O-heavy applications like websites, depends on external resources. This could be anything from a remote database call to POSTing to a REST service. As soon as you ask for any of these resources, your code is waiting around with nothing to do. With asynchronous programming, you allow your code to handle other tasks while waiting for these other resources to respond.

How Does Python Do Multiple Things At Once?

Python Programming Models

1. Multiple Processes

The most obvious way is to use multiple processes. From the terminal, you can start your script two, three, four…ten times, and all the scripts will run independently and at the same time. The operating system underneath takes care of sharing your CPU resources among all those instances. Alternatively, you can use the multiprocessing library, which supports spawning processes, as shown in the example below.

CODE: https://gist.github.com/velotiotech/c508894ad72932e174c83d64e866ab30.js

Output:

CODE: https://gist.github.com/velotiotech/49543bd57e2bd0f4318fe287d2a8bbae.js
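To give a feel for the approach, here is a minimal multiprocessing sketch that spawns a few worker processes and waits for them to finish (illustrative only; the gist above contains the full example):

import multiprocessing

def worker(name):
    # Each worker runs in its own process, with its own interpreter and its own GIL.
    print(f"worker {name} running in process {multiprocessing.current_process().pid}")

if __name__ == "__main__":
    processes = [multiprocessing.Process(target=worker, args=(i,)) for i in range(4)]
    for p in processes:
        p.start()   # the OS schedules each process, possibly on different cores
    for p in processes:
        p.join()    # wait for all workers to finish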

2. Multiple Threads

The next way to run multiple things at once is to use threads. A thread is a line of execution, pretty much like a process, but you can have multiple threads in the context of one process, and they all share access to common resources. Because of this, it's difficult to write threaded code correctly. The operating system again does all the heavy lifting of sharing the CPU, but in CPython the global interpreter lock (GIL) allows only one thread to run Python code at a given time, even when you have multiple threads running. So the GIL prevents multi-core parallelism: you're effectively running on a single core even though your machine may have two, four, or more.

CODE: https://gist.github.com/velotiotech/163efa366f037ac615a0d02e230914dd.js

Output:

CODE: https://gist.github.com/velotiotech/f294f949dfcadfb87d7b0baf5a3c32ce.js
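A similar sketch with threads shows how I/O waits can overlap even under the GIL (illustrative only; the URLs here are made up and the work is simulated with sleep):

import threading
import time

def download(url):
    # Simulate an I/O-bound task; the GIL is released while a thread waits on I/O.
    time.sleep(1)
    print(f"finished {url} on {threading.current_thread().name}")

urls = ["https://example.com/a", "https://example.com/b", "https://example.com/c"]
threads = [threading.Thread(target=download, args=(u,)) for u in urls]
for t in threads:
    t.start()
for t in threads:
    t.join()   # all three finish in roughly 1 second because the waits overlap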

3. Coroutines Using yield

Coroutines are a generalization of subroutines. They are used for cooperative multitasking, where a process voluntarily yields (gives away) control periodically, or when idle, so that multiple applications can run at once. Coroutines are similar to generators, but with a few extra methods and a slight change in how the yield statement is used: generators produce data for iteration, while coroutines can also consume data.

CODE: https://gist.github.com/velotiotech/e682576969d9bd9f102282e9a40fcec4.js

Output:

CODE: https://gist.github.com/velotiotech/172a8e45c296e6d5f9720056f0b7b470.js
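The classic minimal coroutine built with yield is a filter that consumes values sent to it rather than producing them (a sketch of the pattern, not the gist's exact code):

def grep(pattern):
    print(f"searching for {pattern!r}")
    while True:
        line = yield            # execution pauses here until a value is sent in
        if pattern in line:
            print(line)

search = grep("async")
next(search)                    # "prime" the coroutine, advancing it to the first yield
search.send("sync code is simple")
search.send("async code overlaps waiting time")   # printed, since it contains "async"
search.close()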

4. Asynchronous Programming

The fourth way is asynchronous programming, where the OS does not participate. As far as the OS is concerned, you have one process, and there is a single thread within that process, but you'll be able to do multiple things at once. So, what's the trick?

The answer is asyncio

asyncio is the concurrency module introduced in Python 3.4. It is designed to use coroutines and futures to simplify asynchronous code and make it almost as readable as synchronous code, since there are no callbacks.

asyncio uses three main constructs: event loops, coroutines, and futures.

  • An event loop manages and distributes the execution of different tasks. It registers them and handles distributing the flow of control between them.
  • Coroutines (covered above) are special functions that work similarly to Python generators; on await, they release the flow of control back to the event loop. A coroutine needs to be scheduled to run on the event loop; once scheduled, it is wrapped in a Task, which is a type of Future.
  • Futures represent the result of a task that may or may not have been executed. This result may be an exception.

Using asyncio, you can structure your code so that subtasks are defined as coroutines, which lets you schedule them as you please, including concurrently. Coroutines contain yield points: places where a context switch to another pending task can happen, and where execution simply carries on if nothing else is pending.

A context switch in asyncio represents the event loop yielding the flow of control from one coroutine to the next.
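A minimal sketch of these pieces working together (assuming Python 3.7+, which provides asyncio.run):

import asyncio

async def task(name, delay):
    await asyncio.sleep(delay)   # yield point: other tasks run while this one waits
    print(f"{name} done after {delay}s")

async def main():
    # gather() schedules both coroutines as Tasks on the event loop and awaits both results.
    await asyncio.gather(task("first", 2), task("second", 1))

asyncio.run(main())              # total runtime is about 2 seconds, not 3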

In the example below, we run three async tasks that query Reddit separately, then extract and print the JSON. We leverage aiohttp, an HTTP client library, so that even the HTTP requests themselves run asynchronously.

CODE: https://gist.github.com/velotiotech/d3354f736c38464c1bbeb719c6977668.js

Output:

CODE: https://gist.github.com/velotiotech/8e20af0825daecdafc533d14d4d6b5a4.js
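A rough sketch of the same pattern, for readers who cannot view the embedded gist (the subreddit URLs are illustrative and error handling is omitted):

import asyncio
import aiohttp

async def fetch_json(session, url):
    async with session.get(url, headers={"User-Agent": "async-demo"}) as resp:
        return await resp.json()   # both the request and reading the body are awaited

async def main():
    urls = ["https://www.reddit.com/r/python/top.json?limit=5",
            "https://www.reddit.com/r/programming/top.json?limit=5",
            "https://www.reddit.com/r/compsci/top.json?limit=5"]
    async with aiohttp.ClientSession() as session:
        # All three requests are in flight at the same time.
        results = await asyncio.gather(*(fetch_json(session, u) for u in urls))
    for url, data in zip(urls, results):
        print(url, "->", len(str(data)), "bytes of JSON")

asyncio.run(main())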

Using Redis and Redis Queue (RQ):

Using asyncio and aiohttp may not always be an option, especially if you are using older versions of Python. There will also be scenarios when you want to distribute your tasks across different servers. In those cases, we can leverage RQ (Redis Queue), a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis, a key/value data store.

In the example below, we have queued a simple function, count_words_at_url, using RQ backed by Redis.

CODE: https://gist.github.com/velotiotech/ddcabb504e11b18882ddfbcf73c66d1e.js

Output:

CODE: https://gist.github.com/velotiotech/9c86e68a28fca49d149efb3fd143050c.js
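A minimal sketch of that flow (it assumes a Redis server running locally; in practice the job function must live in a module the worker process can import):

import requests
from redis import Redis
from rq import Queue

def count_words_at_url(url):
    # The job itself: fetch a page and count the words in it.
    # In a real project this would live in its own module, e.g. tasks.py,
    # so that a separate "rq worker" process can import and run it.
    resp = requests.get(url)
    return len(resp.text.split())

q = Queue(connection=Redis())                        # defaults to localhost:6379
job = q.enqueue(count_words_at_url, "https://python.org")
print(job.id, job.get_status())                      # result becomes available once a worker runs the job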

Conclusion:

Let’s take the classic example of a chess exhibition, where one of the best chess players competes against many people. If there are 24 games with 24 opponents and the chess master plays them all synchronously, one after another, it will take at least 12 hours (assuming the average game takes 30 moves, the chess master thinks for 5 seconds per move, and the opponent takes approximately 55 seconds). Playing asynchronously instead, the chess master can make a move and leave the opponent thinking while moving on to the next board and making a move there. This way, a move on all 24 games can be made in 2 minutes, and all of them can be won in just one hour.
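A quick back-of-the-envelope check of those numbers:

games, moves = 24, 30
master_s, opponent_s = 5, 55

synchronous = games * moves * (master_s + opponent_s)   # one board at a time, waiting on every opponent
asynchronous = moves * games * master_s                 # the master only spends time on their own moves
print(synchronous / 3600, "hours")                      # 12.0
print(asynchronous / 3600, "hours")                     # 1.0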

So, this is what's meant when people talk about asynchronous being really fast. It's this kind of fast: the chess master doesn't play chess any faster, the time is simply used better and not wasted on waiting around.

In this analogy, the chess master is our CPU, and the idea is to make sure the CPU never waits, or waits as little as possible. It's about always finding something for it to do.

A practical definition of async is that it's a style of concurrent programming in which tasks release the CPU during waiting periods so that other tasks can use it. In Python, there are several ways to achieve concurrency; based on our requirements, code flow, data manipulation, architecture design, and use case, we can select any of these methods.


Did you like the blog? If yes, we're sure you'll also like to work with the people who write them - our best-in-class engineering team.

We're looking for talented developers who are passionate about new emerging technologies. If that's you, get in touch with us.

Explore current openings
