


How can Python's asyncio library be used to write concurrent code using async/await syntax?
Jun 19, 2025
Python's asyncio library implements concurrency with coroutines, and the async/await syntax keeps that code readable. First, define coroutines with async def and pause them with await so control returns to the event loop; second, start the event loop with asyncio.run() to run the main coroutine; finally, run multiple tasks concurrently with asyncio.gather() or create_task(). In real applications, wrap I/O operations in coroutines (for example, use aiohttp for network requests), avoid blocking calls, and manage shared state carefully.
Python's asyncio library is built around the concept of coroutines — functions that can pause and resume their execution. With async/await syntax, writing concurrent code becomes more readable and manageable. Here's how to make it work.
Understanding Async IO Basics
At its core, asyncio uses an event loop to manage and execute asynchronous tasks. You define functions with async def, which makes them coroutines. When you call one of these functions, it doesn't run immediately; instead, it returns a coroutine object that needs to be awaited or scheduled on the event loop.
import asyncio

async def say_hello():
    print("Hello")
    await asyncio.sleep(1)
    print("World")

asyncio.run(say_hello())
This simple example shows the structure: the function is declared with async def, uses await to wait for another coroutine (like asyncio.sleep()), and is executed with asyncio.run().
Key points:
- async def defines a coroutine.
- await hands control back to the event loop until the awaited task completes.
- asyncio.run() manages the event loop in modern Python versions (3.7+).
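To make the "returns a coroutine object" point concrete, here is a minimal sketch; greet() is just an illustrative name, not part of the example above:

import asyncio

async def greet():
    return "hi"

coro = greet()            # nothing has run yet; this is just a coroutine object
print(type(coro))         # <class 'coroutine'>
print(asyncio.run(coro))  # the event loop actually executes it and prints "hi"

Until the coroutine object is awaited or handed to the event loop, none of the function body runs.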
Running Multiple Tasks Concurrently
To run multiple tasks at once, use asyncio.gather() or asyncio.create_task(). These let your program overlap I/O-bound operations concurrently, such as making several HTTP requests or reading multiple files, instead of waiting on them one at a time.
Here's how you can run three instances of the earlier function concurrently:
async def main():
    await asyncio.gather(
        say_hello(),
        say_hello(),
        say_hello()
    )

asyncio.run(main())
You'll notice all "Hello" messages appear first, then after a second, all "World" messages — this shows the concurrency.
Tips:
- Use create_task() inside a coroutine to schedule tasks early and start working on them right away (see the sketch after this list).
- Don't forget to await tasks if you need their results.
- Avoid blocking calls like time.sleep(); stick with await asyncio.sleep().
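Here is a small sketch of the create_task() pattern mentioned above; work() and the delays are made-up placeholders for real I/O:

import asyncio

async def work(name, delay):
    await asyncio.sleep(delay)      # stand-in for real I/O
    return f"{name} finished"

async def main():
    # create_task() schedules the coroutines immediately, so they start
    # running in the background while main() continues.
    t1 = asyncio.create_task(work("task-1", 1))
    t2 = asyncio.create_task(work("task-2", 1))
    # Awaiting the tasks later collects their results (total ~1 second, not 2).
    print(await t1)
    print(await t2)

asyncio.run(main())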
Structuring Real-World Applications
For real applications, especially ones involving network requests or database queries, organize your code so each independent operation runs as a coroutine.
Let's say you're fetching data from multiple URLs:
async def fetch_data(url):
    print(f"Fetching {url}")
    await asyncio.sleep(1)  # Simulate network delay
    print(f"Done with {url}")

async def main():
    urls = ["https://example.com", "https://example.org", "https://example.net"]
    tasks = [fetch_data(url) for url in urls]
    await asyncio.gather(*tasks)

asyncio.run(main())
In this case, each fetch_data() call represents a separate I/O operation. By running them concurrently, you save time compared to doing them one after another.
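One rough way to see that saving is to time both versions. This sketch is illustrative: compare() is a made-up name and fetch_data() is a simplified copy of the function above with the same 1-second simulated delay:

import asyncio
import time

async def fetch_data(url):
    await asyncio.sleep(1)  # same simulated 1-second delay as above

async def compare():
    urls = ["https://example.com", "https://example.org", "https://example.net"]
    start = time.perf_counter()
    await asyncio.gather(*(fetch_data(u) for u in urls))        # concurrent
    print(f"concurrent: {time.perf_counter() - start:.1f}s")    # roughly 1.0s
    start = time.perf_counter()
    for u in urls:                                               # sequential
        await fetch_data(u)
    print(f"sequential: {time.perf_counter() - start:.1f}s")    # roughly 3.0s

asyncio.run(compare())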
When building larger apps:
- Keep I/O-bound operations as coroutines.
- Use connection pools or async libraries like aiohttp for HTTP clients.
- Be cautious about shared state; async doesn't automatically make your code thread-safe.
That's how asyncio and async/await work together to help you write efficient, non-blocking code. It takes some getting used to, but once you structure your I/O-heavy tasks this way, performance usually improves noticeably.