Make your APIs faster

How I Decreased API Response Time by 89.30% in Python

Lalitvavdara
Python in Plain English
4 min read · May 13, 2021


API response time is an important factor to consider when building fast, scalable applications.

While working on a client’s project, I needed to integrate a third-party API. This meant that the client project’s API response time would now also depend on that third-party API.

Everything works fine if we only need to make a single query to the API and return its results; things start getting worse when we need to make multiple queries and return results for each one.

To demonstrate how to make multiple requests efficiently, we will use the simple Datamuse rhyming-word API.

For example, suppose we want to find rhyming words for every word in [“hello”, “mellow”, “cat”, “rat”, “dog”, “frog”, “mouse”, “sparrow”, “man”, “women”]. One might implement code like this.

Link to code

You will see that this approach takes approximately 9.479 seconds to find rhyming words for every word in the words list. Can you imagine opening a site that takes 10 seconds to load, just because someone wrote code like this in the backend?

To better understand this, look at this graph of how we make requests in the synchronous approach.

Synchronous Requests vs. Time Graph

As you can see, we make the first request (for “hello”), wait for it to return a result, append that result to the list we want to return, and then repeat the same steps for every remaining word. This is an extremely time-consuming way of implementing this.
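The cost of this pattern is easy to see even without touching the network. Here is a minimal sketch, where `fake_request` and its 0.1-second latency are hypothetical stand-ins for a real HTTP call, showing that sequential latencies add up:

```python
import time

def fake_request(word, latency=0.1):
    """Hypothetical stand-in for an HTTP call; sleeps to simulate network latency."""
    time.sleep(latency)  # the program is blocked here, doing nothing useful
    return f"results-for-{word}"

words = ["hello", "mellow", "cat"]
start = time.time()
results = [fake_request(w) for w in words]  # each call blocks until the previous one finishes
elapsed = time.time() - start
# the latencies add up: elapsed is roughly len(words) * 0.1 seconds
```

With ten words and a real API that takes close to a second per request, this is exactly how you end up near the 9.479 seconds measured above.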

So, how can we improve this?

Well, that’s where async/await comes to our help. Let’s see how we can make multiple requests concurrently with the following Python code.

Link to code

Here we create a Task for each request, each wrapping what Python calls a coroutine. In this implementation, we start a task but don’t wait for it to complete; we simply move on and start the next one.
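The distinction matters: calling a coroutine function only creates a coroutine object, and nothing runs until it is scheduled. Wrapping it in a Task is what actually starts it on the event loop. A tiny sketch, where `greet` is a hypothetical coroutine standing in for a real request:

```python
import asyncio

async def greet(name):
    # hypothetical coroutine standing in for a real request
    await asyncio.sleep(0)
    return f"hello {name}"

async def main():
    coro = greet("world")               # creates a coroutine object; nothing runs yet
    task = asyncio.ensure_future(coro)  # wraps it in a Task and schedules it to start
    return await task                   # await only when we finally need the result

result = asyncio.run(main())
# result == "hello world"
```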

Finally, after creating all the tasks, we wait until we have a result from each one, merge all the results using the merge_lists function, and return a combined list containing all of them.
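The whole concurrent pattern fits in a few lines. In this sketch, `fake_request` is hypothetical and `asyncio.sleep` stands in for the network wait, so you can run it offline and still see the waits overlap:

```python
import asyncio
import time

async def fake_request(word, latency=0.1):
    """Hypothetical stand-in for an HTTP call."""
    await asyncio.sleep(latency)  # yields control so other tasks can run meanwhile
    return [f"rhyme-of-{word}"]

async def main():
    words = ["hello", "mellow", "cat"]
    tasks = [asyncio.ensure_future(fake_request(w)) for w in words]
    all_results = await asyncio.gather(*tasks)  # one list per task, in task order
    # merge the per-task lists into one combined list
    combined = [item for sublist in all_results for item in sublist]
    return combined

start = time.time()
combined = asyncio.run(main())
elapsed = time.time() - start
# the three 0.1 s waits overlap, so elapsed is close to 0.1 s rather than 0.3 s
```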

The time it takes to find rhyming words for all the words in the words list is now approximately 1 second, which is far better than the previous approach.

The requests vs. time graph now looks like this:

Asynchronous Requests vs. Time Graph

As you can see in the graph, by the time we initiate the final request, some of the previous requests have already completed.

Here is the Python code for you to compare both implementations.

import time

import requests

words = ["hello", "mellow", "cat", "rat", "dog", "frog", "mouse", "sparrow", "man", "women"]

def make_req_synchronously(words_arr):
    final_res = []
    for word in words_arr:
        url = f"https://api.datamuse.com/words?rel_rhy={word}&max=100"
        response = requests.get(url)
        json_response = response.json()
        for item in json_response:
            rhyming_word = item.get("word", "")
            final_res.append({"word": word, "rhyming_word": rhyming_word})
    return final_res

without_async_start_time = time.time()
response = make_req_synchronously(words)
time_without_async = time.time() - without_async_start_time
print("Total time with synchronous execution >> ", time_without_async, " seconds")

import asyncio

import aiohttp  # external library

def merge_lists(results_from_fc):
    """Merge multiple lists into a single list."""
    combined_list = []
    for li in results_from_fc:
        combined_list.extend(li)
    return combined_list

async def get_rhyming_words(session, word):
    url = f"https://api.datamuse.com/words?rel_rhy={word}&max=1000"
    async with session.get(url) as response:
        result_data = await response.json()
        return result_data

async def main():
    headers = {"content-type": "application/json"}
    async with aiohttp.ClientSession(headers=headers) as session:
        tasks = []  # for storing all the tasks we will create in the next step
        for word in words:
            task = asyncio.ensure_future(get_rhyming_words(session, word))  # start this request and move on
            tasks.append(task)
        # .gather() will collect the result from every single task in the tasks list;
        # here we use await to wait until all the requests have been satisfied
        all_results = await asyncio.gather(*tasks)
        combined_list = merge_lists(all_results)
        return combined_list

async_func_start_time = time.time()
response2 = asyncio.get_event_loop().run_until_complete(main())
time_with_async = time.time() - async_func_start_time
print("\nTotal time with async/await execution >> ", time_with_async, " seconds")

total_improvement = (time_without_async - time_with_async) / time_without_async * 100
print(f"\n{'*' * 100}\n{' ' * 32}Improved by {total_improvement} %\n{'*' * 100}")
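A side note on the entry point: since Python 3.7, asyncio.run() is the preferred way to drive a top-level coroutine. It creates the event loop, runs the coroutine to completion, and closes the loop for you, so the timing section could be started like this (the body of main here is just a placeholder):

```python
import asyncio

async def main():
    # ... create tasks and gather results, as in the code above ...
    return "combined results"

# asyncio.run replaces get_event_loop().run_until_complete in simple scripts
response2 = asyncio.run(main())
```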

Thank you for taking the time to read my blog.

More content at plainenglish.io
