Welcome to the next pikoTutorial!
Imagine you’re developing an application that fetches data from an external API. For example, a weather app might request current weather conditions for a list of cities and then display them. While the data is updated periodically, repeatedly querying the API with the same city is inefficient, especially if the data hasn’t changed significantly. Moreover, repeated requests to external services may hit rate limits or incur additional costs.
The challenge here is how to avoid redundant API calls while still ensuring that the app always provides useful weather information to its users.
Using lru_cache to solve the problem
To minimize redundant API calls, we can use lru_cache to store the responses of previously requested weather data for each city. If a request is made for a city that was recently queried, the cached result is returned instead of hitting the API again, significantly reducing network traffic and improving app performance.
Let’s simulate fetching weather data for a city using lru_cache. Instead of sending a new API request every time, we cache the result and reuse it if the same city’s weather is requested again within a short period.
from functools import lru_cache
import time

# Simulate an API call to fetch weather data
@lru_cache(maxsize=10)
def fetch_weather_data(city: str) -> str:
    print(f'Fetching weather data for {city}')
    time.sleep(3)  # Simulate a time-consuming API call
    return f'Weather data for {city}: Sunny, 25°C'

# Example usage
print(fetch_weather_data("New York"))  # Takes 3s to fetch data
print(fetch_weather_data("London"))    # Takes 3s to fetch data
print(fetch_weather_data("New York"))  # Immediately returns cached result
What happens in the above example:
- the fetch_weather_data function simulates a time-consuming API call by delaying the response with a 3-second sleep. The first time weather data is requested for a city, the function “fetches” the data; subsequent calls for the same city retrieve the cached result, bypassing the API and returning the data instantly. If users frequently check the weather for the same cities (New York, London, etc.), the cache eliminates redundant API calls, saving time and resources
- the cache stores results for up to 10 cities (maxsize=10), and once the cache is full, the least recently used city’s data is evicted, making room for new entries. This limits memory consumption and prevents the cache from growing indefinitely (see the eviction sketch after this list).
Pitfalls to be aware of
Stale Data
Weather data (or any other dynamic data) changes frequently. If we cache an API response and serve it from the cache after a certain amount of time, it might be outdated, giving users incorrect or stale information. You can call fetch_weather_data.cache_clear() at regular intervals (e.g., every few minutes) to ensure the cache is flushed and fresh data is fetched from the API.
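Here is a minimal sketch of that idea. The get_weather wrapper and the 600-second refresh interval are illustrative choices, not part of lru_cache itself:

import time
from functools import lru_cache

CACHE_TTL_SECONDS = 600  # assumed refresh interval: 10 minutes
_last_cleared = time.monotonic()

@lru_cache(maxsize=10)
def fetch_weather_data(city: str) -> str:
    return f'Weather data for {city}: Sunny, 25°C'

def get_weather(city: str) -> str:
    # Flush the whole cache once the time-to-live has expired,
    # then fall through to the (possibly re-fetched) cached call
    global _last_cleared
    if time.monotonic() - _last_cleared > CACHE_TTL_SECONDS:
        fetch_weather_data.cache_clear()
        _last_cleared = time.monotonic()
    return fetch_weather_data(city)

This keeps the caching logic in one place: callers use get_weather() and never have to think about when the cache was last flushed.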
Caching Errors
When dealing with external APIs, there’s always a possibility of errors, such as timeouts, authentication failures or connection issues. If the function encounters an error but still caches the result (e.g. an empty or invalid response), it could continue returning the error response from the cache.
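One way to avoid this is to let failures raise an exception instead of returning a placeholder value: lru_cache only stores a result when the function returns successfully, so a raised exception is never cached. In the sketch below, call_weather_api is a hypothetical helper standing in for the real HTTP request:

from functools import lru_cache
from typing import Optional

def call_weather_api(city: str) -> Optional[str]:
    # Placeholder for the real request; returns None to simulate a failure
    return None

@lru_cache(maxsize=10)
def fetch_weather_data(city: str) -> str:
    response = call_weather_api(city)
    if response is None:
        # Raising here prevents lru_cache from storing anything for this call,
        # so the next request for the same city will retry the API.
        # Returning '' or None instead would cache the bad result.
        raise RuntimeError(f'Failed to fetch weather data for {city}')
    return response

try:
    fetch_weather_data("London")  # raises, nothing is cached
except RuntimeError:
    pass  # the next call for "London" will hit the API again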
High Cache Miss Rate
If your app is fetching data for many different cities, but users aren’t consistently requesting the same locations, caching may not provide significant benefits. For example, if each user requests a different city every time, the cache will keep evicting old data to store new entries, resulting in a high cache miss rate. Remember to carefully choose an appropriate maxsize based on the typical usage patterns. If users often request weather data for the same set of cities, caching is highly effective. However, if the requests are highly variable, caching may not yield as much of a performance boost.
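To check whether caching is actually paying off, you can inspect the statistics that lru_cache exposes through cache_info(). A quick sketch:

from functools import lru_cache

@lru_cache(maxsize=10)
def fetch_weather_data(city: str) -> str:
    return f'Weather data for {city}: Sunny, 25°C'

fetch_weather_data("New York")
fetch_weather_data("New York")
fetch_weather_data("London")

info = fetch_weather_data.cache_info()
print(info)  # CacheInfo(hits=1, misses=2, maxsize=10, currsize=2)
hit_rate = info.hits / (info.hits + info.misses)
print(f'Hit rate: {hit_rate:.0%}')  # 33% in this tiny example

A consistently low hit rate in production is a sign that either maxsize is too small for your traffic or that requests are too varied for this kind of caching to help.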