How To Build a Caching System Using Redis in Node.js?

In today’s world, you don’t want your application to be slow. If a website takes too long to respond, there is a good chance the user will never visit it again, so boosting your Node.js application’s performance matters.

In this article, we are going to implement a caching system using Redis in Node.js. Before doing that, we will cover what exactly caching is, how it increases the performance of a website, and how it helps scale a Node.js application. We will also walk through how to use Redis in our application.

What is Caching? 

Caching is the process of storing regularly used data in an in-memory database. For example, if a certain page is accessed frequently, we store that page’s data in Redis so later requests can be served from the cache, boosting the Node.js application’s performance.
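The idea can be sketched in a few lines of plain JavaScript, using a Map as a stand-in for the in-memory store (Redis plays this role later in the article; the function names here are only for illustration):

// Minimal cache-aside sketch: check the cache first, do the expensive work only on a miss.
const cache = new Map();

async function getCached(key, fetchData) {
  if (cache.has(key)) {
    return cache.get(key); // cache hit: served from memory
  }
  const data = await fetchData(key); // cache miss: slow network or database call
  cache.set(key, data); // store the result for the next request
  return data;
}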

Why do we need Caching?

Some reasons to cache data in your Node.js application:-

  • Caching reduces the response time of a Node.js application, since frequently requested data is served from memory instead of being fetched or recomputed on every request.

To summarize, caching is a win-win for both you and your users: requests are served faster, and the cost of running the website goes down.

How to Scale Your Node.js Application?

To follow this tutorial, you must have the following software and packages installed on your operating system:-

  • Node.js:- An open-source JavaScript runtime used to run the web server
  • NPM:- The Node Package Manager, used to install packages
  • Postman:- A tool for API testing
  • Code editor (VS Code):- For this application, we are using VS Code as the editor

Make sure Node.js is installed on your system. If it isn’t, you can download it from the official Node.js website. NPM is installed automatically along with Node.js.
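You can confirm both are available by checking their versions in a terminal:

node -v
npm -v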

Getting Started:- 

To get started, create a new directory in your root folder by running the following command:-

mkdir cache-app && cd cache-app

Initialize your package.json file by running the following command:- 

npm init -y

Next, install the redis, axios, and express packages using the Node Package Manager (NPM):-

npm install axios redis express

Axios:- An HTTP client used to make network requests from the Node.js backend

Redis:- The Node.js client library for Redis, which we are using for caching

Express:- The package used for creating the web server

Create a simple Express server application:- 

Now we are going to request a food recipe API for various food items.

In your index.js, paste the following code:-

const express = require('express');
const axios = require('axios');

const app = express();
const port = 3000;

// Fetch recipes for the requested food item directly from the recipe API.
app.get('/recipe/:fooditem', async (req, res) => {
  try {
    const foodItem = req.params.fooditem;
    const recipe = await axios.get(`http://www.recipepuppy.com/api/?q=${foodItem}`);
    return res.status(200).send({
      error: false,
      data: recipe.data.results
    });
  } catch (error) {
    console.log(error);
  }
});

app.listen(port, () => {
  console.log(`Server running on port ${port}`);
});

module.exports = app;

This code uses the Axios library to request the food recipe API for a specific food item.

Now start the server by running the command node index.js, then open Postman and make a request to the food recipe endpoint.
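For example, with the server running locally on port 3000, you can make the request from Postman or from the command line (the food item pizza here is just an example value):

curl http://localhost:3000/recipe/pizza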

As you can see, the request we made took a significant 615 ms, which is not good for any website. We will improve this by using a Redis cache.

Add the following code to your index.js file to enable caching:-

const express = require('express');
const axios = require('axios');
const redis = require('redis');

const app = express();
const port = 3000;

// Connect to the local Redis instance on the default port.
const client = redis.createClient(6379);

client.on("error", (error) => {
  console.error(error);
});

app.get('/recipe/:fooditem', (req, res) => {
  try {
    const foodItem = req.params.fooditem;
    // Look the food item up in the cache first.
    client.get(foodItem, async (err, recipe) => {
      if (recipe) {
        // Cache hit: return the stored result without calling the API.
        return res.status(200).send({
          error: false,
          message: `Recipe for ${foodItem}`,
          data: JSON.parse(recipe)
        });
      } else {
        // Cache miss: call the API, then cache the result for 1440 seconds.
        const recipe = await axios.get(`http://www.recipepuppy.com/api/?q=${foodItem}`);
        client.setex(foodItem, 1440, JSON.stringify(recipe.data.results));
        return res.status(200).send({
          error: false,
          message: `Recipe for ${foodItem} from the server`,
          data: recipe.data.results
        });
      }
    });
  } catch (error) {
    console.log(error);
  }
});

app.listen(port, () => {
  console.log(`Server running on port ${port}`);
});

module.exports = app;

First, using the standard Redis port (6379), we constructed a Redis client and connected it to the local Redis instance.

Then, in the /recipe route handler, we try to find matching data in our Redis store using the requested food item as the key. If a result is found, it is delivered to the client straight from the cache, saving us from making another request to the recipe API.

However, if the key is not present in our Redis store, a request is sent to the recipe API, and when the response arrives, the result is stored in the Redis store under that key.

So long as the cached data hasn’t expired, subsequent requests to the same endpoint with the same parameter will always be served from the cache. The key is set to hold a string value in the store for a specific number of seconds, in this example 1440 (24 minutes), using the Redis client’s setex method.
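As a standalone sketch of that expiry behaviour (using the same callback-style Redis client created above, with a hypothetical key and a shorter 60-second TTL for illustration):

// Store a string value under the key 'pizza' and let Redis expire it after 60 seconds.
client.setex('pizza', 60, JSON.stringify({ cached: true }));

// Reads within those 60 seconds return the cached string; after expiry, Redis returns null.
client.get('pizza', (err, value) => {
  console.log(value ? JSON.parse(value) : 'cache miss');
});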

Now we can test the application again: the second request for the same food item takes far less time because it is served from the cache instead of the web server.

Conclusion

I hope this article has shown you how to use Redis in your application. As a developer, you will find Redis useful in many kinds of applications, whether you are building a chat application or any other web server, because it significantly improves performance. Every developer should learn Redis. Thank you!
