Microservices architecture is an approach that structures an application as a collection of services based on business components or capabilities. The services are maintainable, testable, and independent of each other. They are also loosely coupled, so changes can be deployed independently, and each service can be owned by a small team of developers.
There are several benefits of adopting a microservices architecture.
- Reducing code complexity: Microservices help reduce code complexity by dividing the code into logical services.
- Easy application management: Each team works with a smaller codebase, which simplifies maintenance and releasing new features.
- On-demand scalability: You can scale selected services independently with the help of cloud providers such as AWS and Azure.
- Shorter build times and release cycles: Reduced dependencies and independent deployment let you build and deploy only the services that are part of the release cycle, so deployments are faster than with a monolithic application.
An organization should transition from monolith to microservices if it wants to:
- Deploy new functionality with zero downtime. This is especially important for SaaS-based businesses that need their software up and running at all times.
- Isolate specific data and processes to comply with standards such as GDPR (General Data Protection Regulation) and SOC 2.
- Possess a high degree of team autonomy where each team makes decisions and develops software independently. Microservices architecture follows a structure that pushes autonomous behavior within an organization.
Some of the world’s largest companies, such as Amazon, Netflix, and Uber, have adopted microservices successfully. Over time, these enterprises have dismantled their monolith application into small services based on business capabilities like search, buying, payment, orders, etc., and assigned each service to a separate team.
Most startups, and many new projects within organizations, start with a monolithic application or a handful of tightly coupled small applications. The monolith can have many features, but all of the programming logic for those features lives together in the application code. Because the code is so interwoven, it becomes hard to untangle: creating a new feature or updating an existing one can break other parts of the application. This makes enhancements tedious and costly, and scaling extremely difficult.
Characteristics of Microservices
There are four important characteristics of the microservices architecture:
1. Multiple Components
In the microservices approach, the software can be built on small logical components so that each service can be easily modified and deployed without messing up the integrity of the entire application.
2. Services Based on Business Capabilities
The microservices approach is built upon business functions and precedence. In a monolith, all the business logic is built into a single application, and each team has a set of responsibilities like database, UIs, backend logic, etc. In microservices, each team is assigned to a separate business capability, which acts as a separate product like search, orders, payment, etc.
3. Decentralized
Microservices use various technologies and platforms. One service can use a different set of tools than others. Microservices also supports decentralized data management. Each service usually manages its own database, which aligns with decentralized governance.
4. Fault Resistant
Since there are multiple sets of services, there is always a chance that one service can fail. However, unlike a monolith architecture, the probability of the failed service affecting other services is much lower. A good microservices architecture can withstand the failure of a service.
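One common way to keep a failing service from cascading is to wrap calls to it with retries and a fallback. The sketch below is illustrative only (the function and its names are hypothetical, not part of the example project later in this post):

```typescript
// Hypothetical sketch: wrap a call to another service so its failure does
// not cascade. The caller retries a few times, then falls back to a safe
// default instead of crashing.
async function callWithFallback<T>(
  call: () => Promise<T>,
  fallback: T,
  retries: number = 2
): Promise<T> {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await call();
    } catch {
      // A real system would log the failure and back off before retrying.
    }
  }
  return fallback;
}
```

With this pattern, a cart page could still render with cached item data even while the admin service is down, which is the kind of graceful degradation a good microservices architecture aims for.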
Microservices architecture puts a lot of emphasis on real-time service monitoring for architectural components and business transactions. Since the services are used as components, we can quickly identify the failed service.
Building an Event-Driven Microservices Architecture
For the example project, we will create two microservices using Node.js and TypeScript. One microservice, admin, will be connected to a MySQL database; the other, cart, will be connected to MongoDB. The two microservices will communicate using AMQP (Advanced Message Queuing Protocol) with the help of RabbitMQ.
Overall, the goal of AMQP is to enable message passing through broker services over TCP/IP connections. AMQP is a compact protocol because it is binary: everything sent over AMQP is encoded as binary data, which avoids sending unnecessary bytes over the wire.
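As a small illustration of that binary framing, an event can be encoded to a Node.js Buffer before publishing and decoded on the other side. This sketch mirrors the Buffer.from(JSON.stringify(...)) pattern used with amqplib in the services below; the ItemEvent shape is an assumption made for the example:

```typescript
// Illustrative sketch: events cross the wire as raw bytes. The ItemEvent
// interface is hypothetical, standing in for the item payloads used later.
interface ItemEvent {
  id: string;
  name: string;
  totalItems: number;
}

function encodeEvent(event: ItemEvent): Buffer {
  // amqplib's sendToQueue expects a Buffer, i.e. binary data
  return Buffer.from(JSON.stringify(event));
}

function decodeEvent(payload: Buffer): ItemEvent {
  return JSON.parse(payload.toString());
}
```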
The admin application will expose a set of REST APIs to add, update, and delete items. Admin will also track the total number of items available in inventory. This information needs to be kept up to date in the cart service independently. We will share the sample code for this example at the end of this blog post.
Running MySQL, MongoDB, and RabbitMQ Docker Images
We could install MySQL and MongoDB through other options such as XAMPP and the MongoDB installers, and install RabbitMQ directly on Windows. In this example, however, we will run all three from Docker images, as explained in the steps below:
1. Running MySQL Database
First, we will run the docker image of MySQL. If you don’t have the image already installed, the following command will also install the image:
docker run --name=mysql-image -p3306:3306 -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mysql/mysql-server:8.0.20
After that, we will enable the MySQL root user to give access to all the IPs. For that, we need to execute the MySQL image’s bash:
docker exec -it mysql-image bash
Enter the same password, i.e., my-secret-pw, after running the following command:
mysql -u root -p
Enter password:
...
After entering the password, run this command:
mysql> CREATE DATABASE `db-admin`;
And then, we will need to set the host to ‘%’ to make the MySQL docker image accessible to our application:
mysql> update mysql.user set host = '%' where user='root';
Query OK, 1 row affected (0.02 sec)

Because we modified the grant table directly, run flush privileges so the change takes effect:

mysql> flush privileges;
2. Running MongoDB
Run MongoDB in Docker as shown below:
docker run --name mongodb -d -p 27017:27017 mongo
Then, we can access the DB using the following URL:
mongodb://localhost:27017/main
3. Running RabbitMQ
We can run rabbitmq:3.8-management using the following docker command:
docker run -d --hostname my-rabbit --name some-rabbit -p 15672:15672 rabbitmq:3.8-management
We should be able to access the RabbitMQ dashboard at http://localhost:15672/. The default username and password are both guest.
Admin Service
The admin service is a simple Node.js server using TypeScript and TypeORM. We will connect this server to the MySQL database. We will use the following dependencies for this project:
"dependencies": {
  "amqplib": "^0.7.1",
  "cors": "^2.8.5",
  "express": "^4.17.1",
  "mysql": "^2.18.1",
  "reflect-metadata": "^0.1.13",
  "typeorm": "^0.2.31"
},
Let’s add an Item entity for defining the table and columns. We will use the TypeORM DataMapper pattern.
import {
  Column,
  CreateDateColumn,
  Entity,
  PrimaryGeneratedColumn,
} from "typeorm";

@Entity()
export class Item {
  @PrimaryGeneratedColumn("uuid")
  id: string;

  @Column()
  name: string;

  @Column()
  imageUrl: string;

  @Column({ default: 0 })
  totalItems: number;

  @CreateDateColumn({ name: "createdAt", type: "datetime" })
  createdAt: Date;
}
The function of the admin service is to store item details in a database. Whenever the admin adds, updates, or deletes an item, the changes should also reflect on the cart service because users will be using the cart service to buy items from the store.
We will also add the “ormconfig.json” file to provide credentials for the MySQL database.
{
  "type": "mysql",
  "host": "0.0.0.0",
  "port": 3306,
  "username": "root",
  "password": "my-secret-pw",
  "database": "db-admin",
  "entities": ["lib/entity/*.js"],
  "logging": false,
  "synchronize": true
}
After adding the database credentials, we will connect to the database and to RabbitMQ over AMQP in the “main.ts” file, after installing the amqplib dependency.
createConnection().then((database: Connection) => {
  const itemRepo = database.getRepository(Item);
  amqp.connect("amqp://guest:guest@127.0.0.1:5672/", (error, connection) => {
    if (error) {
      throw new Error(error.message);
    }
    connection.createChannel((channelConnectionError, amqpChannel) => {
      if (channelConnectionError) {
        throw new Error(channelConnectionError.message);
      }
We will create an AMQP channel and push our items to queues with relevant queue names and payloads, as shown below:
const ITEM_CREATED = "ITEM-CREATED";
const ITEM_UPDATED = "ITEM-UPDATED";
const ITEM_DELETED = "ITEM-DELETED";
After that, we will add REST API routes that allow the admin to perform CRUD operations. In each create, update, and delete route, we will also push data to a queue with the relevant queue name and payload. Those queues will be consumed by the cart service at a later stage.
app.post("/api/items", async (request: Request, response: Response) => {
  const item = await itemRepo.create(request.body);
  const result = await itemRepo.save(item);
  amqpChannel.sendToQueue(ITEM_CREATED, Buffer.from(JSON.stringify(result)));
  return response.send(result);
});

app.put("/api/items/:id", async (request: Request, response: Response) => {
  const item = await itemRepo.findOne(request.params.id);
  itemRepo.merge(item, request.body);
  const result = await itemRepo.save(item);
  amqpChannel.sendToQueue(ITEM_UPDATED, Buffer.from(JSON.stringify(result)));
  return response.send(result);
});

app.delete("/api/items/:id", async (request: Request, response: Response) => {
  const result = await itemRepo.delete(request.params.id);
  amqpChannel.sendToQueue(ITEM_DELETED, Buffer.from(request.params.id));
  return response.send(result);
});
amqpChannel.sendToQueue will publish the data using AMQP protocol, and we will have to subscribe to the specific queue in our cart service to operate independently.
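To make the decoupling concrete, here is a toy in-memory stand-in for the queue. It is an illustration only: RabbitMQ adds durability, acknowledgements, and routing that this sketch omits, and the class name is hypothetical. Note that the publisher and consumer never reference each other, only the queue name:

```typescript
// Minimal in-memory stand-in for a message queue (illustrative only).
type Handler = (payload: Buffer) => void;

class InMemoryBroker {
  private queues = new Map<string, Buffer[]>();
  private consumers = new Map<string, Handler>();

  sendToQueue(queue: string, payload: Buffer): void {
    const handler = this.consumers.get(queue);
    if (handler) {
      handler(payload); // deliver immediately if a consumer is listening
    } else {
      const pending = this.queues.get(queue) ?? [];
      pending.push(payload); // buffer until someone subscribes
      this.queues.set(queue, pending);
    }
  }

  consume(queue: string, handler: Handler): void {
    this.consumers.set(queue, handler);
    // replay any messages published before the consumer attached
    for (const payload of this.queues.get(queue) ?? []) handler(payload);
    this.queues.delete(queue);
  }
}
```

Because the admin service only knows the queue name, the cart service can be restarted, rewritten, or scaled without touching admin code, which is the core benefit of this pattern.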
Cart Service
The function of the cart service is to let customers add items to their carts. The admin service is also responsible for adding, deleting, and updating items in the cart service. The dependencies used in this service are shown below:
"dependencies": {
  "amqplib": "^0.7.1",
  "axios": "^0.21.1",
  "cors": "^2.8.5",
  "express": "^4.17.1",
  "mongodb": "^3.6.5",
  "reflect-metadata": "^0.1.13",
  "typeorm": "^0.2.31"
},
We will use the same project design in the cart service as in the admin service. Let’s add an item entity in the cart service as well. The only difference in this entity is the adminId column, which stores the id assigned to the corresponding item in the admin service.
import { Column, CreateDateColumn, Entity, ObjectIdColumn } from "typeorm";

@Entity()
export class Item {
  @ObjectIdColumn()
  id: string;

  @Column({ unique: true })
  adminId: string;

  @Column()
  name: string;

  @Column()
  imageUrl: string;

  @Column({ default: 0 })
  totalItems: number;

  @CreateDateColumn({ name: "createdAt", type: "datetime" })
  createdAt: Date;
}
We will create the “ormconfig.json” file to add info about the MongoDB connection credentials.
{
  "type": "mongodb",
  "host": "0.0.0.0",
  "database": "db-cart",
  "synchronize": true,
  "logging": true,
  "entities": ["lib/entity/*.js"],
  "cli": {
    "entitiesDir": "lib/entity"
  }
}
In the “main.ts” file, we will connect to RabbitMQ over AMQP. After creating a successful connection, we will subscribe to the queues that the admin service pushes to.
amqp.connect("amqp://guest:guest@127.0.0.1:5672/", (error, connection) => {
  if (error) {
    throw new Error(error.message);
  }
  connection.createChannel((channelConnectionError, amqpChannel) => {
    if (channelConnectionError) {
      throw new Error(channelConnectionError.message);
    }
    amqpChannel.assertQueue(ITEM_CREATED, { durable: false });
    amqpChannel.assertQueue(ITEM_UPDATED, { durable: false });
    amqpChannel.assertQueue(ITEM_DELETED, { durable: false });
After this, we will listen on each queue to perform the task assigned to it. We will add a consumer that receives data whenever a message is pushed to that queue. For example, for the ITEM_CREATED queue, we will add a consumer as shown below and create a row in the database whenever a message arrives.
amqpChannel.consume(
  ITEM_CREATED,
  async (msg) => {
    const createItemEvent: Item = JSON.parse(msg.content.toString());
    const item = new Item();
    item.adminId = createItemEvent.id;
    item.name = createItemEvent.name;
    item.imageUrl = createItemEvent.imageUrl;
    item.totalItems = createItemEvent.totalItems;
    await itemRepo.save(item);
    console.log(ITEM_CREATED);
  },
  { noAck: true }
);
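The same idea applies to the ITEM_UPDATED and ITEM_DELETED queues. As a simplified sketch (a Map stands in for the MongoDB collection, and the handler names are hypothetical), the cart side keys its copies by adminId so update and delete events can find the right row:

```typescript
// Illustrative in-memory model of the cart's item collection, keyed by the
// admin service's item id (adminId), mirroring the unique column above.
interface CartItem {
  adminId: string;
  name: string;
  totalItems: number;
}

const cartItems = new Map<string, CartItem>();

function onItemCreated(event: { id: string; name: string; totalItems: number }): void {
  cartItems.set(event.id, {
    adminId: event.id,
    name: event.name,
    totalItems: event.totalItems,
  });
}

function onItemUpdated(event: { id: string; name: string; totalItems: number }): void {
  const existing = cartItems.get(event.id);
  if (existing) {
    existing.name = event.name;
    existing.totalItems = event.totalItems;
  }
}

function onItemDeleted(adminId: string): void {
  cartItems.delete(adminId);
}
```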
Running the Services
Let’s run both services using the “yarn start” command. The admin service will be available on port 5001, and the cart service will be available on port 5002.
We can now use the admin APIs. First, we will create an item using the POST /api/items API route.
As we created the item in the admin database, the same item should be added to the cart database.
If we update and delete items in the admin service, that should also affect the cart service. Similarly, if we buy the item from the cart, the item count in the admin database will also be updated.
Both services communicate with each other using the AMQP protocol with the help of RabbitMQ.
You can review the sample code on the Git repo.
Conclusion
The above example illustrates the value of microservices architecture. Although a microservices project adds complexity in the beginning, it helps with scalability, testability, and keeping coupling low in the application code. It can also provide the ideal architecture for Continuous Delivery, with on-demand scalability for selected services.
Microservices architecture aligns directly with business requirements: an application can be divided into small services based on business capabilities. We also implemented a small event-driven microservices architecture to demonstrate how different services can communicate with each other. Event-driven architecture is mostly used for communication between decoupled services. Among its benefits are selective scalability and fault isolation, thanks to the decoupled nature of the architecture.
Message brokers like RabbitMQ also increase development speed: the event router removes the need for heavy coordination between publisher and subscriber services. Centralized event routing makes auditing easier. Because the approach is push-based, everything happens on demand, and there is no need to poll for events, which reduces CPU utilization, bandwidth consumption, and cost.
This post was published under the JavaScript Community of Experts. Communities of Experts are specialized groups at Modus Create that consolidate knowledge, document standards, reduce delivery times for clients, and open up growth opportunities for team members. Learn more about the Modus Community of Experts program here.