A Comprehensive Guide to a Modern Tech Stack

11 min read Michael Carroll on Sep 15, 2023

Apps, products, and businesses are growing more interactive, data-intensive, and connected. Everyone and every product has been dramatically changed by an always-on connection to the Internet and, through it, to every other person and device in the world.

Billions of people now have smartphones with virtually unlimited data. Most people use their phones to read content, watch videos, and send emails. Yet the most exciting use of our phones leverages a blazing-fast connection to the Internet.

Think about what has changed in the last couple of decades. People worldwide now use their phones to order taxis, track food delivery, control their lights, and have about 100 apps for chatting with each other. They all have something in common – these apps are all powered by a new layer of modern technology, real-time technology.

Novice developers often assume that realtime is about Internet speed rather than the design and architecture of the app. If you think of the online app as the car and the Internet as the road, that’s akin to saying, “Give me an empty road, and I can race my car like a rocket ship.” In reality, top performance in a car comes from its design, not just the road.

Similarly, designing a realtime web app is more about how you engineer the app than the network bandwidth. This guide will dive into what developers need to know and consider when building a real-time web app and the pitfalls, complexities, and various options to design a perfect real-time, modern technology stack.

Challenges of a Modern Tech Stack

The underlying protocols of the web deliver application data on a best-effort basis, with no hard guarantees that it will reach the other end. That is not what you want when building real-time applications with modern architecture. Messages are valuable, so you need reliable, guaranteed data delivery. What if that message is an ambulance dispatch request or a hazard warning in an industrial IoT setting? Every message matters.

Managing Data Connections Between Peers

Building a modern application that communicates between users is fairly easy. However, expanding the realtime app to connect multiple users across the globe is more challenging. You must maintain a reliable and secure “always-on” connection between peers.

Scaling with a Modern Tech Stack is Key

Scalability is a challenge you should be thinking about from the beginning. You’ve built an amazing app, but when the user base grows beyond your expectations, you must ensure that the modern tech stack can handle an increased load of users and data. How can you best prepare for this?


Thinking in Real time with a Modern Tech Stack

A real-time application performs an action in response to an event, within a fraction of a second. However, the key characteristic of real-time behavior is collaboration between apps, such that an event occurring in one app can trigger an action in another.

Use Cases for Modern Tech Stacks

  • RealTime Chat: Two users are connected in a mobile chat app. A chat message keyed in by one user is an event that triggers an action to display the message in the other user’s chat app.

  • Live Data Dashboard: A user monitors a device's state through a dashboard. A change in the device's parameter is an event that triggers an update. The value displayed in the data dashboard is updated in realtime.

  • Geolocation: A bus sends its location to a passenger’s mobile device. A change in the location of the bus is an event that triggers a change in the location of the bus icon displayed on the map on the passenger’s mobile device.

Considerations of a Modern Tech Stack

  1. Creating communication between multiple web apps: A real-time application has multiple instances that need to sync in a split second to provide the best user experience. It all boils down to data orchestration between the app instances.

  2. Process data in motion: It is imperative to process data while it is in motion to achieve peak real-time performance. Data exchanged as part of real-time interactions should not be stored before processing, which would introduce delays.

  3. Build your applications with users in mind: In almost all cases, a real-time interaction has a human angle, because one of the collaborators in any real-time data exchange is a human. Even in a machine-to-machine interaction, a human ultimately receives the feedback. Hence, the human perception of a real-time event is very important.

If you pay attention to the above observations, the common denominator is the time difference between app-to-app data exchange. This data exchange should happen within a fraction of a second to achieve real-time performance. Although the internet speed and network bandwidth play a role here, the true essence of realtime interaction over the internet lies with the application protocol stack that works behind the scenes.

Let’s dig into building a realtime app with first principles. This approach will help you figure out the challenges along the way. As you overcome those challenges, you will be able to arrive at the perfect combination of the stack components that will form the basis for your upcoming real-time app.

Building a real-time app with a modern tech stack

Let’s dive into the development process of building a real-time app. A simple in-app chat feature is a great showcase. In its simplest form, two users can chat in real time via a chat interface.

In this case, you would develop the app on a standard TCP/IP stack, and both users would use their individual app instances to communicate over the Internet.

What about the protocol stack for this app?

Since the app instances talk to each other directly, WebSocket is likely the choice here. WebSocket works on the application layer and provides full duplex communication over TCP.


Socket.IO is one of the most popular libraries for implementing WebSockets. Although all major browsers now support the WebSocket interface natively, Socket.IO wraps it and adds features on top of a WebSocket connection, such as fallback transports and an acknowledgment mechanism.
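
The acknowledgment mechanism can be modeled roughly like this. This is a simplified in-memory sketch of the idea, not Socket.IO's actual implementation; the `wire` callback stands in for a real WebSocket connection, and all names are illustrative:

```javascript
// Simplified model of per-message acknowledgments over a full-duplex link.
let nextId = 0;
const pending = new Map(); // message id -> ack callback

function sendWithAck(wire, payload, onAck) {
  const id = nextId++;
  pending.set(id, onAck); // remember who to notify when the ack arrives
  wire({ id, payload });  // transmit the message
}

function handleAck(id, response) {
  const cb = pending.get(id);
  if (cb) {
    pending.delete(id);   // each ack is consumed exactly once
    cb(response);
  }
}

// Usage: the "server" side of the wire acks every message it receives.
const results = [];
const wire = (msg) => handleAck(msg.id, `got: ${msg.payload}`);
sendWithAck(wire, 'hello', (resp) => results.push(resp));
console.log(results[0]); // "got: hello"
```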

Does the app possess real-time communication capabilities? Perhaps yes, because the app instances have a direct TCP-based connection. However, if the geographical separation between the two app users is large, the real-time performance will deteriorate.

This is not the best way to build a modern chat app, since even a little scale progressively makes the inter-app communication complex. The impending bottleneck: each app instance has too many connections to handle.

A Developer’s Dilemma With a Modern Data Stack

Building a 1:1 or 1:N chat app is easy if you are a seasoned programmer. However, in the case of a 1:N chat app, a significant amount of time must be spent managing the connections between the app instances. As a developer, you will experience some frustrating moments when you can’t focus on the app and UI/UX features because the connection logic isn’t working right.

In this case, the app and UI/UX features constitute the app's business logic. The connection management is part of the modern application infrastructure. In terms of the percentage of time spent developing both these components, there is a significant jump in the latter while upgrading the app for 1:N chat.

Using Hosted Services as Part of Your Tech Stack

To handle the challenge of scale, your best option is a modern architecture in which each chat app instance only has to deal with one connection, no matter how many users are involved in the chat session. This can be achieved by introducing a new entity: a centralized chat service component.

With this arrangement, the chat client is simplified. However, the chat service component must now handle and orchestrate all messages between all the chat app instances.

Regarding the protocol stack, the same stack can be retained across all the components, including the chat service. However, since the app instances no longer have a direct connection, the chat service component must maintain an active connection with each chat app instance. It must move the messages quickly across them to provide real-time performance.
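
At its core, the chat service's job is to keep one connection per client and fan each message out to the others. A minimal in-memory sketch, where the `send` callbacks stand in for real network connections and all names are hypothetical:

```javascript
// Each client registers a single connection; the hub fans messages out.
class ChatHub {
  constructor() {
    this.connections = new Map(); // userId -> send function
  }
  connect(userId, send) {
    this.connections.set(userId, send);
  }
  disconnect(userId) {
    this.connections.delete(userId);
  }
  // Deliver a message to every connected user except the sender.
  broadcast(fromId, text) {
    for (const [userId, send] of this.connections) {
      if (userId !== fromId) send({ from: fromId, text });
    }
  }
}

// Usage: three clients, one hub; each client holds exactly one connection.
const hub = new ChatHub();
const inboxA = [], inboxB = [], inboxC = [];
hub.connect('a', (m) => inboxA.push(m));
hub.connect('b', (m) => inboxB.push(m));
hub.connect('c', (m) => inboxC.push(m));
hub.broadcast('a', 'hi everyone');
console.log(inboxB.length, inboxC.length, inboxA.length); // 1 1 0
```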

Vertical and Horizontal Scaling Data Architecture

With the revised data architecture, it can be expected that each chat service component plays the role of a chat room hosting multiple chat users via their chat app instances.

Vertical Scaling Architecture

The handling of scale concerning the number of chat users is now the sole responsibility of the chat service component. Imagine a public chat room where hundreds of users can join or leave at a given moment. This can lead to a significant latency in exchanging messages via the online chat service components.

This leads to the vertical scaling of the chat service component, such that more CPU and memory resources are utilized to serve more chat users. Eventually, this has a ceiling beyond which the chat service component can no longer handle more users joining.

A Modern Developer's Dilemma

Even though the connection management at the chat app client has been simplified, there are additional challenges in the chat service component. Coding the service component might seem easy in the beginning, but as more and more users sign up, managing their connectivity, state, and status information becomes a challenge.

By and by, the effort will be split between engineering the overall business logic and the infrastructure. At scale, the latter will take over. You will find yourself spending more time ensuring the reliability and resilience of the data infrastructure than working on the business logic.

(Figure: scaling comparison for a real-time API)

Horizontal Data Scaling

The other scalability challenge is the handling of geographically diverse locations from where the users join in.

Realtime performance of the app is impacted if the geographical expanse is not considered. To counter this, multiple chat service components need to be spawned at various geographical locations to aggregate the users for a location. This leads to what we know as horizontal scaling.

At this point, the architecture of the chat service component has to evolve to handle users from multiple regions and orchestrate with the other chat service components. This is where things become quite complex for you as a developer for several reasons.

Challenges of Scaling Modern Apps

Managing users and orchestrating between other chat service components

The chat service component is now distributed across multiple locations such that each chat message is replicated at every location. This partitions the topology of the application deployment: an inner partition handles the orchestration between chat service components, and an outer partition manages the connectivity with chat app instances.

(Figure: real-time chat app microservices diagram)

This brings additional complexity because the chat service components have to deal with multifarious messages ranging from users’ chat messages to state synchronization between all chat service component instances.


Handling diverse messages at a scale requires an effective data processing pipeline, which provides a queuing mechanism to process messages. There are several message queue libraries available to handle this scenario. Kafka is a popular platform for building a real-time data pipeline for consuming messages.
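
What a platform like Kafka provides at scale can be pictured with a toy in-process queue: producers enqueue messages of different types, and a consumer drains them in order. This is purely illustrative of the queuing idea and bears no resemblance to Kafka's actual API:

```javascript
// Toy message queue: producers enqueue, a consumer processes in FIFO order.
class MessageQueue {
  constructor() { this.buffer = []; }
  enqueue(msg) { this.buffer.push(msg); }
  // Drain all pending messages through a processing function, in order.
  drain(process) {
    while (this.buffer.length > 0) {
      process(this.buffer.shift());
    }
  }
}

// Usage: the pipeline sees both chat traffic and state-sync messages.
const queue = new MessageQueue();
const processed = [];
queue.enqueue({ type: 'chat', text: 'hi' });
queue.enqueue({ type: 'presence', user: 'bob', online: true });
queue.drain((msg) => processed.push(msg.type));
console.log(processed); // [ 'chat', 'presence' ]
```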

Managing the digital states and world views across all application service components

The problem of orchestrating between the multiple chat service components brings us to the second problem of maintaining a uniform state across all instances. In this application, the main state information is the online status of every chat user. When there was a single chat service component, it maintained the online status for all users.

Now, with multiple components, all users' online statuses must be synchronized across all chat service components. This is a humongous task to achieve as the chat service component scales horizontally.
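
One common way to reconcile presence state between service instances is a last-write-wins merge on timestamps. This is a deliberately simple sketch of the idea; production systems typically reach for vector clocks or CRDTs, and the data shapes here are made up for illustration:

```javascript
// Each service instance keeps { user: { online, ts } }; merging keeps
// the entry with the newest timestamp (last-write-wins).
function mergePresence(local, remote) {
  const merged = { ...local };
  for (const [user, entry] of Object.entries(remote)) {
    if (!merged[user] || entry.ts > merged[user].ts) {
      merged[user] = entry;
    }
  }
  return merged;
}

// Two regions have seen different, partially conflicting updates.
const usEast = { alice: { online: true, ts: 100 }, bob: { online: false, ts: 50 } };
const euWest = { bob: { online: true, ts: 90 }, carol: { online: true, ts: 60 } };

const view = mergePresence(usEast, euWest);
console.log(view.bob.online); // true (the ts 90 update beats ts 50)
```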

Managing the dynamic allocation of chat service components based on regional volumes

For building a scalable system, it’s important to consider the load on the service components. If too many users are logging in from a given location, a single chat service component may not be able to handle that scale due to the limits of vertical scaling. Moreover, a single service component is also undesirable because it is a single point of failure.

To mitigate these challenges, engineers must implement a load balancer at each location to spread the chat traffic across multiple chat service components. Above all, a set of protocols must be defined to spawn these chat service components dynamically based on traffic surges. Additionally, they should sync up with the existing components and be part of the data synchronization.
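
The allocation logic can be sketched as a per-region pool that round-robins new connections across instances and spawns a fresh service component when every existing one is at capacity. The capacity limit and names are made up for illustration; real systems would track CPU, memory, and connection counts:

```javascript
// Per-region pool: round-robin across instances, spawn when all are full.
const CAPACITY = 2; // made-up per-instance connection limit

class RegionPool {
  constructor() {
    this.instances = []; // each: { id, connections }
    this.next = 0;
  }
  spawn() {
    const inst = { id: this.instances.length, connections: 0 };
    this.instances.push(inst);
    return inst;
  }
  // Pick the next instance with spare capacity, spawning if none has any.
  assign() {
    for (let i = 0; i < this.instances.length; i++) {
      const inst = this.instances[(this.next + i) % this.instances.length];
      if (inst.connections < CAPACITY) {
        this.next = (this.instances.indexOf(inst) + 1) % this.instances.length;
        inst.connections++;
        return inst.id;
      }
    }
    const inst = this.spawn(); // traffic surge: bring up a new component
    inst.connections++;
    return inst.id;
  }
}

// Usage: the third connection exceeds instance 0's capacity, forcing a spawn.
const pool = new RegionPool();
const assigned = [pool.assign(), pool.assign(), pool.assign()];
console.log(pool.instances.length); // 2
```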


When deploying an application for scale across multiple geolocations, it is more about elasticity than scalability. Based on user volume, the application should be able to scale up or down to handle the traffic with the requisite amount of computing resources. That’s where containerization technology provides a lot of flexibility. A container can be deployed as a standalone execution context within a host operating system or server, with its own isolated file system and resources.

Docker is the most popular container technology available today. A Docker container can be pre-built as an image and executed within a host OS as an independent system with its pre-allocated computing and memory resources.

For managing Docker-based elastic application deployment, there is also a need for a container management system that provides an application-wide view of all the containers. Kubernetes is one such tool. Together with Docker, it provides a complete suite for managing and administering containers.
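
As a rough sketch, a Kubernetes Deployment for the chat service might look like the following. The image name, labels, replica count, and resource limits are all placeholders, not values from any real deployment:

```yaml
# Hypothetical Deployment: three replicas of a chat-service container image.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: chat-service
spec:
  replicas: 3            # scale up or down as regional traffic changes
  selector:
    matchLabels:
      app: chat-service
  template:
    metadata:
      labels:
        app: chat-service
    spec:
      containers:
        - name: chat-service
          image: example/chat-service:1.0   # pre-built Docker image
          resources:
            limits:
              cpu: "500m"
              memory: "256Mi"
```

Changing `replicas` (or attaching a HorizontalPodAutoscaler) gives the elasticity described above without touching the application code.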

With the additional complexity around horizontal scalability, there is a need for additional layers of architectural components to manage the service components over and above the application stack.

Security and access control of your modern tech stack

At a global scale, security and access control become paramount. Blocking unauthorized users, denying access to hackers, and mitigating DDoS attacks by rogue users are all necessary. Security is a vast subject in its own right, and for such a large network, a specialized layer of security protocols must reside at the network's outer edge to thwart any possible breach. All of it adds to the complexity of the inner partition and the chat service components.

Developer’s Dilemma of Scaling Your Architecture

If your application has come this far, then, needless to say, it is popular and used by many users worldwide. As the saying goes in software engineering: 80% of the code is written to check and handle errors, while only 20% does the real job. The same logic applies to handling scalability at this level.

Infrastructure-as-a-Service is key to modern tech stacks

Now it’s time to talk about cost – for both setting up and maintaining the platform infrastructure at scale.

If you add up the costs and the developer effort of managing your app's real-time infrastructure, the situation can get out of hand rather quickly. All of this comes at the expense of your web app's user experience, since the developers are busy tuning the infrastructure for real-time performance instead.

That’s where a specialized cloud service provider can be a boon that handles the challenges of real-time scaling and expansion.

PubNub’s real-time data streaming service solves the problems developers and IT teams face in scaling up a real-time application, in terms of both time and cost. With PubNub, your app’s real-time infrastructure is reduced to a single managed connection per client.

Apart from handling all the heavy lifting associated with handling scale, PubNub guarantees 99.999% of data transmission reaches the recipient within a tenth of a second.

The most noteworthy feature of PubNub is that it is not entirely a black-box system. PubNub’s infrastructure is like a semi-porous white box that allows you to plug in realtime message-handling components. Known as Functions, these components let developers build custom real-time data processing pipelines on the fly. Consequently, PubNub can process data in motion rather than at rest, which is a significant upside for real-time data transfer.

Backed by a globally distributed data streaming network, PubNub handles all your chores related to infrastructure, from scaling to security, and more.

And now, what about the protocol stack?

Yes, PubNub now becomes the de facto application layer responsible for transmitting your data packets from one app instance to another. You no longer have to worry about the infrastructure; you only worry about building an enticing UX for your end users, and you simply plug the relevant PubNub SDK into your client code to take care of everything in real time.

Getting started with your modern tech stack

When you use a service like PubNub, which obviates any need to manage the infrastructure, you can focus all your energies on making your application more user-friendly and sticky. The more realtime features you add to the app, the more kudos you receive from your end users.

And the best part is that this split in effort will remain more or less constant. No matter how many users you have, PubNub works silently behind the scenes to deliver your messages in real time, all the time, with no concurrency limits or penalties for growth.

Our experts are standing by to chat about your realtime app projects. Or, if kicking the tires is more your speed, sign up for a free trial, read through our docs, or check out our GitHub.