Michael Lewis’s 2014 book Flash Boys chronicles two entrepreneurs’ quest to lay a fiber-optic cable between Chicago’s futures exchange and the New Jersey data centers of New York’s stock exchanges: 827 miles, in the straightest, most direct way possible. That meant cutting through mountains rather than going around them, and crossing rivers in a straight line rather than following easier routes. Why?
As we discussed in the previous part of this series, Every Millisecond Counts: Why Time is Everything for Businesses, every millisecond matters in the world of high-frequency trading. The goal is to be faster than the rest of the market: the faster the data flows, the higher the likelihood of profiting from a trade. And that cable is where the magic happened, shaving milliseconds off what any other connection could do.
This is the direction in which the world is heading: the more time you lose, the less successful your business is, down to the tenth of a millisecond. And burning valuable time has big implications that directly affect a business.
Loss of Revenue
Whether your business is the provider or consumer of data, your revenue is affected whenever time is wasted in moving information from its source to the algorithms and systems it will drive.
For businesses that deliver data, time is everything. Customers pick them because they can deliver the most accurate, most up-to-date information available. It’s the provider’s job to get that data from point A to point B, and customers place their faith in timeliness and reliability. But it’s not just about delivering the data: it’s the sophistication of the systems that enrich the information while it’s in motion. Leading providers can manipulate, filter, and act on data before it even reaches the end user, enhancing the value of the information without losing time or money processing it in a central location.
For data-driven organizations, utilizing real-time data provides an unquestionable competitive advantage, and creates the opportunity to do innovative things with the data at hand. Organizations can better understand customers, detect what’s successful and what’s not, and even use current activity to confidently predict what’s coming next.
Loss of Customer Engagement Opportunities
Real-time data is massively valuable for customer engagement. It’s often said that to be successful, you need to engage the customer with the right message at the right time. The key to recognizing “the right time” is to tap into what your customers are doing and the actions they are taking, as those actions happen, and to extract insights that drive new and exciting ways to engage.
Take a running app, for example. When a user goes out for a run, the application collects a multitude of powerful insights into the user: geolocation data, what time of day they normally exercise, and even how many miles they’ve put on a certain pair of shoes. Imagine if, at the very moment the user has put 300 miles on a pair of shoes, the manufacturer or distributor could deliver an alert with a compelling offer to purchase a new pair. Real-time data is the key to acquiring valuable insights, and insights are the fuel that drives innovation.
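As a sketch of how such a trigger might work (all names and the 300-mile threshold here are illustrative, not any vendor’s API):

```python
# Hypothetical sketch: a running app accumulates a shoe's mileage after each
# run and emits a replacement offer exactly once when it crosses a threshold.

MILEAGE_THRESHOLD = 300  # miles before a replacement offer is triggered

class Shoe:
    def __init__(self, model):
        self.model = model
        self.miles = 0.0
        self.offer_sent = False

def record_run(shoe, miles, notify):
    """Accumulate mileage and fire the offer exactly once at the threshold."""
    shoe.miles += miles
    if shoe.miles >= MILEAGE_THRESHOLD and not shoe.offer_sent:
        shoe.offer_sent = True
        notify(f"Your {shoe.model} has logged {shoe.miles:.0f} miles - time for a new pair?")

offers = []
shoe = Shoe("Trail Runner v2")
for run in [95, 110, 80, 40]:   # cumulative: 95, 205, 285, 325
    record_run(shoe, run, offers.append)

print(offers)  # a single offer, sent the moment the 300-mile mark is crossed
```

The point is that the offer fires on the event itself, not on a nightly batch job that would discover the milestone hours later.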
Loss of the Value of the Information Itself
We’ve discussed that the rate of information decay is accelerating. Utilizing information before it decays is fundamental to effective business operations, and simply delivering data without extracting insights or taking action on it does not meet this demand.
Newer technologies like serverless compute and event-driven architecture help businesses go beyond simple data-driven decisions, opening up opportunities to extract more value out of each piece of data: whether it’s manipulating the data to work with other systems, connecting the data stream to 3rd party computation offerings, or automatically triggering valuable tasks in other systems.
What is the best way to utilize the small window of time you have?
Moving from request/response architecture to real-time data streaming systems has fundamentally shifted delivery speeds, and altered the value and applicability of data. There’s been an explosion of protocols (long polling, MQTT, WebSockets), frameworks (Socket.IO), and infrastructure providers (PubNub) that enable software designers to open a network connection through which data can be streamed bidirectionally, quickly and reliably, between any number of connected devices or systems.
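To make the streaming model concrete, here is a minimal in-process sketch in Python, where an asyncio queue stands in for the open channel a WebSocket or MQTT connection would provide; no real network protocol is used here:

```python
# An in-process stand-in for a streamed connection. The producer pushes each
# tick the moment it exists; the consumer reacts immediately, with no
# request/response round trip or polling loop.

import asyncio

async def producer(channel):
    for price in [101.2, 101.4, 101.1]:
        await channel.put(price)          # push each tick as it is produced
    await channel.put(None)               # sentinel: stream closed

async def consumer(channel, received):
    while (tick := await channel.get()) is not None:
        received.append(tick)             # handle each tick on arrival

async def main():
    channel = asyncio.Queue()
    received = []
    await asyncio.gather(producer(channel), consumer(channel, received))
    return received

ticks = asyncio.run(main())
print(ticks)  # [101.2, 101.4, 101.1]
```

In a real deployment the queue would be replaced by a persistent, bidirectional connection from one of the protocols or providers named above; the consumption pattern stays the same.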
This real-time innovation has enabled, and thus ushered in, new types of technologies and design patterns to take advantage of the ever-shortening window.
Functions and Serverless Compute
Functions let you execute business logic on your data in motion. You’ve probably come across Functions-as-a-Service (FaaS) offerings, where a vendor provides a scalable ‘virtual server’ model for running your code. Whether you build your own serverless functions from scratch or go with a serverless compute vendor like AWS Lambda or PubNub Functions, the concept of executing code without provisioning any servers to perform the computation has been a game changer.
Serverless compute takes real-time data streaming to the next level, and enables systems to do more in a shorter amount of time with the data. Beyond simply sending and receiving data, or continuously pinging additional servers or external services, functions bring the business logic directly into your data streams.
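As a hedged sketch of what “business logic in the stream” looks like, the handler below mirrors the common (event, context) FaaS convention; the signature and field names are illustrative, not any particular vendor’s API:

```python
# Illustrative FaaS-style handler: the platform invokes one function per
# event, with no server for you to manage. This one enriches a chat message
# in motion by trimming whitespace and attaching a length field.

def handler(event, context=None):
    """Enrich a chat message before it reaches its destination."""
    message = event["message"].strip()
    return {
        "message": message,
        "length": len(message),
        "channel": event.get("channel", "default"),
    }

# The platform would call this per published message; invoking it directly:
result = handler({"message": "  Hello, world  ", "channel": "chat"})
print(result)
```

Because the function is stateless and takes the event as its only input, the vendor can scale it horizontally without any coordination from you.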
There are broad use cases for this:
- Manipulate the data while it’s in motion: filtering, augmenting, translating, enriching, and transforming data before it reaches its destination. This could be as simple as translating a chat message from English to Spanish, or as robust as ingesting a massive firehose of social media data, analyzing messages for their sentiment, and categorizing each message based on the perceived emotion.
- Connect 3rd party services: loosely coupled by nature, serverless functions allow teams to integrate 3rd party services and execute business logic through them without having to spin up additional infrastructure. Powerful 3rd party services, like the cognitive/machine learning services from Amazon, Microsoft, and IBM Watson, can be invoked from within the functions, delivering the capabilities of these services in your app with a lightweight API request.
- Big data ingestion and analysis: for organizations that create massive amounts of data, like an industrial IoT company with thousands of sensors in the field, serverless functions help make sense of the data and deliver rapid insights and analysis.
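The use cases above can be sketched in miniature: the pipeline below filters out empty messages and enriches the rest with a naive keyword-based sentiment tag, a deliberately trivial stand-in for a call to a real ML service:

```python
# Toy in-stream pipeline: filter, then enrich. The keyword heuristic stands
# in for an external sentiment-analysis service.

POSITIVE = {"love", "great", "happy"}
NEGATIVE = {"hate", "awful", "angry"}

def not_empty(msg):
    return bool(msg["text"].strip())

def tag_sentiment(msg):
    words = set(msg["text"].lower().split())
    if words & POSITIVE:
        msg["sentiment"] = "positive"
    elif words & NEGATIVE:
        msg["sentiment"] = "negative"
    else:
        msg["sentiment"] = "neutral"
    return msg

def process(stream):
    # Each message is filtered and enriched while "in motion",
    # before it ever reaches its destination.
    return [tag_sentiment(m) for m in stream if not_empty(m)]

stream = [
    {"text": "I love this app"},
    {"text": "   "},                  # dropped by the filter step
    {"text": "this update is awful"},
]
results = process(stream)
print(results)
```

Swapping the heuristic for a real cognitive service is a one-function change, which is exactly the loose coupling the bullet above describes.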
Serverless compute is massively beneficial for allowing apps and organizations to deliver more value in a fraction of the time. On top of the value delivered, serverless functions reduce costs and provide a more loosely-coupled architecture, making your app more flexible and opening the door to more innovation.
Edge Computing
With the explosion of data stream-based applications (applications and experiences relying on huge numbers of data points being sent and received by devices) and end users’ expectation of real-time experiences, edge computing has grown in both adoption and importance.
To deliver these types of applications, and the experiences end users demand, traditional request/response can’t keep up. With devices and end users constantly emitting and consuming large streams of data, and with serious security considerations in play, edge computing is built to address these challenges.
In a nutshell, edge computing is compute taking place as close to the data source as possible. When compared to cloud computing, where all data passes through and business logic is executed at the data center level, edge computing provides a faster, more efficient way to act.
Relying on cloud computing alone to process data is simply too slow. Latency aside, the cloud cannot keep up with the massive data volume and velocity generated by data-intensive deployments, like large-scale chat or IoT.
Gathering, processing, and storing data at edge devices greatly alleviates the network of potential bandwidth bottlenecks. Not every bit of data needs to be sent to the cloud, and the cloud doesn’t need to be consulted for every minor decision.
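A minimal sketch of this edge-side triage, assuming an illustrative temperature range for “routine” readings (the bounds and names are made up for the example):

```python
# Edge-side filtering: a sensor gateway keeps routine readings local and
# forwards only out-of-range readings to the cloud, cutting upstream traffic.

NORMAL_RANGE = (10.0, 80.0)  # illustrative bounds for a "routine" temperature

def triage(readings):
    """Split readings into locally-handled values and cloud-forwarded anomalies."""
    local, to_cloud = [], []
    for r in readings:
        if NORMAL_RANGE[0] <= r <= NORMAL_RANGE[1]:
            local.append(r)       # processed and stored at the edge
        else:
            to_cloud.append(r)    # only anomalies consume upstream bandwidth
    return local, to_cloud

local, to_cloud = triage([22.5, 95.1, 45.0, 3.2, 60.7])
print(len(local), to_cloud)  # most traffic never leaves the edge
```

Here three of five readings stay at the edge; only the two anomalies travel upstream, which is the bandwidth relief the paragraph above describes.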
Obviously, cloud computing shouldn’t be done away with. Cloud computing and the traditional ‘data warehouse’ model are still needed in situations requiring heavy computing and storage resources, such as big data analytics on historical data. However, moving data processing and serverless function execution to the edge greatly increases the speed at which organizations can deliver insights and experiences to end users.
Event-Driven Architecture
Both edge computing and serverless functions are part of a larger movement in how we architect software: event-driven architecture. The backbone of event-driven architecture is loose coupling, which means the app is highly distributed with fewer dependencies, making it incredibly flexible and dynamic.
Event-driven applications are built for responsiveness, creating and responding to internal and external events in near real time. This allows the app to proactively respond to business activities, delivering immediate information dissemination and business logic execution.
Because it’s loosely coupled by design, event-driven architecture is optimal for integrating with 3rd party services and building apps composed of powerful, easy-to-manage microservices. With event-driven architecture, it’s easier to innovate with new features and capabilities while maintaining speed and accuracy in data processing and business logic execution.
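A toy in-process event bus illustrates the loose coupling described above: publishers and subscribers share only topic names, never direct references to each other. This is a teaching sketch, not a production message broker:

```python
# Minimal publish/subscribe event bus. Adding a new subscriber (a new
# microservice, a 3rd party integration) requires no change to the publisher.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Every subscriber on the topic reacts; the publisher knows none of them.
        for callback in self._subscribers[topic]:
            callback(event)

bus = EventBus()
audit_log = []
bus.subscribe("order.created", lambda e: audit_log.append(("audit", e["id"])))
bus.subscribe("order.created", lambda e: audit_log.append(("email", e["id"])))

bus.publish("order.created", {"id": 42})
print(audit_log)  # both services reacted to one event
```

Adding a third reaction, say a fraud check, is one more `subscribe` call, which is why event-driven systems make it easy to bolt on new features without touching existing code.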
Time is of the Essence
Apps are getting faster, businesses are getting smarter, and organizations continue to innovate. The common thread among them is how they manage time and data.
From how we communicate, to how we buy goods, to how we better understand the world around us, delivering data and acting on it will continue to drive the capabilities of the apps and businesses we interact with.
So the question is, what will you do to deliver the most relevant, enriched data as quickly as possible?
Want in-depth analysis? Check out our full eBook Every Millisecond Counts: Why Time is Everything for Businesses.