Event-driven architecture (EDA) and the adoption of event streaming throughout enterprises are increasingly becoming essential architectural requirements for businesses across almost every industry. However, the more companies use event streaming, the greater the complexity becomes. Event portals can play a pivotal role not only in gaining comprehensive insight into an intricate and expanding event-streaming ecosystem, but also in facilitating the auditing, management and governance of events.
According to a recent IDC Infobrief, 93% of companies employing EDA across various applications reported that EDA either met or exceeded their expectations. Moreover, 82% of IT leaders intend to apply EDA to two to three new use cases within the next 24 months.
Nonetheless, the growing number of event streaming use cases drives a surge in data volume. This, in turn, leads organizations to adopt more brokers, projects and products for event streaming. For instance, an organization might use open source Kafka for one use case while turning to managed platforms like Confluent for another and Amazon MSK for yet another distinct purpose.
While real-time processing of data is certainly a positive development, many experienced EDA adopters have yet to maximize its potential. Often a stream is consumed exactly once, in a one-to-one exchange; or organizations that have matured through multiple streaming use cases find themselves entangled in a complex web of brokers, clusters, topics and overlapping schemas.
Let’s explore three typical challenges that organizations encounter as they progress toward event-driven maturity.
Success Needs Visibility – Make Sure Your Event Portal Can Provide It
Application decoupling is great at runtime, but the resulting lack of visibility on both the publish and subscribe sides can cause issues when making changes to existing applications. Producers don't know who their consumers are, and consumers don't know who the publishers are.
An event portal offers a single window into an event-streaming ecosystem. It provides a native discovery agent to scan, for instance, a Kafka cluster and its schema registry to produce a visual representation of every topic and its schema(s) across multiple versions – and, importantly, who its consumers are.
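To make the idea concrete, here is a minimal sketch of the kind of index a discovery scan might produce. The `TopicRecord` shape and the sample topics are hypothetical; a real portal's agent would populate them from the broker's admin API and schema registry rather than hard-coded data.

```python
from dataclasses import dataclass

# Hypothetical shape for what a discovery scan collects per topic.
@dataclass
class TopicRecord:
    name: str
    schema_versions: list  # schema names, oldest first
    consumer_groups: list  # groups currently subscribed

def build_topic_map(records):
    """Index scan results so each topic shows its schemas and consumers."""
    return {
        r.name: {"schemas": r.schema_versions, "consumers": r.consumer_groups}
        for r in records
    }

# Sample data standing in for a live cluster scan.
scan = [
    TopicRecord("orders.created", ["orders-v1", "orders-v2"],
                ["billing", "shipping"]),
    TopicRecord("inventory.updated", ["inventory-v1"], ["warehouse"]),
]
topic_map = build_topic_map(scan)
print(topic_map["orders.created"]["consumers"])  # → ['billing', 'shipping']
```

The value is in the join: topic, schema history and consumers in one place, so a producer can answer "who breaks if I change this?" before changing it.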
If an organization were to decompose its monolithic applications into event-driven microservices that communicate with each other via, say, Apache Kafka, one of the biggest challenges would be understanding and managing the infrastructure and information flows – specifically, seeing which microservices are affected by a given change before deploying a new feature or function, to ensure it won't bring the system down, even for as few as five seconds.
By using an event portal, organizations can automatically scan their system and visualize a complete map of endpoints and event streams instead of manually diagraming event streams between microservices.
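That impact analysis is essentially a graph walk over the discovered pub/sub relationships. The sketch below illustrates the principle with a hand-written map; the service and topic names are invented, and an actual portal would build this graph automatically from its broker scans.

```python
from collections import deque

# Illustrative pub/sub relationships (app -> topics). A real event portal
# would discover these by scanning brokers, not from hard-coded dicts.
publishes = {
    "order-service": ["orders.created"],
    "billing-service": ["invoices.issued"],
}
subscribes = {
    "billing-service": ["orders.created"],
    "notification-service": ["invoices.issued", "orders.created"],
}

def impacted_apps(changed_topic):
    """Breadth-first walk: find everything downstream of a changed topic."""
    impacted, frontier = set(), deque([changed_topic])
    while frontier:
        topic = frontier.popleft()
        for app, topics in subscribes.items():
            if topic in topics and app not in impacted:
                impacted.add(app)
                # a change may ripple through topics this app publishes
                frontier.extend(publishes.get(app, []))
    return impacted

print(sorted(impacted_apps("orders.created")))
# → ['billing-service', 'notification-service']
```

Note that `notification-service` would be flagged even if it only consumed `invoices.issued`: the transitive walk is what manual diagrams tend to miss.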
Using Data at Its Best – in Real Time
Let’s extrapolate the poor visibility issue from one application across the entire enterprise.
For instance, if you’ve been using Kafka for several years, especially across many departmental or application-specific use cases, your Kafka cluster is now a treasure trove of real-time data. But data is most valuable at the moment it’s produced. As Forrester puts it: “Data is, without a doubt, valuable. But when stored in vaults and locked down, it is not.” Real-time data is the most valuable data that exists. Yet siloed event streaming data means other departments, decision-makers, customers and partners don’t know about it – so it never gets shared or reused to its full potential.
Typically, developers have nowhere to go to find this data treasure trove. Some have resorted to documenting that this data exists in Confluence or wiki pages, SharePoint sites or Word documents. Noble intentions, but without automated discovery and real-time updates, this information quickly goes stale.
Again, this is where an event portal makes a difference, providing a perpetually up-to-date catalog of data detailing all topics, event streams, schemas and pub/sub interfaces for each application, along with owners and points of contact, as well as changes for each of the managed EDA entities. This helps expedite development by letting developers easily share, discover and re-use any existing Kafka or event streaming asset, both inside and outside the organization.
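As a rough sketch of what such a catalog enables, consider a simple keyword search over catalog entries. The entries, owners and contacts below are made up for illustration; the point is that a portal keeps them current via continuous scanning rather than hand-edited wiki pages.

```python
# Illustrative catalog entries; a real event portal refreshes these
# automatically from broker and schema-registry scans.
catalog = [
    {"topic": "orders.created", "schema": "orders-v2",
     "owner": "commerce-team", "contact": "commerce@example.com"},
    {"topic": "shipments.dispatched", "schema": "shipping-v1",
     "owner": "logistics-team", "contact": "logistics@example.com"},
]

def find_streams(keyword):
    """Let a developer discover existing streams instead of rebuilding them."""
    kw = keyword.lower()
    return [e for e in catalog if kw in e["topic"] or kw in e["owner"]]

print(find_streams("orders")[0]["owner"])  # → commerce-team
```

Even this toy version shows the payoff: a developer searching "orders" finds an existing stream, its current schema and a point of contact, instead of spinning up a duplicate topic.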
For a real-world example of this in action, look at the Federal Aviation Administration and its SWIM (system-wide information management) infrastructure that distributes real-time information to FAA systems across the United States.
Through secure gateways, external partners of the FAA can tap into the flow of events. Airlines and other industry partners that need the SWIM data get it in real-time, without perpetually requesting updates. Whether it’s the availability of gates at an airport, the position data of planes in the sky, or the weather for a region – external partners have the latest information without needing to ask.
Don’t Forget to Govern Event Streams!
The decentralized and dynamic nature of event-driven systems introduces unique security challenges. A common trade-off with event streaming is that while brokers do include access control rules, developers can err on the side of being too permissive to preserve agility. If streams are not visible, properly cataloged and regularly updated, data security, governance and compliance all suffer.
Using an event portal designed to give organizations visibility and control over their event streaming can turn security from reactive to proactive. Users can organize systems into application domains, create and import payload schema definitions in a variety of formats, including AsyncAPI, define event interactions between applications and microservices, and create events and associated topic addresses using proven topic-structure practices. With the right visibility, administrators can ensure security, governance and compliance with internal policies and government regulations.
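Governance of topic addresses can be as simple as validating every new topic against a published naming convention. The convention below (`<domain>.<entity>.<event>.v<version>`) is purely an example, not a standard; the point is that a portal can enforce one rule centrally instead of relying on each team's discretion.

```python
import re

# Hypothetical naming convention: <domain>.<entity>.<event>.v<version>
# Conventions vary by organization; a portal enforces whichever one you pick.
TOPIC_PATTERN = re.compile(r"^[a-z]+\.[a-z]+(?:-[a-z]+)*\.[a-z]+\.v\d+$")

def validate_topic(name):
    """Accept only topic addresses that follow the agreed structure."""
    return bool(TOPIC_PATTERN.match(name))

print(validate_topic("sales.order.created.v1"))  # → True
print(validate_topic("MiscStuff_Topic"))         # → False
```

Rejecting nonconforming names at design time is far cheaper than untangling ad hoc topics once consumers depend on them.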
Event Portals to the Rescue!
A single, multi-broker event portal gives organizations a way to discover, govern and manage the life cycle of their real-time event streams across the enterprise. Its significance will only grow as more organizations adopt EDA as a fundamental platform.