IBM has completed the previously announced $11bn acquisition of data streaming platform Confluent, in a move to strengthen real-time data streaming for enterprise AI.
Under the terms of the deal announced in December 2025, IBM purchased all outstanding common shares of Confluent listed on Nasdaq for $31 per share in cash.
Confluent’s data streaming platform, which is built on Apache Kafka, is said to be used by more than 6,500 enterprises worldwide, including 40% of Fortune 500 companies.
Through this acquisition, IBM aims to integrate Confluent’s real-time data streaming capabilities into its broader software portfolio, particularly targeting applications in AI and automation across hybrid and on-premises environments.
IBM intends to address a significant challenge in enterprise AI deployments by improving the availability and governance of timely operational data.
The tech major notes that in many organisations, operational data remains fragmented and delayed across multiple business systems, creating obstacles for AI models that depend on current information at scale.
Confluent’s technology enables continuous processing, connection and governance of event data as it is generated. According to IBM, this approach has already been adopted by customers across sectors such as financial services, healthcare, manufacturing and retail.
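To illustrate the continuous-processing pattern described above, the sketch below processes events one at a time as they arrive and applies a simple governance policy in the stream. It is plain Python with hypothetical names, not Confluent's or Kafka's actual API; a real deployment would consume from Kafka topics instead.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Iterator

# Hypothetical event record; the field names are illustrative, not a Confluent schema.
@dataclass
class Event:
    source: str
    payload: dict
    emitted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def payment_events() -> Iterator[Event]:
    """Stand-in for a live stream; a real system would read from a Kafka topic."""
    yield Event("payments", {"amount": 120.50, "currency": "USD"})
    yield Event("payments", {"amount": -5.00, "currency": "USD"})

def process(stream: Iterator[Event]) -> list[dict]:
    """Handle each event as it is generated, applying a governance check inline."""
    accepted = []
    for event in stream:
        # Example policy: drop malformed events (non-positive amounts) in-stream.
        if event.payload["amount"] <= 0:
            continue
        accepted.append({"source": event.source, **event.payload})
    return accepted

print(process(payment_events()))
```

The point of the pattern is that validation and governance happen while data is in motion, rather than in a later batch job over data at rest.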
Confluent CEO and co-founder Jay Kreps said: “Since our founding, Confluent’s mission has been to set the world’s data in motion, making data streaming as foundational to the enterprise as the database. Joining IBM allows us to accelerate that mission at a much greater scale.
“IBM’s global reach and deep enterprise relationships will help us go further, faster. As enterprises move from experimenting with AI to running their business on it, helping data flow continuously across the business has never mattered more.”
The integration plans include connecting Confluent’s event streams directly with IBM’s watsonx.data suite to allow enterprise events to enter AI and analytics workflows in real time while maintaining lineage tracking, policy enforcement and quality controls required by enterprises.
The combination also extends to IBM Z mainframes, which serve as infrastructure for high-volume transactional processing. This enables organisations to stream transactional data into analytics or AI workflows without disrupting core operations or legacy systems.
IBM Software senior vice president and chief commercial officer Rob Thomas said: “Transactions happen in milliseconds, and AI decisions need to happen just as fast. With Confluent, we are giving clients the ability to move trusted data continuously across their entire operation so their AI models and agents can act on what is happening right now, not on data that is hours old.
“Together, IBM and Confluent give enterprises the foundation for a new operating model – one where AI runs on live data, drives decisions in real time, and delivers value at scale.”
For companies relying on transactional systems, such as those managing payments or reservations, these integrations allow them to adopt real-time, event-driven architectures while preserving the reliability required for mission-critical operations.
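As a rough illustration of that event-driven decoupling, the sketch below uses a minimal in-memory event bus: the transactional publisher emits events without knowing which downstream consumers exist, so analytics can subscribe without touching the core code path. The bus and handler names are invented for illustration; a production system would use Kafka topics and consumer groups instead.

```python
from collections import defaultdict
from typing import Callable

# Minimal in-memory event bus; stands in for a durable streaming platform.
class EventBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # The publisher (e.g. a payment system) emits and moves on; downstream
        # consumers are invoked independently of the transactional code path.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
fraud_alerts: list[dict] = []

# A downstream fraud-monitoring consumer subscribes without any change to the
# publisher. The 10,000 threshold is an arbitrary example policy.
def flag_large_payment(event: dict) -> None:
    if event["amount"] > 10_000:
        fraud_alerts.append(event)

bus.subscribe("payments", flag_large_payment)

bus.publish("payments", {"id": "tx-1", "amount": 250})
bus.publish("payments", {"id": "tx-2", "amount": 25_000})
print(len(fraud_alerts))  # → 1
```

Adding or removing consumers never requires modifying the payment system itself, which is what keeps the mission-critical path stable.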
IBM is also incorporating messaging middleware, including MQ and webMethods Hybrid Integration, into the architecture. With Confluent’s event streaming, these tools enable enterprises to automate event-driven responses across on-premises and cloud environments, allowing applications, APIs and AI agents to process business events in real time while maintaining data governance.
IBM Consulting, along with partners, will assist clients in building infrastructure that supports governed, real-time data flows between systems.
The combined technologies are expected to enable deployment of both batch-based and real-time pipelines across hybrid environments using no-code agents for easier integration and orchestration. Streaming context features are designed so teams can discover, understand and govern analytics or AI-related data.
For users of IBM Z specifically, products such as Kafka SDK for IBM Z, IBM Z Digital Integration Hub and IBM Data Gate provide multiple ways to emit application- or data-driven events into Kafka streams securely. This architecture supports the secure propagation of transactional changes downstream while allowing reuse of event streams across different analytics or application platforms.
According to IBM, these integrations make it possible to capture operational events from legacy transaction systems alongside digital channel activity in a unified stream serving analytics platforms, AI models or automation tools in near real time. The system is structured to ensure reliability, security and compliance standards for large-scale enterprise use cases.
IBM adds that, with the acquisition complete, its smart data platform will connect streaming data with management tools from both portfolios. The company notes this combined offering is designed to help organisations move from isolated event pipelines towards governed, real-time data products suitable for analytics platforms, operational applications or AI agents at scale.