Building Mainframe Data Pipelines to Feed Enterprise-wide Applications

Thursday, December 5
11:00 AM EST / 8:00 AM PST

Data-driven organizations need low-friction, highly available and fast access to mainframe data as input for distributed applications, microservices and other business processes. Yet, access to this data has historically meant overcoming multiple obstacles, leading to project delays, cost overruns and compromises in speed, security and flexibility.

Confluent and Luminex will discuss how to use Apache Kafka and Mainframe Data Integration (MDI) to build efficient, highly available and agile mainframe data pipelines on an event streaming platform, making mainframe data available to any number of applications without the struggles or compromises of the past. Hitachi Vantara will demonstrate how Pentaho enables enterprises to leverage these secure, high-speed event streams for better operational insights and improved business operations.
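As a rough illustration of the consumer side of such a pipeline, the sketch below shows a minimal Apache Kafka consumer in Java that a distributed application might use to read mainframe-originated events. The broker address, topic name (mainframe.records) and consumer group are hypothetical placeholders for illustration only; they are not specific to MDI or to anything covered in the webinar.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MainframeEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and consumer group are placeholders, not real endpoints.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "mainframe-pipeline-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribe to a hypothetical topic fed by the mainframe integration layer.
            consumer.subscribe(List.of("mainframe.records"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Each record is one mainframe event; a downstream service would
                    // process it here instead of printing it.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}

Because Kafka retains events on the topic, any number of consumer groups can read the same mainframe stream independently, which is what allows one pipeline to feed many applications at once.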

Data Analysts, Enterprise Architects, Application Developers and anyone else who provides or relies on access to mainframe data can benefit from this modern approach to mainframe data integration.