Every business on the planet is now in a position to generate and collect absurd amounts of data. There are dozens of great stats about the explosion of digital information. The one I like, from analyst firm IDC's Digital Universe study, claims that the digital data produced will exceed 44 zettabytes within the next five years. That is equivalent to about 57 times the number of grains of sand on all the beaches in the world, or about 5,200 gigabytes of data for every single human being alive today.
Hidden in this never-ending tidal wave of trillions of bytes of information are the ideas, insights, answers and innovations that will redefine markets, drive profits and change business models. The trick, of course, is being able to capture, sort and use that information in real time and at scale. This is one of the biggest technological challenges of our generation.
A Central Nervous System for Digital Information
Increasingly, companies need to react to data in real time, which means that data must be used in a way that is more like our own central nervous system. Businesses must be able to react to everything happening in and out of the business by getting data to all the different systems that need to use it, and all of this must be done in the most automated and self-service way possible. This is one of the big, defining technological challenges we have been working on at Teradata. And I'm pleased to tell you that we have developed an elegant solution with the foundations of agility and self-service at its core.
At our Teradata Partners conference, we are announcing Teradata Listener, an intelligent, self-service solution for ingesting and distributing extremely fast-moving data streams throughout the analytical ecosystem. Listener is built on open source code and modern software engineering, including RESTful APIs and technologies such as Apache Kafka, Apache Cassandra, Elasticsearch and Apache Mesos.
It intelligently and easily ingests data from distributed data streams and writes it to multiple data management systems, such as Hadoop or an enterprise data warehouse, acting as a buffer and self-service environment between data sources and target systems. Listener continuously captures and stores fast-moving, extremely large data sets from hundreds of sources: Web clickstreams, social media feeds, mobile events, and Internet of Things sources such as IT server logs, sensors and telematics.
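To make the producer side of this concrete, here is a minimal sketch of how a data source might push a single clickstream event into a Listener-style REST ingest service. The endpoint URL, source key and `token` authorization header are hypothetical placeholders for illustration, not the documented Teradata Listener API.

```python
import json
import urllib.request

# Hypothetical endpoint and credential -- illustrative only,
# not the actual Teradata Listener API.
LISTENER_URL = "https://listener.example.com/v1/messages"
SOURCE_SECRET = "example-source-key"

def build_ingest_request(event: dict) -> urllib.request.Request:
    """Package one event as a JSON POST, the way a producer might push
    a clickstream or sensor reading into a Listener-style buffer."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        LISTENER_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"token {SOURCE_SECRET}",  # assumed auth scheme
        },
        method="POST",
    )

req = build_ingest_request({"user": "u123", "page": "/checkout", "ts": 1446300000})
# urllib.request.urlopen(req)  # would actually transmit in a real deployment
```

The point of the sketch is the shape of the interaction: a source only needs to serialize its event and POST it; everything downstream (buffering, routing, target writes) is Listener's job.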
A Pipeline for Streaming Big Data
While streaming data is not new, most organizations' approach to tapping its value in the early era of big data has centered on one-off point solutions: in-house systems that often struggle to scale beyond their original use cases, or commercial products that limit developer flexibility. Listener minimizes the labor needed to capture real-time data. Programmers can easily connect data sources to Listener and route the data to the right target systems. In many cases, they can then implement ETL and analytics in our Unified Data Architecture repositories.
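The buffer-and-fan-out pattern described above can be illustrated with a toy dispatcher: sources push events into a queue, and the dispatcher drains it and writes each event to every registered target. All names here are hypothetical stand-ins; the real Listener does this at scale on Kafka rather than an in-process queue.

```python
from queue import Queue

class StreamDispatcher:
    """Toy illustration of a buffer between data sources and targets:
    ingest() decouples producers from consumers, and drain() fans each
    buffered event out to every registered target writer."""

    def __init__(self):
        self.buffer = Queue()
        self.targets = {}  # target name -> writer callable

    def register_target(self, name, writer):
        self.targets[name] = writer

    def ingest(self, event):
        # Sources never talk to targets directly; they only enqueue.
        self.buffer.put(event)

    def drain(self):
        # Deliver every buffered event to all targets (e.g. a data
        # lake and a warehouse receive the same stream).
        while not self.buffer.empty():
            event = self.buffer.get()
            for writer in self.targets.values():
                writer(event)

# Stand-ins for a Hadoop data lake and an enterprise data warehouse.
lake, warehouse = [], []
dispatcher = StreamDispatcher()
dispatcher.register_target("data_lake", lake.append)
dispatcher.register_target("warehouse", warehouse.append)
dispatcher.ingest({"sensor": "s1", "temp": 21.5})
dispatcher.drain()
```

After `drain()`, both targets hold the same event, which is the essential property: one connection from the source, many downstream consumers.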
I often talk about the rise of the Sentient Enterprise, a model of innovation that prioritizes technologies that transform businesses into agile, automated and self-service consumers of digital information. In fact, those are the driving principles of everything we build at Teradata. Listener is more than simply the latest example of that work. It is an elegant answer to the question, "How do I get a handle on the unending deluge of digital information streaming into my business?"
The constant fire hose of digital information is only getting bigger. Listener lets businesses get their arms around the problem. It serves as the central nervous system for data, connecting a network of current and emerging data sources with the systems and applications set up for data analysis.
Mr. Ratzesberger has a proven track record in executive management, as well as 20+ years of experience in analytics, large data processing and software engineering.
Oliver’s journey with Teradata started as a customer, driving innovation on its scalable technology base. His vision of how the technology could be applied to solve complex business problems led him to join the company. At Teradata, he has been the architect of the strategy and roadmap aimed at transforming the business. Under Oliver’s leadership, the company has challenged itself to become a cloud-enabled, subscription-based business with a new flagship product. Teradata’s integrated analytical platform is the fastest-growing product in its history, achieving record adoption.
During Oliver’s tenure at Teradata he has held the roles of Chief Operating Officer and Chief Product Officer, overseeing various business units, including go-to-market, product, services and marketing. Prior to Teradata, Oliver worked for both Fortune 500 and early-stage companies, holding positions of increasing responsibility in technology and software development, including leading the expansion of analytics during the early days of eBay.
A pragmatic visionary, Oliver frequently speaks and writes about leveraging data and analytics to improve business outcomes. His book with co-author Professor Mohanbir Sawhney, “The Sentient Enterprise: The Evolution of Decision Making,” was published in 2017 and was named to the Wall Street Journal Best Seller List. Oliver’s vision of the Sentient Enterprise is recognized by customers, analysts and partners as a leading model for bringing agility and analytic power to enterprises operating in a digital world.
Oliver is a graduate of Harvard Business School’s Advanced Management Program and earned his engineering degree in Electronics and Telecommunications from HTL Steyr in Austria.
He lives in San Diego with his wife and two daughters.