Six Trends Driving Adoption of Pentaho platform Suite

Matt Howard

February 03, 2021


Innovative organizations need DataOps and new technologies because old-school data integration is no longer sufficient. The traditional approach creates monolithic, set-in-concrete data pipelines that can’t convert data into insights quickly enough to keep pace with the business. The following trends are driving the adoption of Hitachi’s Pentaho platform Suite:

1. Hybrid cloud and multicloud are the future.

Businesses in every industry are moving to the cloud, looking to preserve cash, optimize information technology (IT) costs, and support and secure users, who are often working remotely. As they consider the best ways to ensure agility and resiliency, they are increasingly exploring multicloud and hybrid cloud capabilities, including edge competencies. According to IDC, by 2022, over 90% of enterprises worldwide will be relying on a mix of on-premises/dedicated private clouds, multiple public clouds, and legacy platforms to meet their infrastructure needs.[1]

2. Modern architectures improve scalability for big data.

Organizations are also rethinking how they work with big data, moving away from monolithic Hadoop-based data lakes toward distributed data lakes built on object storage. Today, one data fabric serves many engines and consumers. Under this approach, compute and storage remain separate, and data is accessible via a range of technologies, making access for end users secure, easier and less expensive.
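
To picture what this separation looks like in practice, here is a minimal sketch (not the Pentaho implementation) in which a lightweight query engine reads Parquet files directly from object storage. The bucket path and column names are hypothetical, and S3 credentials are assumed to be configured in the environment:

```python
# A minimal sketch of separated compute and storage: DuckDB acts as the
# compute engine and scans Parquet files that live in object storage, so
# the storage layer can grow or move independently of the query layer.
# The s3:// path and column names are hypothetical placeholders, and
# S3 credentials are assumed to be available in the environment.
import duckdb

con = duckdb.connect()
con.execute("INSTALL httpfs;")  # DuckDB extension that enables s3:// access
con.execute("LOAD httpfs;")

rows = con.execute(
    """
    SELECT device_type, COUNT(*) AS readings
    FROM read_parquet('s3://example-lake/sensors/*.parquet')
    GROUP BY device_type
    ORDER BY readings DESC
    """
).fetchall()

for device_type, readings in rows:
    print(device_type, readings)
```

Because nothing in the query assumes where the files physically live, the same compute engine could point at on-premises object storage or a public cloud bucket without changes.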

3. New insights can take you from “what” to “how.”

Today’s organizations work with a soaring breadth of data, including structured, unstructured and semistructured data generated by humans, machines and applications. The challenge is figuring out what data you have and then how to piece all this diverse data together so that business users can generate actionable insights and make proactive decisions more quickly.
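
As a small illustration of the piecing-together problem, the sketch below flattens semistructured, machine-generated JSON events into a table and joins them with structured reference data. Every field name here is invented for the example:

```python
# A minimal sketch of combining semistructured and structured data into
# one joined view for analysis. All field names are invented.
import pandas as pd

# Semistructured, machine-generated events (nested JSON)
events = [
    {"device": {"id": "d1", "site": "plant-a"}, "reading": 21.5},
    {"device": {"id": "d2", "site": "plant-b"}, "reading": 19.8},
]
events_df = pd.json_normalize(events)  # flattens to device.id, device.site

# Structured reference data (e.g., from a relational table)
sites_df = pd.DataFrame(
    {"site": ["plant-a", "plant-b"], "region": ["EMEA", "APAC"]}
)

joined = events_df.merge(sites_df, left_on="device.site", right_on="site")
print(joined[["device.id", "region", "reading"]])
```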

4. The need for IoT edge intelligence is driving hybrid solutions.

Trends such as Industry 4.0, smart cities, healthcare use cases and many other IoT applications drive a need for higher intelligence at the edge; edge intelligence is itself a form of hybrid computing. According to IDC, by 2024, 25% of organizations will improve business agility by integrating edge data with applications built on cloud platforms, enabled by partnerships across cloud and communications service providers.[2]

5. AI and ML are rapidly adding value.

Today’s innovators are doing all they can to capture value from machine learning (ML), moving from asking “what can I learn” to “how can I get it done.” Artificial intelligence (AI) and ML are moving out of the lab and into production to support this effort, spurring a sea change. Most current tools do not meet the challenges of moving AI and ML to production, because they focus on the development life cycle rather than the production life cycle.

6. Risk and compliance are getting more complicated.

Risk and compliance around data remain an ever-present concern, and privacy regulations like the General Data Protection Regulation (GDPR) are even more challenging in today’s world of data sprawl. Manual techniques cannot keep up with the scale, time-to-production and accuracy requirements, so organizations need AI-driven automation to close the gap.
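
At its simplest, that automation looks something like the sketch below, which flags table columns whose values resemble email addresses. This is a rule-based stand-in, not how any particular product works; real AI-driven tools use trained classifiers and far richer patterns, and the sample data is invented:

```python
# A minimal, rule-based sketch of automated sensitive-data detection:
# flag columns where most values look like email addresses. Real
# AI-driven tools use trained classifiers, not a single regex.
import pandas as pd

EMAIL_PATTERN = r"[^@\s]+@[^@\s]+\.[^@\s]+"

def flag_email_columns(df: pd.DataFrame, threshold: float = 0.8) -> list:
    """Return columns where most non-null values match an email pattern."""
    flagged = []
    for col in df.select_dtypes(include="object").columns:
        values = df[col].dropna().astype(str)
        if len(values) and values.str.fullmatch(EMAIL_PATTERN).mean() >= threshold:
            flagged.append(col)
    return flagged

sample = pd.DataFrame(
    {"contact": ["a@example.com", "b@example.com"], "city": ["Oslo", "Kyoto"]}
)
print(flag_email_columns(sample))  # -> ['contact']
```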

There’s a Need for Something New: DataOps

DataOps uses intelligence, automation and analytics to bring data insights closer to those who need them most. DataOps streamlines data delivery, giving you a data-driven understanding of how to run your business today and adapt as markets and consumers change.

DataOps combines the principles of DevOps, continuous integration and continuous delivery with older notions of data delivery, allowing you to:

  • Simplify and automate to keep pace with data management tasks in a distributed environment.
  • Enable data integration, metadata management and governance across cloud environments.
  • Support governance and collaboration, providing self-service for data and analytics functions to empower line-of-business organizations.
  • Facilitate data management for today’s increasingly distributed and remote work environments.
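
To ground the continuous-integration idea in the list above, here is a minimal sketch of a pipeline contract test that could gate deployment in a CI job. The transformation, column names and rules are hypothetical:

```python
# A minimal sketch of CI for data pipelines: a pytest-style contract
# test that must pass before a pipeline change is promoted. The
# transformation and its column names are hypothetical.
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names and drop rows that break downstream use."""
    out = raw.rename(columns={"order_total": "total_usd"})
    return out.dropna(subset=["customer_id", "total_usd"])

def test_transform_orders_contract():
    raw = pd.DataFrame(
        {"customer_id": ["a1", None, "c3"], "order_total": [10.0, 5.0, None]}
    )
    result = transform_orders(raw)
    assert "total_usd" in result.columns        # schema contract holds
    assert result["customer_id"].notna().all()  # no orphan rows leak through
    assert result["total_usd"].notna().all()    # no null amounts leak through
```

Run automatically on every change, checks like these turn pipeline updates into routine, low-risk deployments instead of manual, error-prone events.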

Pentaho platform Suite helps you unlock the potential of intelligent data management. This system brings together data profiling and cataloging, edge device management and a full range of business analytics, all in a single, agile suite. It works with existing data infrastructures and cloud environments, enabling you to transform rigid data silos into comprehensive edge-core-cloud data fabrics.

How Innovators Are Improving Outcomes With Pentaho platform Suite

DataOps helps organizations bring IT and operational technology (OT) together to streamline how they work with data. In turn, this collaboration opens up better insights, supports smarter business decisions, and drives efficiency and cost savings. Here are some positive outcomes from innovators that have adopted a DataOps approach.

  • Enhancing scalability: A financial services powerhouse built a data-as-a-service platform to support its internal customers, all standardized on Pentaho platform Suite. The platform enables the organization to serve many clients across the bank and address a wide range of data integration challenges with a flexible, scalable solution.
  • Bringing IT and OT closer together: A leader in the management and development of networks and services in the water, energy and environmental sectors was looking to streamline how it supported water utility customers and processes. However, this large utility had only partially integrated its IT and OT operations. The company chose the Pentaho platform Suite for a more unified architecture that would help it better blend IT and OT data, backed by Hitachi Vantara’s strategic consulting to support the initiative.
  • Making data storage more cost-effective and agile: Another major financial institution is applying a DataOps approach to a data lake optimization problem. Traditional Hadoop-based data lakes couple storage and compute, creating a uniform environment for all information in the data lake, regardless of age or usage. As the data lake expands, organizations pay heavily for Hadoop licenses and storage even though most of the data goes unused. Lumada Data Optimizer for Hadoop breaks that tight coupling by tiering data to object storage, enabling organizations to offload vast amounts of data to less expensive storage. Behind the scenes, it seamlessly moves data between the data lake’s hot and cold zones for efficiency and storage savings (see the sketch after this list).
  • Automating for deeper insights: A leading mortgage organization is applying a DataOps approach to its data cataloging challenge. The organization routinely pulls in data from a wide variety of sources each day, and it sought to prepopulate its data lake with the dataset properties it needs while improving processing efficiency through API-based automation. Lumada Data Catalog lets the organization ingest and catalog more than 10 million files each day and offers a fully customizable, searchable user interface. The automated solution tags columns in databases, data warehouses and data lakes with meaningful, business-relevant, standardized labels, helping business users derive insights from data without performing numerous manual tasks.
  • Driving smarter decisions through better access: A large national telecom provider, serving tens of millions of customers with mobile, internet and TV service, wanted to simplify access to its business-critical data, optimize processes and improve its business decision-making. Business agility is vital in this dynamic market space, and with Lumada Data Integration and Lumada Analytics, the company has aggregated 300 GB of data onto a single platform. More importantly, it has made the data more readily available for reporting and analytics in real time. Optimizing its data processes has helped reduce costs while automating data access on a 24/7 basis. Now, business users can take advantage of smarter analytics to support better business decisions, all based on a 360-degree view of the company’s services operations.
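
To make the tiering pattern from the storage example above concrete, here is a minimal sketch of age-based offload from a hot tier to object storage. It illustrates the general idea only, not how Lumada Data Optimizer for Hadoop is implemented; the paths, bucket name and age threshold are hypothetical:

```python
# A minimal sketch of age-based tiering: files untouched for 90 days
# move from the expensive hot tier to cheaper object storage. Paths and
# the bucket are hypothetical; real optimizers also weigh access patterns.
import os
import time
import boto3

HOT_TIER = "/data/lake/hot"        # e.g., a mounted hot-storage path
COLD_BUCKET = "example-cold-tier"  # object storage destination
MAX_AGE_SECONDS = 90 * 86400       # 90-day threshold

s3 = boto3.client("s3")
cutoff = time.time() - MAX_AGE_SECONDS

for root, _dirs, files in os.walk(HOT_TIER):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getmtime(path) < cutoff:         # cold: not touched lately
            key = os.path.relpath(path, HOT_TIER)
            s3.upload_file(path, COLD_BUCKET, key)  # offload to cold tier
            os.remove(path)                         # reclaim hot capacity
```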

More Innovation Is on the Horizon

Hitachi is continually enhancing and refining its Pentaho platform capabilities to help innovative organizations move beyond traditional siloed approaches toward new data paradigms. The combination of technology, processes and people is driving DataOps and this new way of innovating. Our advanced technology is powerful, but it is only one part of our advantage. We also have deep IT and OT expertise across the Hitachi Group companies and in our partner ecosystem. We have a long history of helping our customers strengthen their businesses while powering good in the world. Together, we tailor solutions to meet specific needs and foster innovations that lead to success.

For more information, see Why We Built the Pentaho platform Suite and Five Steps to Monetizing Your Data.

To learn more about how Hitachi can help you unleash the possibilities of DataOps, sign up for our upcoming webinar, visit our website, or contact us.

Sources:
[1] IDC, “IDC FutureScape: Worldwide Cloud 2020 Predictions,” Doc #US44640719, October 2019.
[2] IDC, “IDC FutureScape: Worldwide Cloud 2021 Predictions,” Doc #US46420120, December 2020.