I recently attended the annual Informatica analyst summit to get the latest on that company's strategy and plans. The data integration provider offers a portfolio of information management software that supports today's big data and information optimization needs. Informatica is making changes in how it presents itself to the market and in its marketing and sales efforts. New executives, including new CMO Marge Breya, are working to communicate what is possible with Informatica's product portfolio, and it's more than just data integration.
Big data and cloud computing pose challenges for IT in its dual roles of facilitating access and providing governance and compliance with policies and regulations, including access and security. According to our governance, risk and compliance research, IT compliance costs are increasing for 53 percent of heavily regulated organizations and even for 17 percent of those subject to little or no regulation. CIOs should examine Informatica's product portfolio to see how to increase efficiency in the access, governance and integration of data in IT systems for more effective business processes, including those that are GRC-related.
Governance over transactional, interaction and analytical systems is a complex task. Late last year I wrote about Informatica's latest efforts in big data and cloud computing; the company is now shipping its PowerCenter Big Data Edition, which facilitates integration with Hadoop. I have written about how integration with big data is broken today, as organizations struggle not just with Hadoop but also with other big data technologies. Informatica provides tools to parse data so it can be profiled and processed efficiently. For example, Informatica can perform natural language processing to extract entities from the text of unstructured data, which can help with a range of tasks, including the reviews of data that IT must perform.
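To make the idea of entity extraction concrete, here is a minimal, purely illustrative sketch. It uses simple regular expressions rather than real natural language processing, and it is in no way Informatica's implementation; the sample text and patterns are hypothetical.

```python
import re

def extract_entities(text):
    """Pull simple entity types out of unstructured text.

    A toy stand-in for NLP entity extraction: it finds email
    addresses and capitalized multi-word names so the results
    can be profiled and processed like structured data.
    """
    return {
        "emails": re.findall(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}", text),
        # Two or more consecutive capitalized words: a crude proxy for names
        "names": re.findall(r"\b(?:[A-Z][a-z]+ )+[A-Z][a-z]+\b", text),
    }

sample = "Please reach Marge Breya at press@informatica.com about the summit agenda."
print(extract_entities(sample))
# {'emails': ['press@informatica.com'], 'names': ['Marge Breya']}
```

Once entities are extracted into structured fields like these, they can be profiled, deduplicated or reviewed with the same tooling used for conventional relational data.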
With its latest tools, Informatica has stepped beyond the Informatica Cloud Winter 2013 release, which started the software down the path of bringing master data management (MDM) and data governance into the cloud. The Cloud Spring 2013 release, expected in April, is about providing enterprise capabilities in the cloud. New Cloud Data Masking can help secure sensitive or confidential data; our data in the cloud research found that data security is the number one concern in 63 percent of organizations. A data loader for Salesforce makes a bulk read and write license available; I have written about how providing data plumbing is your business, as Salesforce has failed to meet customers’ needs in this area.
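The internals of Cloud Data Masking are not described here, but the general idea behind data masking can be sketched simply: replace or obscure sensitive values while keeping the data usable for development, test or analytics. The following minimal Python example is a hypothetical illustration, not the product; the field names and salt are invented.

```python
import hashlib

def mask_value(value, keep_last=0):
    """Replace a sensitive string with asterisks, optionally keeping
    the last few characters so the field stays recognizable."""
    visible = value[-keep_last:] if keep_last else ""
    return "*" * (len(value) - keep_last) + visible

def pseudonymize(value, salt="demo-salt"):
    """Deterministically map a value to an opaque token so joins and
    test workflows still work on the masked data."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

record = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@example.com"}
masked = {
    "name": pseudonymize(record["name"]),
    "ssn": mask_value(record["ssn"], keep_last=4),   # "*******6789"
    "email": mask_value(record["email"]),
}
print(masked)
```

Deterministic pseudonymization is the design choice worth noting: because the same input always maps to the same token, masked data can still be joined across tables or loaded into a sandbox without exposing the originals.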
Informatica last month acquired Active Endpoints, whose Cloud Extend applies cloud-based workflow services to what would otherwise be static, state-based applications, such as Salesforce applications for SFA. Cloud Extend lets managers map out the steps that should be taken in an application and prompts users for action. This application, designed for line-of-business users and analysts, can provide value for both business and centralized IT. Informatica is making it more efficient to set up and establish integration across the cloud, and its ability to subset data and support sandbox environments helps its customers reduce costs and time to get up and running.
Informatica has announced it is offering prepackaged integration with NetSuite and Workday applications that operate in the cloud in its Cloud Connector Marketplace Mall. This is a welcome step, but Informatica needs to invest further to develop cloud connectors for the larger group of cloud computing applications in use today; it has many more to address before reaching critical mass, let alone universal connectivity. The good news is that many software organizations that operate in the cloud, including MicroStrategy, Ultimate Software and Xactly, are embedding Informatica to improve their ability to be efficient with data and support customers' needs. In its Spring release Informatica will also provide connectivity to Amazon Redshift, Oracle CRM On Demand and Microsoft Dynamics AX. The move to support Amazon Redshift is important as more organizations look to embrace cloud computing for their data storage and processing needs.
At the analyst summit Informatica presented its vision of the future of cloud as an IT-led activity, saying that the days when lines of business owned and led cloud efforts are past. Here the company could not be more wrong: business subscription to and use of cloud applications and services continues to grow, often precisely because business units get little or no support from IT. While IT may be getting engaged and starting to leverage this utility model of computing, it is in no way leading or controlling what the business is doing. We continue to see this in sales, marketing, customer service, operations, human resources and even finance. In the end, business is held accountable for business processes and outcomes, and I see no research that indicates this will change in the near future. What is needed is a more adaptive environment in which analysts and business users can facilitate interactions through data requests and tasks, not just stewardship and improving the quality of existing data, which is only part of the bottleneck.
Informatica also provided more insight into its Virtual Data Machine, which lets Informatica products operate across platforms and environments while being insulated from their differences. I expect to hear more from the company on the role this can play in cloud and hosted environments as much as on-premises ones. Ultimately this technology should be able to support more integration points and partners, as it has done with Teradata; Informatica recently announced further support for Teradata Unified Data Architecture, streamlining data integration from within the Informatica Virtual Data Machine to environments like Teradata.
Informatica also continues its strategic partnership with Heiler, which it is in the process of acquiring; the deal is expected to close by year end if approved by German regulators. Since my analysis of the announcement last fall, the companies have been working to integrate MDM with product information management (PIM). Informatica has come to recognize that PIM is not MDM; they have different business and IT requirements, but together they can be a valuable combination. This simple position is not generally accepted by the majority of IT analysts, who have led many of the largest software companies into an IT-centric approach that our PIM benchmark research has found to be wrong, and which led me to write a perspective on how PIM is for business. Heiler, which we rated Hot in our 2012 Value Index for Product Information Management, combined with Informatica, which was Hot in our 2012 Value Index for Data Integration, might be the next PIM powerhouse.
Informatica continues to expand its portfolio to support a range of real-time operational needs. It recently released a new version of Informatica Ultra Messaging, which my colleague Robert Kugel assessed. Beyond the near-real-time features, Informatica's products handle complex event processing (CEP) and what we call operational intelligence. Unfortunately, within such a busy product portfolio, Informatica's CEP and operational intelligence capabilities are rarely marketed and not well known. Our benchmark research finds that activity or event monitoring is a top priority in 62 percent of organizations, and that is exactly what Informatica PowerCenter offers.
I expect to see more big steps forward for Informatica, as it has many development initiatives that are still confidential that will continue its expansion as an information-centered software provider. As technology providers such as Informatica are further pressured to demonstrate business value, we will see a further shift to what we call information optimization, which is in the end what business needs on a more timely and consistent basis, as I have outlined in our research agenda.
Informatica finds its customers moving toward becoming stewards of business data, but they need to go further and support analysts' needs for data to perform analytics. Our latest research finds that in 42 percent of organizations data-related tasks still impede the handling of analytic ones. This has led to the startling reality, found in our latest research into spreadsheets, that spreadsheets are used for business intelligence tasks 74 percent of the time, despite being responsible for a large number of errors from manual copy, paste and calculation tasks. The need to remedy data-related problems should help Informatica bridge the data divide between business and IT. Informatica continues to be bullish on its growth opportunities, and it does not have to convince me: a decade of our research has shown the need for rationalization to improve efficiency and profitability.
CEO & Chief Research Officer
Big data involves interplay between different data management approaches and business intelligence and operational systems, which makes it imperative that all sources of business data be integrated efficiently and that organizations be able to easily adapt to new data types and sources. Our recent big data benchmark research confirmed that big data storage technologies continue to follow many approaches, including appliances, Hadoop, and in-memory and specialized DBMSes. With the variety, velocity and volume of big data being part of today's information architecture, and the potential for big data to be a source that feeds other systems, integration should be a top priority.
Many organizations that have already deployed big data technology now struggle to access, transform, transport and load information using conventional technology. Even replication or migration of data from existing sources can be troublesome, requiring custom programming and manual processing, which are always a tax on resources and time. Barriers such as having data spread across too many applications and systems, which our benchmark research found in 67 percent of organizations, do not go away just because an organization is using big data technology; in fact, they get more complicated. However, big data also creates opportunities to use information to innovate and to improve business processes. To avoid the risks and take advantage of the opportunities, organizations need efficient processes and effective technology that makes information drawn from big data available to all people who need it.
Organizations need integration technology flexible enough to handle big data regardless of whether it originates in the enterprise or across the Internet. For this reason, tools for big data integration must be able to work with a range of underlying architectures and data technologies, including appliances, flat files, Hadoop, in-memory computing and conventional databases, and move data seamlessly between relational and non-relational structures. They must be able to adapt to events or streams of data, and they must harvest data from transactional systems and business applications in enterprise data warehouses. Supporting data quality and master data management needs is also part of supporting big data with data integration.
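To illustrate the kind of movement between relational and non-relational structures described above, here is a minimal, hypothetical sketch; it is not Informatica tooling, and the table and field names are invented. It flattens rows from a relational table into self-describing JSON documents of the sort a Hadoop or document-store pipeline might consume.

```python
import json
import sqlite3

# Hypothetical relational source: an in-memory SQLite table of orders
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "Acme", 250.0), (2, "Globex", 99.5)])

def rows_to_documents(cursor):
    """Convert relational rows into JSON-ready dictionaries, the shape
    a document store or Hadoop pipeline typically expects."""
    cols = [d[0] for d in cursor.description]
    return [dict(zip(cols, row)) for row in cursor.fetchall()]

docs = rows_to_documents(conn.execute("SELECT * FROM orders"))
for doc in docs:
    print(json.dumps(doc))
```

Because the column names travel with each document, the target system needs no shared schema with the source; that self-description is what makes this style of movement workable across heterogeneous big data environments.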
Selecting the right approach to big data integration is difficult when organizations lack knowledge of the functional requirements and best practices relevant to their industries, lines of business and IT. Deficiencies in existing software and data environments can further complicate the ability to choose wisely and so should be factored into the deployment decision-making process. Organizations must identify the types of integration being used or under consideration to handle data other than that formatted for relational databases, and evaluate processing capabilities and techniques to handle the proliferation of big data. IT professionals therefore must understand how to work with analysts and business management to deliver timely, benefit-based big data deployments.
IT should evaluate whether it can use existing skills to shorten the time it takes to get big data to users. Since our research has found lack of resources to be the top barrier to using innovative technology, cited by 51 percent of organizations, businesses should make sure their IT staff does everything possible to maximize skills and resources internally and not waste them on custom, manual silos of effort. Having the right data integration processes and data management methods can help IT work more efficiently and partner better with business units.
Not having a dialogue about the information management competencies a business needs is a mistake. Most IT industry analyst firms' content deals with just a portion of the big data picture, discussing, for example, only the technologies for storing and accessing data, with a fixation on variety, velocity and volume. However, decision-makers must consider the efficient flow of data across its entire path of travel, from its origins to user systems, to ensure the effective functioning of any big data project. Failure to do so means failing to optimize information across its life cycle for business value. Without the ability to see the entire big data value chain, a business may find its initiatives exceeding limits of cost and time and damaging a business case built on time-to-value metrics. According to our research, the most important benefits of big data technologies include retaining and analyzing more data (74%) and increasing the speed of analysis (70%). Organizations need to make sure they do not increase the number of manual processes they run and the time spent on them, thus impairing the value of big data.
We have begun research to assess the latest big data integration technologies and best practices to help advance these efforts, as we outlined in our research agenda on big data and information optimization for 2013. We will document emerging best practices in big data integration to meet business needs, from basic access and replication to transformational migration. Until we can share our results, be sure to consider big data integration as part of your business case and project, because it is essential to gaining the most value from your big data investments.
CEO & Chief Research Officer