Cisco Systems has announced its intent to acquire Composite Software, which provides data virtualization to help IT departments interconnect data and systems; the purchase is scheduled to close in early August. Cisco of course is known for its ability to interconnect just about anything with its networking technology; this acquisition will help it connect data better across networks. Over the last decade Composite refined the science of virtualizing data but had reached the peak of what it could do by itself, struggling to grow enough to meet the expectations of its investors, board of directors, employees, the market and visionary CEO Jim Green, who is well-known for his long commitment to improving data and technology architectures. According to press reports on the Internet, Cisco paid $180 million for Composite, which if true would be a good reward for longtime employees who were substantial shareholders.
Data virtualization is among the topics we explored in our benchmark research on information management. Our findings show that the largest barrier to managing information well is the dispersion of data across too many applications and systems, which impacts two-thirds (67%) of organizations. Our research and discussions with companies suggest that data virtualization can address this problem; it is one aspect of information optimization, an increasingly important IT and business strategy, as I have pointed out. Composite has made data virtualization work in technology initiatives including business analytics and big data and, more broadly, in information architecture.
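To make the concept concrete, here is a minimal sketch of the idea behind data virtualization: a virtual view presents rows from several dispersed sources through one query interface, pulling data on demand rather than replicating it into a central store. All names here are hypothetical illustrations; Composite's actual product exposes virtualized data through standard SQL and JDBC/ODBC interfaces.

```python
# Toy "virtual view" that federates two data sources behind one
# query interface without copying the data. Names are illustrative,
# not any vendor's API.

class VirtualView:
    """Presents rows from many underlying sources as one logical dataset."""

    def __init__(self, *sources):
        self.sources = sources  # each source is a callable returning rows

    def query(self, predicate):
        # Federate on demand: pull matching rows from every source,
        # rather than replicating them into a central store first.
        for source in self.sources:
            for row in source():
                if predicate(row):
                    yield row

# Two stand-ins for dispersed systems (e.g., a CRM and a billing app).
crm_rows = lambda: [{"customer": "Acme", "region": "EMEA"}]
billing_rows = lambda: [{"customer": "Acme", "balance": 1200}]

customers = VirtualView(crm_rows, billing_rows)
emea = list(customers.query(lambda r: r.get("region") == "EMEA"))
```

The point of the design is that the consuming application sees one logical dataset while the data stays where it lives, which is what addresses the dispersion barrier the research describes.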
Since my last formal analysis of Composite, its technology has advanced significantly. One area of focus has been increased virtualization of big data technologies, realized recently in its 6.2 SP3 release, which furthers its support of Hadoop. Over the last two years Composite has supported distributions from Apache, Cloudera and Hortonworks, interfacing with MapReduce and Hive since its 6.1 release. In the 6.2 release in 2012, Composite added improved techniques for processing analytics and virtualizing the resulting data sets, which came not only from traditional data sources but also from HP Vertica and PostgreSQL, along with improved interfaces to IBM and Teradata systems. These releases and expanded data connections have made Composite’s the most versatile data virtualization technology in the industry.
As Composite Software becomes part of Cisco, it is worth remembering that acquiring technology companies has been a major part of building the company Cisco is today; the many acquisitions have expanded its product portfolio and brought together some of the brightest minds in networking technology. Less often noted is Cisco’s mixed success in acquisitions of data-related software for enterprise use. Among the examples here is CxO Systems, which had a complex event processing (CEP) and data processing technology that Cisco eventually used to advance its event processing across network technologies; the software no longer exists in stand-alone form. Another example is Latigent, a provider of call center reporting on detailed call data that was supposed to help Cisco cultivate more services; eventually this software was shown not to fit with Cisco’s technology products or services portfolio and disappeared. Cisco’s business model has been to market and sell to data center and networking environments, and it has not been successful in selling stand-alone software outside this area. It is not easy to compete with IBM, Oracle, SAP and others in technology with which Cisco lacks experience, and that inexperience will affect its success in monetizing its investment in Composite Software.
Emerging computing architectures will change the way IT departments operate in the future, but it’s an open question who will lead that change from traditional IT software stacks to hardware and networking technology. Cisco plans to transform the data center and networking beyond software and hardware stacks, partly by virtualizing software on premises and out into the Internet and cloud computing. We see IT architectures changing in a haphazard, vendor-driven way, while most CIOs want to revise their architectures efficiently. Data and information architectures have moved beyond relying on databases and applications to adopting innovative technologies such as in-memory processing and grid architectures that support new trends in big data, business analytics, cloud computing, mobile technology and social collaboration. These technology innovations and the pressure to use them in business are straining the backbones of enterprise networks, which have to adapt and grow rapidly. Most companies today still have data center and networking groups and are increasing their network bandwidth and buying new networking technology from Cisco and others. This will not change soon, but they need better intelligence on the networking of data in these technologies. This I think is what Cisco is looking for in the acquisition of Composite, hoping to refine its infrastructure to help virtualize data across the network. This approach can play well into what Cisco articulates as a strategic direction around the concept it calls the Internet of Everything (IoE). The challenge will be to convince CIOs that so energetic a change by Cisco is necessary to improve connectivity of data across the network. Data virtualization will play a key role in efficiently interconnecting networks and systems, but the open question is whether it will be applied primarily within the data center and network rather than in the stack of information management technology where it is centered today.
I think the acquisition makes sense for Composite Software, as it could not grow fast enough to satisfy its technological ambitions and needed more deployments to support its innovations. Cisco for its part will be tested over time on what it makes of Composite. I expect that Cisco will fundamentally change the roadmap and support for enterprise IT as it adjusts the technology to work with its data center and networking technology; such a change inevitably will impact existing customers and partners. The company will place Composite in its Services organization (which I was told includes software); I take this as an indication that Cisco needs more maturity in bringing software acquisitions into its product and services portfolios and managing them as software with the right level of dedicated marketing and sales resources. For those who want to learn more about data virtualization, I recommend reading Composite’s educational content in the next couple of months before its website and material disappear from the Internet. This may sound harsh, but it is how companies like Cisco digest acquisitions: they eliminate the acquired company’s past and, in doing so, eliminate critical information. No one at Cisco is saying so, and I could not get an answer, but the reality is that Composite will become one item in the long list of products and services on Cisco’s website; my guess is it will become part of the data center and virtualization services or unified computing services; to see how many of these Cisco has, just check its website.
Data virtualization is just beginning to mature and attract CIOs and IT professionals who care about the ‘I’ in their titles that stands for information. Composite Software deserves credit for its focus on data virtualization. Our research shows that data virtualization is an advancing priority in information management but ranks lower than others among initiated projects (11%), as the chart shows. Composite’s technology arrived ahead of the full maturing of IT organizations, which limited its potential. Within Cisco its people will have to adapt to the new owner’s wishes, which will shift some of the focus from database-centered virtualization to the network, unless Cisco decides to expand the combined Cisco and Composite R&D team.
Congratulations to Composite Software for its innovative contributions; it is sad to see its independence and passion for data virtualization disappear. I will be watching to see if Cisco understands the company’s real value, not just for data centers and networking but for bridging to enterprise IT and the market for information optimization.
CEO & Chief Research Officer
Information management is important to every line of business that seeks to improve its business processes and decision-making. In response to pressure from those departments, CIOs and IT organizations must examine whether they have focused enough on the I for information and not just the T for technology, and if they have not, commit to taking this responsibility more seriously than in the past. Informatica is one vendor that realizes the potential of information beyond just data integration, and this is reflected in its expanded product portfolio and improved position in the market over the last several years. Our firm has taken note of companies gaining value from using Informatica; we awarded our 2013 CIO Leadership Award to George Brenckle of UMass Memorial Health Care for his work to maximize the value of information assets through managing data innovatively. Informatica itself has enhanced its position by introducing a new brand and a new CMO and demonstrating commitment to change from its executive leadership team at the company’s recent 2013 user conference. The focus of the brand now is on helping business and IT find the full value of their information.
The Informatica name is well-known in the corridors of IT, associated with addressing the need to make data accessible and integrated anywhere. The vendor has been advancing steadily for some time. We rated it Hot in our 2012 Value Index for Data Integration, and I recently assessed its efforts at our 2013 analyst summit. Established in data integration, Informatica is now focusing on the efficient management of information assets. This is not easy for most organizations, which have data spread across applications and systems; for two-thirds of organizations, according to our research as shown in the chart, this is a barrier to managing information.
The first step for the newly positioned company was to incorporate its technology into a virtual data machine (VDM) called Vibe that will make it easier to operate on any platform at any time. This approach unifies Informatica’s transformation library, optimizer, executor and connectors, which will help Informatica deploy any type of data and integration technique on just about any platform a customer uses. Virtualizing the operations of its technology to isolate them from the platform on which it runs is a design Informatica has used before, but now the techniques and the execution of virtualization are fully realized. Opting to build integrations and deploy to any platform without needing to know the particulars of a system or technology is a wise decision for Informatica. With Vibe it becomes simpler for organizations to run Informatica’s tools on-premises or in the cloud, as nothing has to change to run in either environment.
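The "map once, deploy anywhere" idea behind a virtual data machine can be sketched as follows: the transformation is defined once, and separate executors run it unchanged on different platforms. This is a conceptual illustration under my own assumptions; the class and function names are invented and are not Informatica's API.

```python
# Hypothetical sketch of a virtual-data-machine design: one
# platform-neutral mapping, multiple interchangeable executors.

def mapping(record):
    """Platform-neutral transformation: defined once, deployed anywhere."""
    return {"name": record["name"].strip().title()}

class LocalExecutor:
    """Runs a mapping on-premises, in-process."""
    def run(self, mapping, records):
        return [mapping(r) for r in records]

class CloudExecutor:
    """Stands in for a cloud engine; in a real VDM this would dispatch
    to a remote runtime, but the mapping itself needs no change."""
    def run(self, mapping, records):
        return [mapping(r) for r in records]

data = [{"name": "  jane doe "}]
# Both environments yield identical results from the same mapping.
assert LocalExecutor().run(mapping, data) == CloudExecutor().run(mapping, data)
```

The design choice the sketch highlights is the isolation Vibe aims for: because the transformation logic never references the platform it runs on, moving between on-premises and cloud deployment requires no change to the integration itself.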
In addition the company has introduced a slimmer version of PowerCenter called PowerCenter Express, an entry-level product targeted at customers with smaller projects that provides a path to managing more sophisticated ones with the enterprise version. The PowerCenter Express Personal and Professional editions are available today for individual or departmental use, respectively. The Personal edition limits the number of rows processed per day, which will limit serious individual use in midsize and larger organizations, but it could be useful in small or lower-end midsize businesses. Informatica will need to invest to make sure prospects know it can help with smaller projects and companies with limited resources; the company is generally perceived as selling enterprise-class technology and so has rarely been considered for data integration projects below the enterprise level. PowerCenter Express can support more than SQL-based sources and can integrate with social media and other data integration technologies like those from Kapow Software, which I have separately assessed. The Express edition will be available in July; Informatica has stated it is offering the single-user Personal edition free of charge, and the five-user Professional edition will be priced at $8,000 per user per year.
Informatica also announced availability of its Data Integration Hub, which I think can be as important as virtualization of the technology for many enterprises. Many want to centralize integration tasks to and from applications in a publish-and-subscribe model, which may be easier for managing the current and changing needs of applications and projects for those that find point-to-point data movement cumbersome. This approach, once referred to as enterprise application integration (EAI), was validated a decade ago and can remove latency not just in data transfer but in IT’s processes for getting access to what is needed. Since its beginning Informatica has been involved in an industry debate on the best way to pipe data across the enterprise, and the company had been a staunch supporter of its point-to-point approach over the hub-based one. Now Informatica gives customers the choice instead of championing one architectural approach over another. This is a step toward maturity, a recognition that it has to adapt further to be a leader in information technology for CIOs going forward. Data Integration Hub has been in early release and is expected to be generally available in the third quarter of 2013.
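The publish-and-subscribe model described above can be sketched in a few lines: producers publish to topics on a hub, consumers subscribe, and no application needs a direct connection to any other. This is a minimal illustration of the hub pattern in general, with invented names; it does not represent Data Integration Hub's actual interfaces.

```python
# Minimal publish/subscribe hub, illustrating the hub-based
# alternative to point-to-point integration.
from collections import defaultdict

class IntegrationHub:
    """Producers publish to topics; consumers subscribe, decoupling
    each application from every other application."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

hub = IntegrationHub()
received = []
hub.subscribe("orders", received.append)   # e.g., a warehouse feed
hub.subscribe("orders", lambda o: None)    # e.g., a CRM update
hub.publish("orders", {"id": 1, "total": 99.0})

# With N publishers and M subscribers, a hub needs roughly N + M
# connections instead of the N * M links point-to-point requires.
```

That connection arithmetic is why a hub can be easier to manage as the number of applications and projects grows, which is the appeal the Data Integration Hub announcement is addressing.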
In the realm of master data management for IT organizations, Informatica has released MDM 9.6 to help organizations that want to use this critical mastering technique in both on-premises and on-demand cloud environments and where it must be accessed within applications. The new release has advanced data masking to support more sophisticated security and compliance, easier administration and a simpler application interface for business users and analysts. The focus on data security is significant, especially in cloud computing: in our research 63 percent of organizations said that is their largest concern about moving to the cloud, as the chart shows, and in our governance, risk and compliance research 38 percent of organizations said cloud computing is risky enough that they do not use it or limit it significantly. Informatica thus has an opportunity to help them manage and secure data assets. Coupled with connectivity to the new Informatica Data Integration Hub, master data can be deployed more simply and consistently and operated across cloud computing environments, where the interchange of data across many applications is not as easy as it may sound. In a related area Informatica enhances data governance with its MDM Data Director, which monitors the stewardship of data and facilitates action upon it; the company also made it accessible from Apple smartphone and tablet interfaces earlier this year.
Continuing a longstanding effort, Informatica has assembled industry-specific solutions such as for healthcare and insurance. Advances in the Cloud MDM release help consolidate multiple instances of salesforce.com into one in which management of accounts is simpler; this should appeal to organizations looking to enforce consistency of data across marketing, sales and customer service. For those looking to enrich their information with external data, Informatica helps bring the data types together in a common account and customer record. A realistic approach to MDM that interoperates in both the cloud and on-premises is essential for organizations as the technology architectures of information and applications diversify and are not always confined to the data center of IT.
Informatica also pays attention to the importance of business-centric product information management through its nearly completed acquisition of Heiler Software, which I assessed when it was announced. After satisfying the legal requirements of acquiring a German software company, Informatica has accelerated its efforts to use the recent Heiler Enterprise PIM 7 release across the enterprise and with suppliers. This release improves data integration for self-service access to product information and methods to apply data quality and mapping to data across the enterprise. It also helps provide better data mastering from searches and classifications and improves the management of digital assets related to product information. It is critical for product information management to support multiple channels, from print and commerce to procurement and data exchange. Integrated with Informatica Data Quality, PIM 7 can provide efficient processing and support of natural-language processing, which can help organizations improve data quality; 45 percent of organizations said that is a reason for changing PIM, according to our research. Heiler has had global success with its products, and we recently awarded the 2013 Ventana Research Leadership Award in Information Management to Sportscheck, which uses Heiler for PIM across its retail channels. We also rated Heiler a Hot vendor in the 2012 Ventana Research Product Information Management Value Index. The battle for gaining value through PIM is something I have pontificated about: some observers see this as an MDM and IT agenda, but it is not. Informatica is gaining important capabilities through its acquisition of Heiler Software.
Informatica has been slower to improve its support for big data technologies. It has been advancing in integration with Hadoop, but in other systems, including appliances and in-memory computing, Informatica will need to step up its efforts to be a market leader. At the Informatica World conference the company demonstrated simple methods for integration, profiling and reintegration of data across Hadoop clusters, which is part of the larger big data integration challenge that I have written about. At the conference it also announced expanded support for MongoDB through 10gen, which will help in integrating NoSQL databases to support documents and other information that is typically not placed into rows and columns. This partnership is important for Informatica’s efforts to be an information platform provider that brings together all types of content to support business. Also in the big data realm, Informatica has worked to apply its data matching technology to the variety and volume of data, including international data sets like those from India and China. It has done a nice job of abstracting the complexities of the underlying big data technology through its common user interface, which will help organizations streamline their data needs without requiring more staffing; our research found insufficient staffing to be an obstacle to effective information management for two-thirds of organizations, as the chart illustrates.
As Informatica turns the corner from some marketing and sales challenges in 2012, it has come into 2013 with a strong focus on new products to address gaps in its portfolio: virtualization, a data hub, the cloud, big data and efficiency of product information management. Each of these is a substantial achievement, but pushing all of this news to the public at once can impede getting recognition for each individually. It is a marketing challenge to pace and streamline the release of technology announcements in order to maximize credit for their contributions to helping business and IT. Informatica is not the first to virtualize its technology, support information management in the cloud or integrate with product information management, but it is a sizable technology company and has to understand the timing and readiness of the market and when customers are ready to make investments.
We describe Informatica’s approach as information optimization, which goes beyond just the management of information to extract full value from these investments. I articulated an example of this with big data, and information optimization is a formal research priority in our agenda for 2013. We see a new generation of information applications for businesses and then for consumers and suppliers that will be realized over the coming years and can be facilitated with Informatica. The company has made a strong move to reposition itself as capable of unleashing the information potential of organizations. Now it must demonstrate its ability to accelerate growth and become a top software provider of technology that maximizes the value of information assets.
CEO & Chief Research Officer