
Cisco Systems has announced its intent to acquire Composite Software, which provides data virtualization to help IT departments interconnect data and systems; the purchase is scheduled to complete in early August. Cisco of course is known for its ability to interconnect just about anything with its networking technology; this acquisition will help it connect data better across networks. Over the last decade Composite had been refining the science of virtualizing data but had reached the peak of what it could do by itself, struggling to grow enough to meet the expectations of its investors, board of directors, employees, the market and visionary CEO Jim Green, who is well-known for his long commitment to improving data and technology architectures. According to press reports on the Internet, Cisco paid $180 million for Composite, which if true would be a good reward for people who have worked at Composite for some time and who were substantive shareholders.

Data virtualization is among the topics we explored in our benchmark research on information management. Our findings show that the largest barrier to managing information well is the dispersion of data across too many applications and systems, which impacts two-thirds (67%) of organizations. Our research and discussions with companies suggest that data virtualization can address this problem; it is one aspect of information optimization, an increasingly important IT and business strategy, as I have pointed out. Composite has made data virtualization work in technology initiatives including business analytics and big data, and more broadly in information architecture.
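To make the concept concrete, here is a minimal sketch of what a data virtualization layer does: it presents a single virtual view over data that stays in separate systems, federating the query at request time rather than copying the data into a warehouse. Everything in this sketch, the in-memory sqlite3 "sources," the table names and the virtual view function, is a hypothetical illustration of the general technique, not Composite's product or API.

```python
import sqlite3

# Two independent "systems" holding dispersed data: stand-ins for,
# say, a CRM database and an ERP database. All names are illustrative.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme Corp"), (2, "Globex")])

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
erp.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 1200.0), (1, 300.0), (2, 950.0)])

def virtual_customer_orders():
    """A 'virtual view': federates a join across both systems at query
    time, leaving each source in place instead of replicating it."""
    names = dict(crm.execute("SELECT id, name FROM customers"))
    for customer_id, amount in erp.execute(
            "SELECT customer_id, amount FROM orders"):
        yield names[customer_id], amount

for name, amount in virtual_customer_orders():
    print(name, amount)
```

The point of the pattern is that consumers query one logical view while the data itself never moves, which is exactly the dispersion problem the research above identifies.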

Since my last formal analysis of Composite, its technology has advanced significantly. One focus has been deeper virtualization of big data technologies, realized most recently in the 6.2 SP3 release, which furthers its support of Hadoop. Over the last two years Composite has supported distributions from Apache, Cloudera and Hortonworks, interfacing to MapReduce and Hive since its 6.1 release. In the 6.2 release in 2012, Composite added improved techniques for processing analytics and virtualizing the resulting data sets, which came not only from traditional data sources but also from HP Vertica, PostgreSQL and improved interfaces with IBM and Teradata systems. These releases and expanded data connections have made Composite's the most versatile data virtualization technology in the industry.

As Composite Software becomes part of Cisco, it is worth remembering that acquiring technology companies has been a major part of building the company Cisco is today; the many acquisitions have expanded its product portfolio and brought together some of the brightest minds in networking technology. Less often noted is Cisco's mixed success in acquisitions of data-related software for enterprise use. Among the examples here is CxO Systems, which had a complex event processing (CEP) and data processing technology that Cisco eventually used to advance its event processing across network technologies; the software no longer exists in stand-alone form. Another example is Latigent, a provider of call center reporting on detailed call data that was supposed to help Cisco cultivate more services; eventually this software was shown not to fit with Cisco's technology products or services portfolio and disappeared. Cisco's business model has been to market and sell to data center and networking environments, and it has not been successful in selling stand-alone software outside this area. It is not easy to compete with IBM, Oracle, SAP and others in technology with which Cisco lacks experience, and that inexperience will affect its ability to monetize the investment in purchasing Composite Software.

Emerging computing architectures will change the way IT departments operate in the future, but it's an open question who will lead that change from traditional IT software stacks to hardware and networking technology. Cisco plans to transform the data center and networking from software and hardware stacks, partly through virtualizing software inside premises and out into the Internet and cloud computing. We see IT architectures changing in a haphazard, vendor-driven way, while most CIOs want to revise their architectures in an efficient manner. Data and information architectures have moved from relying on databases and applications to adopting innovative technologies such as in-memory processing and grid architectures that support new trends in big data, business analytics, cloud computing, mobile technology and social collaboration. These technology innovations and the pressure to use them in business are straining the backbones of enterprise networks, which have to adapt and grow rapidly. Most companies today still have data centers and networking groups and are increasing their network bandwidth and buying new networking technology from Cisco and others. This will not change soon, but they need better intelligence on the networking of data in these technologies. This, I think, is what Cisco is looking for with the acquisition of Composite: refining the infrastructure to help virtualize data across the network. This approach can play well into what Cisco articulates as a strategic direction around the concept it calls the Internet of Everything (IoE). The challenge will be to convince CIOs that so sweeping a change by Cisco is necessary for improving connectivity of data across the network. Data virtualization will play a key role in efficiently interconnecting networks and systems, but under Cisco it is likely to be applied primarily within the data center and network, rather than in the stack of information management technology where it is centered today.

I think the acquisition makes sense for Composite Software, as it could not grow fast enough to satisfy its technological ambitions and needed more deployments to support its innovations. Cisco for its part will be tested over time on what it makes of Composite. I expect that Cisco will fundamentally change its roadmap and support for enterprise IT as it adjusts the technology to work with its data center and networking technology; such a change inevitably will impact its existing customers and partners. The company will place Composite in its Services organization (which I was told includes software); I take this as a sign that Cisco needs more maturity in bringing software acquisitions into its product and services portfolios and managing them as software, with the right level of dedicated marketing and sales resources. For those who want to learn more about data virtualization, I recommend reading Composite's educational content in the next couple of months, before its website and materials disappear from the Internet. This may sound harsh, but it is how companies like Cisco digest acquisitions, erasing the acquired company's past and with it critical information. No one at Cisco is saying so, and I could not get an answer, but the reality is that Composite will become one item in the long product and services list on Cisco's website; my guess is it will become part of the data center and virtualization services or unified computing services; to see how many Cisco has, just check its website.

Data virtualization is just beginning to mature and attract CIOs and IT professionals who care about the 'I' in their titles that stands for information. Composite Software deserves credit for its focus on data virtualization. Our research shows that data virtualization is an advancing priority in information management but ranks lower than others among initiated projects (11%), as the chart shows. Composite's technology arrived ahead of the full maturing of IT organizations, which limited its potential. Within Cisco its people will have to adapt to the new owner's wishes, which will diffuse some of its database-centered virtualization toward a focus on the network, unless Cisco decides to expand its and Composite's combined R&D team.

Congratulations to Composite Software for its innovative contributions; it is sad to see its independence and passion for data virtualization disappear. I will be watching to see if Cisco understands the company's real value, not just for data centers and networking but for bridging to enterprise IT and the market for information optimization.

Regards,

Mark Smith

CEO & Chief Research Officer

Business analytics can help organizations use data to find insights that lead to new opportunities and address previously unrecognized issues. One player in this market is Datawatch, known for its tools for information optimization and harvesting value from big data, including content and documents. I assessed the company earlier this year, and recently our firm recognized its customers' achievements with 2013 Ventana Research Leadership Awards for Information Optimization with Phelps County Regional Medical Center and Governance, Risk and Compliance (GRC) with The Fauquier Bank.

Datawatch has made news with its acquisition of Panopticon, a Swedish provider of business visualization and analytics for not only data but events and messages as well. Panopticon's innovations in analytical discovery are not yet widely recognized, but I expect them to gain recognition going forward. Of the four types of analytical discovery I have described (data, visual, event and information), Panopticon addresses the first three and Datawatch the last, though Datawatch also has experience with data. The combined company could simplify the portfolio of tools business people and analysts need to produce analytics and insights. Their capabilities enable users to go far beyond the reporting and dashboard approach of conventional business intelligence tools.

Panopticon has been steadily advancing its technology, developing interactive visual discovery of data in real time from many sources, including complex event processing, message queues and polling. Its tools also take data from relational databases and big data technologies like SAP HANA and blend it with proprietary sources to meet users' specific strategic and operational needs. They use in-memory computing and analytics to process data and events on the fly, creating specialized time windows in which to analyze and visualize information. Panopticon Designer provides access to various sources of data that can be brought into its environment using prebuilt connectors. Panopticon's technology operates on the Web and offers a mobile capability that connects to a server-based environment. It also can be embedded in other environments, which its software partners CallidusCloud, Deltek and NICE Systems do to add value to their applications. I am impressed that Panopticon also is on the cutting edge of integrating to sources such as SAP HANA. It goes beyond what SAP and others offer today, fully integrating with SAP's in-memory computing platform, SAP HANA, while simultaneously integrating real-time events from SAP Sybase ESP and others. SAP itself highlighted Panopticon and these capabilities at its press conference in 2012. However, since SAP began working on its own technology for this, SAP Lumira, it has been less eager to welcome partners in business analytics, as I have pointed out.
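The "specialized time windows" mentioned above are a streaming-analytics idea: keep only the last few seconds or minutes of events in memory and recompute the analytic as each event arrives. The sketch below illustrates that general technique with a simple sliding-window average; it is my own illustration of the concept, assuming a basic (timestamp, value) event stream, and is not Panopticon's engine or API.

```python
from collections import deque

class SlidingWindow:
    """Keeps events from the last `span_seconds` in memory and
    recomputes an aggregate on the fly as each event arrives."""
    def __init__(self, span_seconds):
        self.span = span_seconds
        self.events = deque()   # (timestamp, value) pairs, oldest first
        self.total = 0.0

    def add(self, timestamp, value):
        self.events.append((timestamp, value))
        self.total += value
        # Evict events that have fallen out of the time window.
        while self.events and self.events[0][0] < timestamp - self.span:
            _, old_value = self.events.popleft()
            self.total -= old_value

    def average(self):
        return self.total / len(self.events) if self.events else 0.0

# Example: a 10-second window over a stream of (time, price) ticks.
window = SlidingWindow(span_seconds=10)
for t, price in [(0, 100.0), (4, 102.0), (9, 101.0), (15, 98.0)]:
    window.add(t, price)
    print(f"t={t}s rolling avg={window.average():.2f}")
```

A visualization layer like Panopticon's can then redraw the aggregate after every event, which is what makes discovery against live streams feel interactive rather than batch-oriented.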

Panopticon has made progress in customer deployments in North America; being part of Datawatch should accelerate that and help it expand further globally. Panopticon's approach has appealed to financial services companies with specific risk management needs. Our research into governance, risk and compliance finds large potential in new technology to identify and manage risks faster, which 79 percent of organizations are seeking and which Panopticon can supply. Advanced analytics that handle more information can address risk and fraud issues more easily than older tools for reporting or monitoring can.

Panopticon's support of technology events flowing across a network can also address a growing demand for what we call operational intelligence. Our research finds that 63 percent of organizations consider visualizing events important, and as the chart shows, their primary goals for operational intelligence, both named by 59 percent of organizations, are to manage performance better and detect fraud or security problems. As well, our latest research into the technology innovations of big data and business analytics finds that visual discovery is a much-desired capability not available in most organizations today; beyond that, businesses need access to all types of data, events and information to provide insights for analysts or management. Panopticon delivers more than data visualization, which is why the company and its products represent a significant opportunity for Datawatch to realize the full potential of this acquisition.
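In practice, the operational intelligence goals named above, managing performance and detecting fraud or security problems, often come down to watching a live metric and flagging deviations as events stream in. Here is a minimal, hypothetical sketch of that pattern, using a standard running mean and standard deviation (Welford's method) rather than anything vendor-specific:

```python
import math

class AnomalyFlagger:
    """Maintains running mean/variance (Welford's method) and flags
    events more than `k` standard deviations from the mean -- the kind
    of rule an operational-intelligence dashboard might surface."""
    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def observe(self, value):
        # Test against the running statistics *before* updating them.
        flagged = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            flagged = std > 0 and abs(value - self.mean) > self.k * std
        # Welford's online update.
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return flagged

flagger = AnomalyFlagger(k=3.0)
for amount in [20, 22, 19, 21, 23, 20, 500]:   # 500 is the outlier
    if flagger.observe(amount):
        print(f"suspicious event: {amount}")
```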

Datawatch is addressing my colleague's view of the four pillars of big data analytics, which is part of our research into helping organizations gain business value from big data. This step moves Datawatch toward helping organizations realize the full value of information optimization, including from the big data investments that, as I have pointed out, are essential for almost every organization. By acquiring and integrating Panopticon, Datawatch will introduce a new breed of technology to handle the broad spectrum of information for discovery (including the use of visualization) and to generate insights that bring value to big data investments and business processes.

Regards,

Mark Smith

CEO & Chief Research Officer
