It has become evident from the advancements we’ve seen in the business analytics market that the use of visualization is becoming mainstream. In my analysis of the market last year I wrote about the pathetic state of dashboards, where the assumption in the business intelligence software industry is that placing four to six charts or tables of data on a screen and publishing it to business users creates business intelligence. That assumption has yet to be proven and is irrational on its face, as presenting analytics in basic charts does little to provide context or to guide users in taking actions and making decisions.
Now enter the new age of visual discovery, drawn from advanced data visualization tools with interactive analytic capabilities that enable users to present large volumes of data in an appealing manner. This technology is not limited to big data environments, where the volume and velocity of data demand a better method of interaction and where, according to our big data benchmark research, such a capability is unavailable today in 37 percent of organizations. These types of data visualization go beyond the doldrums of overused pie and bar charts, bringing heat maps, scatter plots and other forms into the standard presentation of business analytics. However, such visualizations are not designed for the average business user, most of whom are not trained in visual interpretation of data. Rather, they are designed for analysts who can interpret what a visualization means and then take interactive steps to investigate what lies behind the presentation. Sometimes called visual discovery, this method is one of four main approaches to maximizing the outcomes of the analytical discovery process, along with data, event and information discovery, as I have previously outlined. Realize also that analytical discovery is just one of many types of business analytics that organizations need to be successful. And while many in the industry call this segment visual analytics, the scope is the same in terms of the proper use of visualization.
Despite a skills gap in business and IT in interpreting visualizations, and in knowing what type of visualization should be used when, where and why, adoption of visual discovery software is growing, mostly in business units seeking to complement IT’s business intelligence efforts. Technology providers have seen this growth in recent years, and most business analytics vendors have now expanded existing products, created new ones or acquired technology to enter this rapidly growing market. The frenzy has just about every business intelligence software provider touting existing or new visualization as if it were the recipe for eternal life.
Let’s have a little reality check. Visual discovery is an important step forward for the business analytics market, as it provides more intelligent representation of data than simple bar and pie charts. However, we must also realize that simply providing sophisticated visualization to business users will not magically help them interpret the data and make a decision or take an action. Interpreting a visualization is as much art as science unless an analyst annotates the key points the business user should focus on. This concept of annotation is lacking in most analytics tools, which is why analysts continue to copy and paste visualizations into Microsoft PowerPoint, highlight bullet points of interest and then publish the result to business users, in many cases converted to Adobe Acrobat format. I pointed this out in a previous piece titled “Why Business Intelligence is Failing Business,” noting that technology vendors should place more intelligence in their analytics software to generate observations and points of focus using expert systems, personalization and perhaps even methods of artificial intelligence.
Presenting hundreds of points in a scatter plot or 25 shades of green and red in a heat map is not an intuitive way for business users to determine where to start and where to stop. In fact, the data points that look most obvious for interaction are not always the best place to focus, as aggregation and the visualization itself can mask areas of opportunity or concern. What should be interacted with and discovered needs to be presented more intelligently. For example, a visual discovery tool might provide a link to a document, report or even the actual events that impacted the metric being represented. Most important, electronic and collaborative discussion forums are needed where analysts and business users can discuss and address issues and opportunities, guiding the business through the discovery-to-action process. The current approach of interacting via telephone or email is insufficient, as the complete context of an issue is often lost in these methods.
Until visual discovery graduates to being more than visuals or tables of data, relates to how business actually works and meets the real competencies of the larger scope of users, we will see over-adoption of visual discovery. Until organizations realize that this type of tool is good only for a specific type of individual, one with a high level of analytic and data competency, we will see growing frustration among users, as has occurred with dashboards. What we need is for visual presentation methods to expand into formats more digestible by business users, including geographic maps, paths of activity traffic, workflows of business processes and even key facts presented inside a business-centric infographic, all of which are available today but found in separate products. As I pointed out in my analysis of information optimization, the point of visualization is to provide information optimized for effective use across any role and competency in the organization. I do believe visual discovery is critical to larger technology innovation within the business analytics market: According to our business technology innovation benchmark research, it is the top-ranked technology innovation priority in 39 percent of organizations. I believe we will continue to see advancements in the application of visualization, including making it more effective to review on tablets and integrating it with other frequently used technologies.
For business users and analysts, it is essential to assess visual discovery tools based not just on the roles of users but also on their competency in analytics, to ensure a full return on investment. Take heed of the best practices in discovery analytics that my colleague Tony Cosentino discussed recently, and pursue advanced visualization in your organization if it can be used properly. If you are passionate about visualization and presenting the real meaning of information, reading Edward Tufte’s books may provide a larger perspective; his well-recognized work examines the history and importance of presenting information properly through visualization. Rationalizing the potential tyranny of visual anarchy is critical to the success of your business analytics and big data investments.
CEO & Chief Research Officer
Cisco Systems has announced its intent to acquire Composite Software, which provides data virtualization to help IT departments interconnect data and systems; the purchase is scheduled to complete in early August. Cisco, of course, is known for its ability to interconnect just about anything with its networking technology; this acquisition will help it connect data better across networks. Over the last decade Composite had been refining the science of virtualizing data but had reached the peak of what it could do by itself, struggling to grow enough to meet the expectations of its investors, board of directors, employees, the market and visionary CEO Jim Green, who is well-known for his long commitment to improving data and technology architectures. According to press reports on the Internet, Cisco paid $180 million for Composite, which if true would be a good reward for longtime Composite employees who were substantial shareholders.
Data virtualization is among the topics we explored in our benchmark research on information management. Our findings show that the largest barrier to managing information well is the dispersion of data across too many applications and systems, which impacts two-thirds (67%) of organizations. Our research and discussions with companies suggest that data virtualization can address this problem; it is one aspect of information optimization, an increasingly important IT and business strategy, as I have pointed out. Composite has made data virtualization work in technology initiatives including business analytics and big data, and more broadly in information architecture.
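To make the idea concrete, here is a minimal toy sketch of what data virtualization means: exposing one logical view over physically separate stores, so consumers query a single table without knowing where the rows live or copying them into a warehouse. This is only an illustration of the concept using SQLite in-memory databases, not Composite’s actual product, which federates live enterprise systems; all table and database names here are hypothetical.

```python
import sqlite3

# A toy "virtualization layer": one connection federating two separate stores.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE ':memory:' AS crm")  # stand-in for a CRM system
hub.execute("ATTACH DATABASE ':memory:' AS erp")  # stand-in for an ERP system

# Data dispersed across the two "applications" (hypothetical schemas).
hub.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
hub.execute("CREATE TABLE erp.orders (customer_id INTEGER, amount REAL)")
hub.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
hub.executemany("INSERT INTO erp.orders VALUES (?, ?)",
                [(1, 250.0), (1, 100.0), (2, 75.0)])

# The virtual view: queried like one local table, resolved across both stores.
# (A TEMP view is used because SQLite lets temp views span attached databases.)
hub.execute("""CREATE TEMP VIEW customer_totals AS
               SELECT c.name, SUM(o.amount) AS total
               FROM crm.customers c
               JOIN erp.orders o ON o.customer_id = c.id
               GROUP BY c.name""")

for row in hub.execute("SELECT name, total FROM customer_totals ORDER BY name"):
    print(row)
# ('Acme', 350.0)
# ('Globex', 75.0)
```

The point the research makes is visible even in this sketch: the consumer of `customer_totals` never learns that the data is dispersed across two systems, which is precisely the barrier that impacts two-thirds of organizations.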
Since my last formal analysis of Composite, its technology has advanced significantly. One area of focus has been increasing virtualization of big data technologies, realized recently in its 6.2 SP3 release, which furthers its support of Hadoop. Over the last two years Composite has supported distributions from Apache, Cloudera and Hortonworks, interfacing with MapReduce and Hive since its 6.1 release. In the 6.2 release in 2012, Composite added improved techniques for processing analytics and virtualizing the resulting data sets, which came not only from traditional data sources but also from HP Vertica and PostgreSQL, along with improved interfaces to IBM and Teradata systems. These releases and expanded data connections have made Composite’s the most versatile data virtualization technology in the industry.
As Composite Software becomes part of Cisco, it is worth remembering that acquiring technology companies has been a major part of building the company Cisco is today; the many acquisitions have expanded its product portfolio and brought together some of the brightest minds in networking technology. Less often noted is Cisco’s mixed success in acquisitions of data-related software for enterprise use. One example is CxO Systems, whose complex event processing (CEP) and data processing technology Cisco eventually used to advance event processing across its network technologies; the software no longer exists in stand-alone form. Another is Latigent, a provider of call center reporting on detailed call data that was supposed to help Cisco cultivate more services; eventually this software was shown not to fit with Cisco’s technology products or services portfolio and disappeared. Cisco’s business model has been to market and sell to data center and networking environments, and it has not been successful in selling stand-alone software outside this area. It is not easy to compete with IBM, Oracle, SAP and others in technology with which Cisco lacks experience, and that inexperience will affect its success in monetizing the investment in Composite Software.
Emerging computing architectures will change the way IT departments operate in the future, but it’s an open question who will lead that change from traditional IT software stacks to hardware and networking technology. Cisco plans to transform the data center and networking from software and hardware stacks, partly through virtualizing software on premises and out into the Internet and cloud computing. We see IT architectures changing in a haphazard, vendor-driven fashion, while most CIOs want to revise their architectures in an efficient manner. Data and information architectures have moved beyond relying solely on databases and applications to adopting innovative technologies such as in-memory processing and grid architectures that support new trends in big data, business analytics, cloud computing, mobile technology and social collaboration. These technology innovations, and the pressure to use them in business, are straining the backbones of enterprise networks, which must adapt and grow rapidly. Most companies today still have data center and networking groups and are increasing their network bandwidth and buying new networking technology from Cisco and others. This will not change soon, but they need better intelligence on the networking of data in these technologies. This, I think, is what Cisco is looking for with the acquisition of Composite: refining the infrastructure to help virtualize data across the network. This approach can play well into what Cisco articulates as a strategic direction around the concept it calls the Internet of Everything (IoE). The challenge will be to convince CIOs that so energetic a change by Cisco is necessary for improving the connectivity of data across the network. Data virtualization will play a key role in efficiently interconnecting networks and systems, but Cisco will apply it primarily within the data center and the network, rather than at the center of the information management technology stack, where it sits today.
I think the acquisition makes sense for Composite Software, which could not grow fast enough to satisfy its technological ambitions and needed more deployments to support its innovations. Cisco, for its part, will be tested over time on what it makes of Composite. I expect that Cisco will fundamentally change Composite’s roadmap and support for enterprise IT as it adjusts the technology to work with its data center and networking technology; such a change inevitably will impact existing customers and partners. The company will place Composite in its Services organization (which, I was told, includes software); I take this as an indication that Cisco needs more maturity in bringing software acquisitions into its product and services portfolios and managing them as software, with the right level of dedicated marketing and sales resources. For those who want to learn more about data virtualization, I recommend reading Composite’s educational content in the next couple of months, before its website and material disappear from the Internet. This may sound harsh, but it is how companies like Cisco digest acquisitions, erasing the acquired company’s past and, with it, critical information. No one is saying so, and I could not get an answer, but the reality is that Composite will become one item in the long list of products and services on Cisco’s website; my guess is it will become part of the data center and virtualization services or unified computing services. To see how many of these Cisco has, just check its website.
Data virtualization is just beginning to mature and to attract CIOs and IT professionals who care about the ‘I’ in their titles that stands for information. Composite Software deserves credit for its focus on data virtualization. Our research shows that data virtualization is an advancing priority in information management but ranks lower than others among initiated projects (11%), as the chart shows. Composite’s technology arrived ahead of the full maturing of IT organizations, which limited its potential. Within Cisco its people will have to adapt to the new owner’s wishes, which will shift some of its database-centered virtualization focus toward the network, unless Cisco decides to expand the combined R&D team.
Congratulations to Composite Software for its innovative contributions; it is sad to see its independence and passion for data virtualization disappear. I will be watching to see whether Cisco understands the company’s real value, not just for data centers and networking but for bridging to enterprise IT and the market for information optimization.
CEO & Chief Research Officer