Big data has become a big deal as the technology industry has invested tens of billions of dollars to create the next generation of databases and data processing. After the accompanying flood of new categories and marketing terminology from vendors, most in the IT community are now beginning to understand the potential of big data. Ventana Research thoroughly covered the evolving state of the big data and information optimization sector in 2014 and will continue this research in 2015 and beyond. As it progresses, the importance of making big data systems interoperate with existing enterprise and information architecture, along with digital transformation strategies, becomes critical. Done properly, companies can take advantage of big data innovations to optimize their established business processes and execute new business strategies. But deploying big data and applying analytics to understand it is just the beginning. Innovative organizations must go beyond the usual exploratory and root-cause analyses through applied analytic discovery and other techniques. This of course requires them to develop competencies in information management for big data.
Topics: Big Data, MapR, Predictive Analytics, SAP, Human Capital, Mulesoft, Operational Performance Management (OPM), Paxata, SnapLogic, Splunk, Business Analytics, Cloud Computing, Cloudera, Hadoop, Hortonworks, IBM, Informatica, Information Management, Operational Intelligence, Oracle, Business Intelligence (BI), Business Performance Management (BPM), Customer Performance Management (CPM), Datawatch, Dell Boomi, Financial Performance Management (FPM), Information Management (IM), Information Optimization, Sales Performance Management (SPM), Savi, Sumo Logic, Supply Chain Performance Management (SCPM), Tamr, Trifacta
Big data has great promise for many organizations today, but they also need technology to facilitate integration of various data stores, as I recently pointed out. Our big data integration benchmark research makes it clear that organizations are aware of the need to integrate big data, but most have yet to address it: in this area our Performance Index analysis, which assesses the competency and maturity of organizations, concludes that only 13 percent reach the highest of four levels, Innovative. Furthermore, while many organizations are sophisticated in dealing with information itself, they are less able to handle the people-related areas, lacking the right level of training in the skills required to integrate big data. Most said that the training they provide is only somewhat adequate or inadequate.
Topics: Big Data, Operational Performance Management (OPM), Business Analytics, CIO, Cloud Computing, Data Integration, Hadoop, Business Intelligence (BI), Business Performance Management (BPM), Information Applications (IA), Information Management (IM)
I had the pleasure of attending Cloudera’s recent analyst summit. Presenters reviewed the work the company has done since its founding six years ago and outlined its plans to use Hadoop to further empower big data technology to support what I call information optimization. Cloudera’s executive team includes co-founders who developed and used Hadoop while working at Facebook, Oracle and Yahoo. Last year they brought in CEO Tom Reilly, who led successful organizations at ArcSight, HP and IBM. Cloudera now has more than 500 employees, 800 partners and 40,000 users trained in its commercial version of Hadoop. The Hadoop technology has brought to the market an integration of computing, memory and disk storage; Cloudera has expanded the capabilities of this open source software for its customers through unique extension and commercialization of open source for enterprise use. The importance of big data is undisputed now: for example, our latest research in big data analytics finds it to be very important in 47 percent of organizations. However, we also find that only 14 percent are very satisfied with their use of big data, so there is plenty of room for improvement. How well Cloudera moves forward this year and next will determine its ability to compete in big data over the next five years.
Topics: Big Data, Teradata, Zoomdata, Cloudera, Hadoop, Hortonworks, IBM, Location Intelligence, Operational Intelligence, Oracle, Business Intelligence (BI), Hive, Impala, Information Applications (IA), Information Management (IM), IT Performance Management (ITPM)
Cisco Systems has announced its intent to acquire Composite Software, which provides data virtualization to help IT departments interconnect data and systems; the purchase is scheduled to complete in early August. Cisco of course is known for its ability to interconnect just about anything with its networking technology; this acquisition will help it connect data better across networks. Over the last decade Composite had refined the science of virtualizing data but had reached the peak of what it could do by itself, struggling to grow enough to meet the expectations of its investors, board of directors, employees, the market and visionary CEO Jim Green, who is well known for his long commitment to improving data and technology architectures. According to press reports on the Internet, Cisco paid $180 million for Composite, which if true would be a good reward for people who have worked at Composite for some time and who were substantial shareholders.
Topics: Big Data, Networking, Operational Performance Management (OPM), Business Analytics, Cloud Computing, Data Management, Governance, Risk & Compliance (GRC), Hadoop, Information Management, Business Performance Management (BPM), Cisco, Composite Software, Data Centers, Data Virtualization, Information Applications (IA), Information Management (IM), Information Optimization, Internet of Everything, IT Performance Management (ITPM)
Teradata recently gave me a technology update and a peek into the future of its portfolio for big data, information management and business analytics at its annual technology influencer summit. The company continues to innovate and build upon its Teradata 14 releases and its new processing technology. Since my last analysis of Teradata’s big data strategy, it has embraced technologies like Hadoop with its Teradata Aster Appliance, which won our 2012 Technology Innovation Award in Big Data. Teradata is steadily extending beyond providing just big data technology to offer a range of analytic options and appliances through advances in Teradata Aster and its overall data and analytic architectures. One example is its data warehouse appliance business, which according to our benchmark research is one of the key technological approaches to big data; Teradata has also advanced its own technology offerings for in-memory databases, specialized databases and Hadoop in one integrated architecture. It is taking an enterprise management approach to these technologies through Teradata Viewpoint, which helps monitor and manage systems and supports a more distributed computing architecture.
Topics: Big Data, MicroStrategy, SAS, Tableau, Teradata, Customer Excellence, Operational Performance Management (OPM), Business Analytics, CIO, Cloud Computing, Hadoop, In-Memory Computing, Location Intelligence, Operational Intelligence, Business Intelligence (BI), CMO, Customer Performance Management (CPM), Discovery, Information Applications (IA), Information Management (IM), Intelligent Memory, Teradata Aster, visual discovery
I recently attended the annual Informatica analyst summit to get the latest on that company’s strategy and plans. The data integration provider offers a portfolio of information management software that supports today’s big data and information optimization needs. Informatica is busy making changes in its presentation to the market and its marketing and sales efforts. New executives, including new CMO Marge Breya, are working to communicate what is possible with Informatica’s product portfolio, and it’s more than just data integration.
Topics: Big Data, Data Quality, Master Data Management, MDM, Business Analytics, Cloud Computing, Data Governance, Data Integration, Data Management, Governance, Risk & Compliance (GRC), Hadoop, Informatica, Information Management, Location Intelligence, Operational Intelligence, Business Intelligence (BI), CEP, Informatica Cloud, Information Management (IM), IT Performance Management (ITPM), Salesforce
Data is a commodity in business. To become useful information, data must be put into a specific business context. Without information, today’s businesses can’t function. Without the right information, available to the right people at the right time, an organization cannot make the right decisions, take the right actions, or compete effectively and prosper. Information must be crafted and made available to employees, customers, suppliers, partners and consumers in the forms they want it at the moments they must have it. Optimizing information in this manner is essential to business success. Yet I see organizations today focusing their investments on big data because they believe it can effortlessly bring analysts insights. That premise is incorrect.
Topics: Appliances, Big Data, Analyst, Business Analytics, Cloud Computing, Hadoop, In-Memory Computing, Information Management, Information Applications (IA), Information Management (IM), Information Optimization
The big-data landscape just got a little more interesting with the release of EMC’s Pivotal HD distribution of Hadoop. Pivotal HD takes Apache Hadoop and extends it with a data loader and command center capabilities to configure, deploy, monitor and manage Hadoop. Pivotal HD, from EMC’s Pivotal Labs division, integrates with Greenplum Database, a massively parallel processing (MPP) database from EMC’s Greenplum division, and uses HDFS as the storage technology. The combination should help organizations realize a key part of big data’s value: information optimization.
Topics: Big Data, EMC, MapR, HAWQ, HDFS, Pivotal HD, Business Analytics, Cloud Computing, Cloudera, Hadoop, Hortonworks, Location Intelligence, Business Intelligence (BI), Cirro, Hive, Information Applications (IA), Information Management (IM), Tableau Software
Business is starting to realize that taking advantage of big data is not just technically feasible but affordable for organizations of all sizes. However, as outlined in our agenda on big data and information optimization, the technology must be engineered to meet the information needs of the business. Hortonworks has been steadily advancing the big data technology Hadoop and contributing its developments back to the Apache Software Foundation across a range of projects. The company performs enterprise-level testing to ensure Hadoop not only operates but scales across operating systems, cloud computing, virtual machines and appliances. Over the last year Hortonworks has released a number of certifications and benchmarks for an enterprise-ready version of Hadoop for which it provides support and services. These are important steps forward in meeting the needs of IT management, which is the audience evaluating big data technologies in 66 percent of organizations according to our big data research.
Topics: Big Data, Microsoft, Talend, Teradata, Apache, Microsoft HDInsight, Simba, Strata Conference, Business Analytics, Cloud Computing, Hadoop, Hortonworks, Informatica, HDP, Hive, Information Applications (IA), Information Management (IM), Tez
LucidWorks addresses the growing volume of information now being stored in the enterprise and in big data systems with two enterprise products built on search technology. Though you may not be familiar with LucidWorks (previously known as Lucid Imagination), the company has for many years contributed to Apache Lucene, an open source search project, and has commercialized and supported it for business use.
Topics: Big Data, MapR, Business Analytics, Business Intelligence, Business Mobility, Cloud Computing, Cloudera, Hadoop, Hortonworks, Business Intelligence (BI), Customer Performance Management (CPM), Information Management (IM), Sales Performance Management (SPM), Search, Workforce Performance Management (WPM)