Teradata continues to expand its information management and analytics technology for big data to meet growing demand. My analysis last year discussed Teradata’s approach to big data in the context of its distributed computing and data architecture. I recently got an update on the company’s strategy and products at the annual Teradata analyst summit. Our big data analytics research finds that a broad approach to big data is wise: Three-quarters of organizations want analytics to access data from all sources, not just one specific to big data. Teradata has designed its architecture and technology around this inclusive approach to managing the access, storage and use of data and analytics.
I had the pleasure of attending Cloudera’s recent analyst summit. Presenters reviewed the work the company has done since its founding six years ago and outlined its plans to use Hadoop to further empower big data technology to support what I call information optimization. Cloudera’s executive team includes co-founders who developed and used Hadoop while working at Facebook, Oracle and Yahoo. Last year they brought in CEO Tom Reilly, who led successful organizations at ArcSight, HP and IBM. Cloudera now has more than 500 employees, 800 partners and 40,000 users trained in its commercial version of Hadoop. The Hadoop technology has brought to market an integration of computing, memory and disk storage; Cloudera has expanded the capabilities of this open source software for its customers through extensions that commercialize it for enterprise use. The importance of big data is undisputed now: For example, our latest research in big data analytics finds it to be very important in 47 percent of organizations. However, we also find that only 14 percent are very satisfied with their use of big data, so there is plenty of room for improvement. How well Cloudera moves forward this year and next will determine its ability to compete in big data over the next five years.
At this year’s annual SAP user conference, SAPPHIRE, the technology giant showed advances in its cloud and in-memory computing efforts. It has completed the migration of its conventional application suite and portfolio of tools to operate on SAP HANA, its in-memory computing platform, and made improvements in its cloud computing environment, SAP HANA Enterprise Cloud. The last time I analyzed SAP HANA was when it won our firm’s 2012 Overall IT Technology Innovation Award. HANA has now evolved from a database technology into a broad platform. SAP wisely consolidated the efforts previously known as SAP NetWeaver into SAP HANA. This resolves some of the confusion regarding HANA and NetWeaver in the cloud, which I assessed previously. The recently announced SAP HANA Platform now provides the enterprise class of HANA implementation in the cloud. It comes with a trial edition of the data and visual discovery technology now called SAP Lumira, whose price has been reduced to encourage adoption (and which I discuss more below). The use of in-memory databases for big data is accelerating: According to our technology innovation research, 22 percent of organizations are planning to use this technology over the next two years, and through 2015 it will have a higher growth rate than other approaches.
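To make the appeal of in-memory analytics concrete, here is a minimal sketch in Python of columnar, in-memory computation of the kind an in-memory platform performs at scale. The data, column names and layout are hypothetical illustrations, not SAP HANA’s actual API or storage format.

```python
# Toy illustration of columnar, in-memory analytics. Real in-memory
# platforms apply the same idea at far larger scale with compression
# and parallel scans; everything here is a hypothetical example.

# Row store: one tuple per record (month, region, revenue).
row_store = [
    ("2012-01", "EMEA", 120.0),
    ("2012-01", "APAC",  95.5),
    ("2012-02", "EMEA", 130.25),
]

# Column store: one list per attribute, so scanning a single measure
# touches only that column's values, all resident in memory.
column_store = {
    "month":   [r[0] for r in row_store],
    "region":  [r[1] for r in row_store],
    "revenue": [r[2] for r in row_store],
}

# Aggregating revenue scans just one contiguous in-memory column,
# with no disk I/O and no need to read the other attributes.
total_revenue = sum(column_store["revenue"])
print(total_revenue)  # 345.75
```

The design point is that analytic queries typically aggregate a few columns over many rows, so a columnar in-memory layout avoids both disk latency and reading attributes the query never uses.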
Teradata recently gave me a technology update and a peek into the future of its portfolio for big data, information management and business analytics at its annual technology influencer summit. The company continues to innovate and build upon its Teradata 14 releases and its new processing technology. Since my last analysis of Teradata’s big data strategy, it has embraced technologies like Hadoop with its Teradata Aster Appliance, which won our 2012 Technology Innovation Award in Big Data. Teradata is steadily extending beyond providing just big data technology to offer a range of analytic options and appliances through advances in Teradata Aster and its overall data and analytic architectures. One example is its data warehouse appliance business, which according to our benchmark research employs one of the key technological approaches to big data. In addition, Teradata now supports in-memory databases, specialized databases and Hadoop in one integrated architecture through its own technology offerings. It takes an enterprise management approach to these technologies through Teradata Viewpoint, which helps monitor and manage systems and supports a more distributed computing architecture.
Business is starting to realize that taking advantage of big data is not just technically feasible but affordable for organizations of all sizes. However, as outlined in our agenda on big data and information optimization, the technology must be engineered to meet the information needs of business. Hortonworks has been steadily advancing the big data technology called Hadoop and contributing its developments back to the Apache Software Foundation across a range of projects. The company performs enterprise-level testing to ensure Hadoop not only operates but also scales across operating systems, cloud computing environments, virtual machines and appliances. Over the last year Hortonworks has released a number of certifications and benchmarks for an enterprise-ready version of Hadoop for which it provides support and services. These are important steps toward meeting the needs of IT management, the audience that evaluates big data technologies in 66 percent of organizations according to our big data research.
Using Hadoop just got easier, thanks to Teradata’s introduction of SQL-H, a new query interface for analyzing data in Hadoop. Most Hadoop access methods require preprocessing and staging of data from the Hadoop Distributed File System (HDFS) using technologies such as MapReduce. These approaches demand new skills and technologies, adding time and cost that offset the benefits of Hadoop, which according to our big data benchmark research include increasing the speed of analysis. Teradata has announced SQL-H support not only for its own Aster Database 5.0, which it expects to release in the third quarter, but also for the commercial distribution of Hadoop from Hortonworks.
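To illustrate why a declarative SQL interface reduces the effort described above, the sketch below contrasts a hand-coded MapReduce-style aggregation with a single SQL statement. It is a minimal Python analogy: sqlite3 merely stands in for a SQL-on-Hadoop layer such as SQL-H (which queries HDFS-resident data, not a local database), and the table and data are hypothetical.

```python
import sqlite3

# Hypothetical clickstream rows that, in a real deployment, would sit in HDFS.
rows = [("home", 3), ("search", 5), ("home", 2), ("cart", 1)]

# --- MapReduce style: the developer writes map and reduce phases by hand ---
def map_phase(records):
    # Emit (key, value) pairs, one per input record.
    for page, visits in records:
        yield page, visits

def reduce_phase(pairs):
    # Sum values per key, as a reducer would.
    totals = {}
    for page, visits in pairs:
        totals[page] = totals.get(page, 0) + visits
    return totals

mr_totals = reduce_phase(map_phase(rows))

# --- SQL style: one declarative statement expresses the same aggregation ---
# sqlite3 is only a local stand-in for a SQL-on-Hadoop interface here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (page TEXT, visits INTEGER)")
conn.executemany("INSERT INTO clicks VALUES (?, ?)", rows)
sql_totals = dict(conn.execute(
    "SELECT page, SUM(visits) FROM clicks GROUP BY page"))

assert mr_totals == sql_totals  # both approaches agree on the per-page totals
```

The point of the contrast is skills and time: the SQL version needs only a query, while the MapReduce version requires writing, testing and maintaining procedural phases, which is the extra cost SQL-H aims to remove for analysts who already know SQL.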