
At Teradata's recent annual Partners user conference, the company outlined its expanding role as a provider of distributed information architecture technology. My colleague Tony Cosentino assessed Teradata's business analytics and big data strategy, but there is more under the covers regarding the company's expanding role in big data and enterprise architectures. Over the last several decades Teradata has been known for its enterprise data warehouse appliances, most recently the new Teradata 2700 data warehouse appliance, which uses the latest multicore Intel processors. Now, as organizations continue to invest in distributed approaches that store and utilize data across a range of appliances and Hadoop-based big data technology, Teradata has begun to provide integration with Hadoop, including a direct connector and commercialized distributions offered in partnership with Cloudera and Hortonworks. Earlier this year, for instance, Teradata formed a partnership with Hortonworks that provides a commercialized edition of open source Hadoop, which is now further integrated with Teradata's products.

Teradata is expanding its big data portfolio in two significant areas to which IT organizations should pay close attention. The first is its evolving line of analytic and data appliances. This expansion accelerated over the last couple of years thanks to Teradata's acquisition of Aster Data, which gave Teradata an anchor point for accessing data across the enterprise, as I recently assessed. Aster Data provides a unified approach to accessing Hadoop data from analytics or business intelligence tools through SQL-H, which uses the Apache HCatalog metadata catalog and patented SQL-to-MapReduce technology. The first fruit of this technology integration, the Teradata Aster Big Analytics Appliance, was announced this month. It provides high levels of processing power for a range of analytic applications that need access to unstructured data, with Hortonworks Hadoop embedded and integrated in the appliance. This integration is unique in its potential value for organizations that utilize big data across the enterprise, which is why we recently awarded it our 2012 Technology Innovation Award for Big Data.
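To make the SQL-H idea more concrete, here is a minimal sketch of how an analytics or BI tool can reach Hadoop-resident data through an ordinary SQL connection rather than hand-written MapReduce. It is an illustration only: the DSN, credentials, table and column names are hypothetical placeholders, and the exact syntax for surfacing an HCatalog-registered table varies by Aster release.

```python
# Minimal sketch: querying Hadoop-resident data through a SQL interface
# over ODBC, the way SQL-H exposes HCatalog tables to analytics and BI tools.
# The DSN "aster_dsn", credentials and the "weblogs" table are hypothetical
# placeholders, not actual Aster objects.
import pyodbc

conn = pyodbc.connect("DSN=aster_dsn;UID=analyst;PWD=secret")
cursor = conn.cursor()

# Ordinary SQL against a table that, in an SQL-H setup, would be backed by
# files in Hadoop and described in the HCatalog metadata catalog.
cursor.execute(
    """
    SELECT page, COUNT(*) AS visits
    FROM weblogs
    WHERE visit_date >= '2012-01-01'
    GROUP BY page
    ORDER BY visits DESC
    """
)

# Print the ten most-visited pages.
for page, visits in cursor.fetchmany(10):
    print(page, visits)

conn.close()
```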

The second area of expansion is Teradata's portfolio of data management software, which advances support for a unified data environment that helps manage the enterprise architecture for big data and meets an organization's information management needs. Teradata's new set of data management tools, Teradata Unity, supports a range of synchronization, data loading, monitoring and data-moving operations. These tools are critical to ensuring access to and integration of data across the enterprise and to meeting the performance and scalability needs of analytic architectures. I personally like the simplicity of Teradata Viewpoint, which provides an integrated management console for examining the performance and scalability of the company's data appliances. These new management tools address what our big data research finds to be one of the largest impediments to taking advantage of big data: the staffing and training required to support implementations. Teradata has also advanced time-series analytics with Teradata Temporal, which helps capture how data changes over time. Together these tools will ease the administration of data environments.
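As a rough illustration of what temporal support gives administrators and analysts, the sketch below asks what a record looked like on a particular date by filtering on validity periods. The DSN, table and column names are hypothetical, and the SQL is generic validity-period filtering rather than Teradata Temporal's exact VALIDTIME syntax.

```python
# Rough sketch of a point-in-time ("as of") query in the spirit of
# Teradata Temporal. The DSN, table and column names are hypothetical,
# and the SQL is generic rather than Teradata's exact temporal syntax.
import pyodbc

conn = pyodbc.connect("DSN=teradata_dsn;UID=dba;PWD=secret")
cursor = conn.cursor()

# Return each customer's credit limit as it stood on 2012-06-30.
cursor.execute(
    """
    SELECT customer_id, credit_limit
    FROM customer_history
    WHERE valid_from <= DATE '2012-06-30'
      AND valid_to   >  DATE '2012-06-30'
    """
)

for customer_id, credit_limit in cursor.fetchall():
    print(customer_id, credit_limit)

conn.close()
```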

Teradata also has a vibrant ecosystem of analytic and business intelligence technology that interoperates with its portfolio of data technologies. Dozens of partners were at the conference exhibiting their integration with and access to Teradata. Several technology vendor announcements demonstrating integration with Teradata are worth mentioning. Alteryx announced an update to its version 8 that further supports Teradata and Aster at the highest levels of performance. QlikView announced forthcoming access to Teradata among the advancements in QlikView Version 11. SAS continues its integration and work with Teradata, bringing advanced analytics to Teradata environments. The company also highlighted individuals taking advantage of these technologies, whom it calls Analytic Heroes.

The key point for IT organizations is Teradata's focus on big data: the company has embedded, integrated and expanded the value of Hadoop within its architecture and supports it globally in a 24×7 operation. I continue to see its overall success, as we noted in our recent 2012 Leadership Awards, where we highlighted contributions by organizations using Teradata, including HMS, Nationwide, RBC and the United States Department of Agriculture. Teradata's role as part of enterprise architecture has become more open than ever, and in many ways more open than that of Oracle, which pushes its Exadata and Exalytics integrated vertical appliances and an Oracle-only approach to IT that my colleague recently assessed. While many other vendors are only now announcing big data technology, Teradata has been providing this support and these appliances successfully for decades. Our research shows that retaining and analyzing more data is the top benefit organizations gain from big data, and to that end Teradata is offering new hardware and software approaches. Organizations evaluating big data technology should take note that the company's approach is open and can integrate with existing enterprise architectures and investments to meet business and IT needs today and into the future.

Regards,

Mark Smith

CEO & Chief Research Officer

Kognitio has been serving the analytics and data needs of organizations for more than 20 years with an in-memory analytics platform that meets many of the big-data needs of today's organizations. The Kognitio Analytical Platform provides a unique massively parallel processing (MPP) in-memory database that can rapidly load data and calculate analytics; it is available both as an analytical software appliance and via cloud computing.

Its software can be installed on commodity x86 servers or used in the cloud on Amazon Web Services. Data can be loaded quickly with the company's own tools or through data integration tools such as Informatica's. Kognitio's in-memory platform can take data from a disparate set of sources and process it using analytics that operate simultaneously across processors. As we noted last year, Kognitio expanded to support MDX analytic query expressions, which through its Microsoft Excel interface can support specific types of analytics faster than SQL alone. The challenge is that the use of MDX has not grown significantly among business analysts, nor is it widely supported by other BI and analytics products. Kognitio has also worked to reduce the administrative complexity of its product; the appliance does not require a DBA to manage indexes or partitions.
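For readers evaluating the platform, the following sketch shows the general shape of client access to an MPP in-memory database such as Kognitio: connect over a standard interface such as ODBC and issue ordinary SQL that the engine parallelizes across processors. The DSN, credentials and table are hypothetical placeholders; loading the data itself would be handled by the vendor's own tools or a data integration product, as noted above.

```python
# Minimal sketch of querying an in-memory MPP analytical database such
# as Kognitio over ODBC. The DSN, credentials and "sales" table are
# hypothetical placeholders; the data is assumed to have been loaded
# already with the vendor's tools or a data integration product.
import pyodbc

conn = pyodbc.connect("DSN=kognitio_dsn;UID=analyst;PWD=secret")
cursor = conn.cursor()

# A typical aggregation the engine can spread across many processors,
# since the data is already resident in memory.
cursor.execute(
    """
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    ORDER BY total_revenue DESC
    """
)

for region, total_revenue in cursor.fetchall():
    print(region, total_revenue)

conn.close()
```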

In light of the significant growth in the use of Hadoop to process large volumes of data, Kognitio has formed a partnership with Hortonworks to integrate its technology with Hadoop through an accelerator. I assessed the Hortonworks Hadoop Summit and how Hadoop can accelerate the processing of analytics and data. Use of Hadoop is growing fast in big-data technology ecosystems; almost one-third of organizations now plan to use it, according to our big-data benchmark research. Kognitio has also expanded its partnerships to reach analysts; for instance, it established a partnership with Alteryx, which provides a unique workflow- and process-centric approach to analytics that I analyzed. It also announced a partnership and integration with Advanced Visual Systems for visualizing large volumes of data such as consumer behavior and social media. These partnerships are critical for Kognitio, and it needs to explain their value better to prospective buyers and promote them more actively; they have yet to appear in the partner listing on its website.

Kognitio has struggled over the last ten years to grow as much as its unique technology probably deserved. Last year the U.K.-based company brought in new management to help guide its growth, especially in the United States. It also established new pricing for its technology based on the amount of memory used to process data. My analysis is that management needs to publicize more directly how the performance and scalability of its approach compare with others' to ensure it remains relevant in the conversation about big data and business analytics. Increasing its visibility will also help it reach the professionals who evaluate big-data technology, who according to our research are mostly in the IT organization.

Kognitio has cost/benefit and computing efficiency advantages that should be very important to IT. Organizations usually know that analytics provide benefits, but they are also looking for efficiency: streamlining their information architecture and providing operational support for business analysts, who want to be freed from manual data preparation and given more automation so they can focus on analysis. If you are examining ways to improve response times for analytics on big data while decreasing the cost of doing so, you should look at Kognitio.

Regards,

Mark Smith – CEO & Chief Research Officer
