
Many businesses are close to being overwhelmed by the unceasing growth of data they must process and analyze to find insights that can improve their operations and results. To manage this big data they can choose from a rapidly expanding portfolio of technology products. A significant vendor in this market is SAS Institute. I recently attended the company's annual analyst summit, Inside Intelligence 2014 (Twitter hashtag #SASSB). SAS reported more than $3 billion in software revenue for 2013 and is known globally for its analytics software; recently it has become a more significant presence in data management as well. SAS provides applications for various lines of business and industries in areas as diverse as fraud prevention, security, customer service and marketing. To accomplish this it applies analytics to what is now called big data, though the company has many decades of experience in dealing with large volumes of data. Recently SAS set a goal to be the vendor of choice for the analytic, data and visualization software needs of Hadoop. To achieve this aggressive goal the company will have to make significant further investments not only in its products but also in marketing and sales. Our benchmark research on big data analytics shows that three out of four (76%) organizations view big data analytics as analyzing data from all sources, not just one, which sets the bar high for vendors seeking to win their business.

In the last few years SAS has been investing heavily to expand its portfolio in big data. Today its in-memory infrastructure can operate within Hadoop, execute MapReduce jobs, access the various commercial distributions of Hadoop, conduct data preparation and modeling in Hadoop and extend it to its data and visual discovery and exploration tools. SAS has architected its analytics tools and platform to use Hadoop's Pig and Hive interfaces, apply MapReduce to process large data sets and use the Hadoop Distributed File System (HDFS) to store and access the big data. To exploit Hadoop more deeply, the SAS LASR Analytic Server (part of SAS Visual Analytics) connects directly to HDFS to speed performance. SAS LASR Analytic Server is an in-memory computing platform for data processing and analysis that can scale up and operate in parallel within Hadoop to distribute the computation and data workloads. This flexibility in the architecture enables users to adapt SAS to any type of big data, especially Hadoop deployments, at just about any scale and configuration. To work with other database-oriented technologies the company has built technical partnerships not only with major players Teradata and SAP but also with the new breed of Hadoop vendors Cloudera, Hortonworks and Pivotal, as well as with IBM BigInsights. SAS also engineered access to SAP HANA, which integrates it further into SAP's data platform for analytics and other applications.
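The MapReduce pattern that SAS taps through Pig, Hive and HDFS can be shown in miniature. The sketch below is a generic, in-process word-count example in Python (not SAS code; the log lines are hypothetical), illustrating the map and reduce phases that Hadoop would run in parallel across HDFS blocks:

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, as a mapper would do for each input line
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Aggregate counts per key, as the reducer does after the shuffle/sort
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

logs = ["error timeout", "error retry", "ok"]
print(reduce_phase(map_phase(logs)))
# {'error': 2, 'timeout': 1, 'retry': 1, 'ok': 1}
```

In a real cluster the map phase runs on the nodes holding each HDFS block and only the small intermediate pairs cross the network, which is why pushing analytics into Hadoop scales.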

At the Inside Intelligence gathering, SAS demonstrated its new Visual Statistics product. Like its Visual Analytics product, this one is available online for evaluation. It offers sophisticated support for analysts and data professionals who need more than just a visually interactive analytic tool of the sort that many providers now sell. Developing a product like Visual Statistics is a smart move according to our research, which finds that predictive analytics and statistics is the most important area of big data analytics, cited by 78 percent of organizations. At this point visual and data discovery are most common, but we see that users are looking for more. SAS Visual Statistics can conduct in-memory statistical processing and compute results inside Hadoop before the data is transferred to another analytic data repository or read directly into an analytics tool. A demonstration at the analyst summit showed how these capabilities, along with the tools in SAS 9.4, could raise the bar for sophisticated analytics tools for business.
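The idea of computing statistics where the data lives, before anything is transferred, can be sketched generically. In the illustration below (not SAS's implementation; the numeric partitions are hypothetical stand-ins for HDFS blocks), each node returns only a tiny summary, and the global mean and standard deviation are assembled from those summaries:

```python
import math

def partial_stats(partition):
    # Each node summarizes only its local block: count, sum, sum of squares
    n = len(partition)
    s = sum(partition)
    ss = sum(x * x for x in partition)
    return n, s, ss

def combine(partials):
    # Merging the small summaries is cheap; the raw rows never move
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    variance = ss / n - mean * mean
    return mean, math.sqrt(variance)

parts = [partial_stats(block) for block in ([1, 2, 3], [4, 5])]
print(combine(parts))  # (3.0, 1.4142135623730951)
```

Only a few numbers per block cross the network instead of the full data set, which is the performance argument for in-database and in-Hadoop statistical processing.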

SAS also has a data management software suite for data integration, quality, mastering and governance and is working to make the product known for its big data support. This is another important area: Our research in big data analytics finds quality and consistency of data to be significant challenges for 56 percent of organizations and also that 47 percent are not satisfied with integration of information for creating big data analytics. SAS is expanding to provide data quality tools for Hadoop. Its portfolio in this area is expansive, but it should take steps to market these capabilities better, which spokespeople said it will do in 2014. Our recent research in information optimization found that organizations still spend disproportionate amounts of time preparing data (47%) and reviewing it (45%) for analytics. They need to address these difficulties to free their analysts to spend more time on analysis that produces recommendations for decision-makers and to collaborate on business improvement. SAS's efforts to integrate data and analytics should help reduce the time spent on preparation and let analysts focus on what matters.

SAS also will expand its SAS Stream Processing Engine with a new release coming by midyear. This product can process data as it is being generated, which facilitates real-time analytics – the third-most important type of big data analytics according to our research. Applying analytics in real time is the most important aspect of in-memory computing for two-thirds (65%) of organizations and is critical as SAS expands its SAS LASR Analytic Server. Our benchmark research on operational intelligence shows that the processing of event data is critical for areas like activity or event monitoring (cited by 62% of participants) and alerting and notification (59%). SAS will need to expand its portfolio in these areas, but it is delivering on what I call the four types of discovery for big data.
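Event-stream processing of this kind can be sketched in a few lines. The example below is a generic Python illustration (not the SAS Stream Processing Engine API; the readings and threshold are hypothetical) that evaluates each value as it arrives, keeping only a fixed-size window in memory rather than waiting for a later batch analysis:

```python
from collections import deque

def stream_alerts(events, window=3, threshold=100.0):
    # Keep only the last `window` readings; decide on each event as it arrives
    recent = deque(maxlen=window)
    alerts = []
    for value in events:
        recent.append(value)
        avg = sum(recent) / len(recent)
        if avg > threshold:   # moving average breached: raise an alert now
            alerts.append(avg)
    return alerts

print(stream_alerts([90, 120, 150], window=2, threshold=100))
# [105.0, 135.0]
```

The point of the pattern is latency: the alert fires the moment the moving average crosses the threshold, instead of after the data lands in a warehouse.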

SAS also is moving deeper into cloud computing, with support for both private and public clouds through investments in its own data centers. Cloud computing is an increasingly popular approach to building a sandbox environment for big data analytics. Our research finds that more than one-fifth of organizations prefer to use cloud computing in an on-demand approach. SAS will have to make even more of its big data portfolio available in the cloud or risk customers turning to Amazon and others for processing and potentially other computing uses. SAS asserts it is investing in and expanding cloud computing.

SAS's efforts to make it easier to work with big data and apply analytics are another smart bet; our research finds that most organizations today don't have enough skilled resources in this area. One way to address this gap is to design software that is more intuitive, more visual and more interactive yet sophisticated in how it works with the primitives of Hadoop; SAS is addressing this challenge. Our research finds growth of in-memory (now used by 42%) and Hadoop (50%) technologies, which will have more impact as they are applied directly to business needs and opportunities. SAS sits at the intersection of data management and analytics for big data technologies, which could position it well for further growth in revenue. SAS is betting that big data will become a focal point in many deployments and that it can help unify data and analytics across the enterprise. Our research agenda for 2014 finds this to be the big opportunity, and SAS is fixated on being the vendor of choice for it. If you have not examined how SAS can connect big data architectures and facilitate use of this important technology, it will be worth your time to do so.


Mark Smith

CEO & Chief Research Officer

Businesses are always looking for ways to grow and to streamline their operations. These two goals can come into conflict because as organizations become larger it becomes more complicated to be agile and efficient. To help them understand and modify their processes, businesses can derive insights from analytics applied to their data. Today that data is available not only in the enterprise and cloud computing environments but also from the Internet. To collect, process and analyze it all is a challenge, one that an increasing number of organizations are meeting through the use of big data technologies. The resulting insights can help them make strategic business decisions such as where to focus efforts and how to engage with customers. At Ventana Research we have been working hard to understand the advancing technology that supports big data and its value through information optimization, and to bring clarity to the industry through our research and analysis of trends and products. There are many opinions about big data, often fixated on its attributes through the V's (volume, variety and velocity) and how to use it, and often biased toward one technology or vendor; our research and analysis of the entire market cuts through the noise to provide not just facts but insights on best practices and methods for applying this technology to business problems.

There's no doubt that big data can help organizations turn their information assets into insights that are critical for achieving growth and interacting more successfully with customers. It can help them access and integrate information for business use in ways that were not possible before. Among these new methods are simpler ways to access and consume information, including search based on natural language and cognitive computing, which is bringing advanced science to the processing of information. Big data also enables more effective visualization to support discovery and exploratory analytics. Machine data can be used to gain insights into the workings of technology that directly impacts the operations of the organization. We assert that big data drives operational efficiency through effective processing technology that expands the use of information through analytics and these innovative computing methods. The importance of these advances is shown in our recent benchmark research on information optimization, in which two-thirds (67%) of organizations said that improving operational efficiency is the leading reason to change how they make information available.

In 2013, large steps forward were made in big data technology. We saw the beginning of convergence among technological approaches as Hadoop, in-memory processing and data appliances intersect with specialized and traditional database systems. Users are learning how to gain value through insights from these technology investments. While we saw advances in visualization as shown in our predictive analytics research, using most of these tools requires advanced skills to ensure that the data is interpreted properly for facilitating actions, let alone decisions. Many organizations lack these skills in-house, our research shows.

Thus one challenge for 2014 is to acquire the competencies needed to get the best possible information from big data. Another is to improve processes for information optimization so that data, even about real-time events, can flow to business users and reduce the time it takes for them to use it effectively. New platforms and services can help make more types of information easier to understand and interact with. This is complemented by collaboration tools that can operate across mobile devices and get more information to more people wherever they are. We note also that unstructured data such as documents, images and text now is part of the information requirements for more than half of organizations that could use big data technology.

All of these tools and efforts toward information optimization can be useful as organizations try to improve the consistency and governance of their business-related information assets in key areas such as product information management, reduce duplication and conflicts in information about customers and employees, and maintain a focus on governance, risk and compliance (GRC). Failing to address these issues can lead to lost revenue, dissatisfied customers and decreased efficiency, all of which impact profitability. Our focus on product information management will continue as it is in high demand: We will release a new Value Index on the topic in 2014 to assess vendors and products and guide potential purchasers. There also have been advances in applications to help businesses manage information; for example, the use of master data management and data governance can help increase accuracy and outcomes from related business processes. In 2014 we will assess the current market for master data management as it impacts both business and IT through new benchmark research and continued coverage of technology developments. Because data no longer resides only in the enterprise, we will continue to track advances in using cloud computing for business applications and in accessing and integrating the large amounts of data there. Cloud data is now a major factor in many organizations, and we will reassess the challenges of data in the cloud through new benchmark research to determine where investments and processes have progressed and where they still need to be improved.

Mainstream use of big data is leading organizations to invest in creating a new stream of information processes that can meet a variety of business needs. We have a whole line of new next-generation analytics research in specific lines of business; the first are for finance, customer and human capital, and other areas planned for 2014 are sales and marketing, which are still in the early adoption phase of big data but have generated significant interest in its potential.

Organizations have several options in the types of big data technology they choose. Some are using Hadoop and commercial versions of it from a variety of providers, others are using big data appliances, and still others are using in-memory computing and specialized databases. Our latest research finds that in 2014 more organizations plan to use Hadoop (24%), in-memory tools (23%) and appliances (15%) than will use an RDBMS (10%). This variety of technologies is generating new best practices in big data and how to use the resulting information. Big data is a critical innovative technology now important to 41 percent of organizations. Please consult my colleague Tony Cosentino's research agenda in business analytics and big data, which outlines new methods to discover and explore big data effectively; he also has new research coming on big data analytics.

It is now possible for organizations to learn and apply best practices in using big data effectively across business and IT. Once that is accomplished, users can examine how to make information flow easily into applications and optimize its use, too. At this point, however, many have yet to focus on integration of data and sources to ensure an efficient flow both in the enterprise and throughout cloud computing. This challenge we call big data integration, and we are conducting new research to identify the issues, methods and best practices here. Our research shows that for the more than four out of five organizations with 16 or more sources of data, it is important to simplify access to and use of it all, and integration will help address this challenge. This year we will also publish our annual Value Index on vendors and products in data integration, expanded to include the requirements of big data and cloud computing. The availability of technology for data integration is fueling new efforts in what I call integration optimization, which even includes handling real-time event streams in complex event processing to produce operational intelligence. We have research on this and an upcoming Value Index. Use of this real-time information and analytics is helping organizations respond to issues more quickly than waiting for historical analysis.

We expect significant advances in big data and information optimization in 2014. In this market, however, most organizations lack experience and skilled resources. We warn them not to investigate big data just because “everyone is doing it.” That is not necessarily true, so we recommend first reviewing the processes in your organization in which it can provide value, and then look at the architectural and technological approaches that best fit your needs and available resources.


Mark Smith

CEO & Chief Research Officer
