
Many businesses are close to being overwhelmed by the unceasing growth of data they must process and analyze to find insights that can improve their operations and results. To manage this big data they can choose from a rapidly expanding portfolio of technology products. A significant vendor in this market is SAS Institute. I recently attended the company's annual analyst summit, Inside Intelligence 2014 (Twitter hashtag #SASSB). SAS reported more than $3 billion in software revenue for 2013 and is known globally for its analytics software; recently it has become a more significant presence in data management as well. SAS provides applications for various lines of business and industries in areas as diverse as fraud prevention, security, customer service and marketing. To accomplish this it applies analytics to what is now called big data, though the company has many decades of experience in dealing with large volumes of data. Recently SAS set a goal to be the vendor of choice for the analytic, data and visualization software needs of Hadoop. To achieve this aggressive goal the company will have to make significant further investments not only in its products but also in marketing and sales. Our benchmark research on big data analytics shows that three out of four (76%) organizations view big data analytics as analyzing data from all sources, not just one, which sets the bar high for vendors seeking to win their business.

In the last few years SAS has been investing heavily to expand its portfolio in big data. Today its in-memory infrastructure can operate within Hadoop, execute MapReduce jobs, access the various commercial distributions of Hadoop, conduct data preparation and modeling in Hadoop and extend that work to SAS's data and visual discovery and exploration tools. SAS has architected its analytics tools and platform to use Hadoop's Pig and Hive interfaces, apply MapReduce to process large data sets and use the Hadoop Distributed File System (HDFS) to store and access big data. To exploit Hadoop more deeply, the SAS LASR Analytic Server (part of SAS Visual Analytics) connects directly to HDFS to speed performance. SAS LASR Analytic Server is an in-memory computing platform for data processing and analysis that can scale up and operate in parallel within Hadoop to distribute the computation and data workloads. This flexibility in the architecture enables users to adapt SAS to any type of big data, especially Hadoop deployments, at just about any scale and configuration. To work with other database-oriented technologies the company has built technical partnerships not only with major players Teradata and SAP but also with the new breed of Hadoop vendors Cloudera, Hortonworks and Pivotal, as well as with IBM BigInsights. SAS also engineered access to SAP HANA, which integrates it further into SAP's data platform for analytics and other applications.
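For readers unfamiliar with the MapReduce model that SAS pushes its processing into, the idea can be sketched generically. The following minimal Python word count simulates the map, shuffle and reduce phases on a single machine; it is an illustration of the programming model only, not SAS or Hadoop code:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (word, 1) pairs for each input record, as a Hadoop mapper would."""
    for record in records:
        for word in record.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all values by key, as Hadoop does across nodes."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: aggregate each key's values into a single result."""
    return {key: sum(values) for key, values in grouped.items()}

records = ["big data analytics", "big data in Hadoop"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["big"])  # 2
```

In a real cluster the map and reduce phases run in parallel across nodes and the shuffle moves data over the network; the appeal of in-memory engines such as LASR is precisely that they avoid repeated disk-bound passes of this kind.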

At the Inside Intelligence gathering, SAS demonstrated its new Visual Statistics product. Like Visual Analytics, it is available online for evaluation. It offers sophisticated support for analysts and data professionals who need more than the visually interactive analytic tools that many providers now sell. Developing a product like Visual Statistics is a smart move according to our research, which finds that predictive analytics and statistics is the most important area of big data analytics, cited by 78 percent of organizations. At this point visual and data discovery are most common, but we see that users are looking for more. SAS Visual Statistics can conduct in-memory statistical processing and compute results inside Hadoop before the data is transferred to another analytic data repository or read directly into an analytics tool. A demonstration at the analyst summit showed how these capabilities, along with the tools in SAS 9.4, could raise the bar for sophisticated analytics tools for business.

SAS also has a data management software suite for data integration, quality, mastering and governance and is working to make the product known for its big data support. This is another important area: Our research in big data analytics finds quality and consistency of data to be significant challenges for 56 percent of organizations and also that 47 percent are not satisfied with integration of information for creating big data analytics. SAS is expanding to provide data quality tools for Hadoop. Its portfolio is expansive in this area, but it should take steps to market these capabilities better, which spokespeople said it will do in 2014. Our recent research in information optimization found that organizations still spend disproportionate amounts of time preparing data (47%) and reviewing it (45%) for analytics. They need to address these difficulties to free their analysts to spend more time on analysis that produces recommendations for decision-makers and to collaborate on business improvement. SAS's efforts to integrate data and analytics should help reduce the time spent on preparation and help analysts focus on what matters.

SAS also will expand its SAS Stream Processing Engine with a new release coming by midyear. This product can process data as it is being generated, which facilitates real-time analytics, the third-most important type of big data analytics according to our research. Applying analytics in real time is the most important aspect of in-memory computing for two-thirds (65%) of organizations and is critical as SAS expands its SAS LASR Analytic Server. Our benchmark research on operational intelligence shows that the processing of event data is critical for areas like activity or event monitoring (said 62% of participants) and alerting and notification (59%). SAS will need to expand its portfolio in these areas, but it is delivering on what I call the four types of discovery for big data.
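Event monitoring with alerting, the use cases our research ranks highest, reduces to evaluating conditions over a sliding window of events as they arrive. The sketch below is a generic Python illustration of that pattern; the class, window size and threshold are illustrative assumptions, not SAS Stream Processing Engine functionality:

```python
from collections import deque

class EventMonitor:
    """Minimal sliding-window monitor: signal an alert when the count of
    error events in the last `window` seconds exceeds `threshold`."""

    def __init__(self, window=60, threshold=3):
        self.window = window
        self.threshold = threshold
        self.errors = deque()  # timestamps of recent error events

    def observe(self, timestamp, level):
        """Process one event as it arrives; return True if an alert should fire."""
        if level == "error":
            self.errors.append(timestamp)
        # Evict events that have aged out of the window.
        while self.errors and self.errors[0] <= timestamp - self.window:
            self.errors.popleft()
        return len(self.errors) > self.threshold

monitor = EventMonitor(window=60, threshold=3)
# Four errors in 30 seconds trip the alert; a later one, alone in its window, does not.
alerts = [monitor.observe(t, "error") for t in (0, 10, 20, 30, 120)]
print(alerts)  # [False, False, False, True, False]
```

Production stream processors add distribution, fault tolerance and richer window semantics, but the core idea of computing on data in motion rather than at rest is the same.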

SAS also is moving deeper into cloud computing, with support for both private and public clouds through investments in its own data centers. Cloud computing is an increasingly popular approach to building a sandbox environment for big data analytics. Our research finds that more than one-fifth of organizations prefer to use cloud computing in an on-demand approach. SAS will have to provide even more of its portfolio for big data in the cloud or risk customers turning to Amazon and others for processing and potentially other computing uses; the company asserts it is investing and expanding here.

SAS's efforts to make it easier to work with big data and apply analytics are another smart bet; our research finds that most organizations today don't have enough skilled resources in this area. One way to address this gap is to design software that is more intuitive, more visual and more interactive yet sophisticated in how it works with the primitives of Hadoop; SAS is addressing this challenge. Our research finds growth in in-memory (now used by 42% of organizations) and Hadoop (50%) technologies, which will have more impact as they are applied directly to business needs and opportunities. SAS sits at the intersection of data management and analytics for big data technologies, which could position it well for further revenue growth. SAS is betting that big data will become a focal point in many deployments and that it can help unify data and analytics across the enterprise. Our research agenda for 2014 finds this to be the big opportunity, and SAS is fixated on being the vendor of choice for it. If you have not examined how SAS can connect big data architectures and facilitate use of this important technology, it will be worth your time to do so.

Regards,

Mark Smith

CEO & Chief Research Officer

Adding geographic and location context to business information enables organizations to develop fuller understanding and optimize the activities of the people who use the information. We call this location intelligence; achieving it requires location analytics, which processes and presents the geographic and spatial aspects of data. Analysis of geographic information can provide insights that help organizations make better business decisions. I have written about this new generation of location analytics previously and noted that it can provide fresh analytic perspectives on information collected and integrated from in-house applications and across the Internet.

In our latest benchmark research on location analytics, two out of five (41%) organizations said it is very important to have information about location to improve processes and performance. Participants in business roles (51%) insisted on its importance more often than did those in IT (30%); this difference indicates the value of location to business professionals in improving the activities and processes they are responsible for. However, the research also shows that most businesses aren't taking advantage of this enhanced perspective. Our Performance Index analysis found the largest percentage (29%) of participants at the lowest, Tactical level of location-related performance. Almost all organizations have room for improvement: Only 12 percent said they are very satisfied with the available location information and analytics, and similarly only 15 percent are very confident in the quality of their location information.

Because it brings a new perspective, location analytics requires familiarity to deliver benefits. Users who are very experienced in location analytics most often said it has significantly improved the results of their activities and processes (62%), compared to 23 percent of those who described themselves as merely experienced. To achieve this value organizations must invest in domain skills and knowledge for their analysts and in tools that can help them derive value from this type of information.

As with other new areas of technology and analysis, improving effectiveness through location intelligence will require a business case for investment. Our research found several barriers to building the case: lack of resources and lack of awareness (each cited by 41%), no budget (33%) and the business case not being strong enough (30%). While training and understanding of the potential of location intelligence are essential, some decision-makers need better examples of its use in terms of business results to make it an investment priority. In this regard organizations said the most important factors are to increase speed of response to customers, improve the quality of business analysis and decisions, and increase the accuracy of information. Nearly all that have adopted location intelligence said this focus has improved the organization's processes significantly (40%) or slightly (56%).

The research found several key business areas in which value can be found. Improving customer relations is a significant driver for change: Two-thirds (68%) of organizations using location analytics with a customer focus will change the way they use it in the next 12 to 18 months. A focus on customers is a natural use of location analytics; our research shows that for more than one-third (37%) of organizations customer data is the type of information for which location is most important. The benefit most often ranked first by those analyzing customer data is to improve the customer experience and customer satisfaction (20%). But customer-facing areas are not the only place to apply location analytics; it can be especially useful in territory management (optimizing sales efforts), risk mitigation (identifying fraud), field service (optimizing routes and customer interactions), distribution and supply chain management (route optimization and warehouse management) and advertising (identifying the best markets).
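Uses such as route optimization and field service routing ultimately rest on simple spatial computations. As a flavor of what the category involves, the sketch below answers a basic location-analytic question, which facility is nearest a customer, using the great-circle (haversine) distance; the coordinates and facility names are purely illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius is about 6371 km

# Which warehouse is nearest to a customer? (illustrative coordinates)
customer = (37.77, -122.42)  # San Francisco
warehouses = {"Oakland": (37.80, -122.27), "San Jose": (37.34, -121.89)}
nearest = min(warehouses, key=lambda w: haversine_km(*customer, *warehouses[w]))
print(nearest)  # Oakland
```

Commercial location analytics layers richer capabilities on top of primitives like this, such as drive-time rather than straight-line distance, geofencing and map-based visualization, but the underlying arithmetic is this accessible.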

The emphasis on location analytics by the lines of business appears again in funding for initiatives. It comes not just from the general IT budget (40%) but also from the business technology budget (31%) and the business budget (28%). And the investments pay off: Those that have invested in location analytics have improved the results of their activities and processes significantly (34%) or slightly (51%). Business people, who are better positioned to see the benefits, indicated significant improvement three times as often (48%) as did IT staff (16%).

Analysts in the lines of business should assess their existing efforts and determine how location analytics could improve the results they provide to decision-makers. One key tool for this is visualization through maps, which can present information more intuitively than general-purpose visualization. Simplifying the presentation of location information and making it easier to identify insights are essential for business professionals, who should not have to be trained to see the insights. Location analytics is more than a map and a pretty picture; it is a method of processing data to deliver geographic insights that are actionable and have business impact. Organizations not yet taking advantage of this valuable supplement to their business information should evaluate it.

Regards,

Mark Smith

CEO & Chief Research Officer
