I recently attended the annual SAS analyst summit to hear the latest company, product and customer growth news from the multi-billion-dollar analytics software provider. This global giant continues to grow its business and solutions for fraud prevention, marketing and risk, letting users apply its analytic and statistical technology in practical business applications. With its packaging and pricing, SAS can also meet the demands of midsized businesses and avoid being seen as affordable only to Global 2000 companies. SAS' growth in analytics should be no surprise, as our research finds analytics to be the first-ranked priority among technologies for innovating business.
SAS' largest area of growth is in its business analytics and business intelligence tools. Its new SAS Visual Analytics product appeals to a broad range of business and analyst needs, and the latest upcoming version blends data and visual discovery with powerful analytics. SAS is also addressing usability, the most important technology consideration according to 63 percent of organizations in our research, and uses in-memory computing against big data to help meet the advanced needs of organizations. Its eventual intent is to have Visual Analytics be the focal point of its business intelligence product direction. SAS Visual Analytics 6.2 is expected to be generally available in the second quarter of this year; a trial of the product is already available. At the event the company demonstrated its capabilities on tablets such as the Apple iPad. After seeing the demo, my only recommendation is that SAS add more collaborative capabilities and ensure that analysts can make observations and notations, which is a challenge with most business intelligence and analytics offerings in the market today.
SAS’ view of big data echoes our view that it is part of a larger portfolio of business technology for storing and loading data. SAS has invested significantly into its high performance analytics (HPA) architecture, which enables it to operate in parallel or embedded within database technology. SAS has focused on efficient processing for applying mathematics, and has devised multiple architectural approaches to adapt to existing technology, including Hadoop, and to ensure it can operate in the most efficient manner. Our big data research finds that a third of organizations plan to evaluate and adopt a range of appliances, in-memory and specialized databases and Hadoop in 2013. SAS’ approach is to embrace and integrate with a range of big data approaches.
For information management, SAS has consolidated the former DataFlux brand into the SAS organization and unified its product offerings. This is an important move: separate brands can confuse potential customers, even though everything was available from SAS. Beneath the marketing is a solid product line that provides more than just data integration; we rated SAS a Hot Vendor in our 2012 Value Index for Data Integration based on a methodical assessment. Unlike other analyst approaches that scratch the surface with 2x2 assessments, we examine a range of categories, including manageability, usability and reliability, that span data-related areas. Though not yet as well known for its integration with Hadoop and even SAP HANA, SAS is addressing challenges in big data integration that we are researching in more depth for 2013. SAS' overall approach to data management aligns well with our information management research. I was impressed with SAS' support for process and data orchestration and the overall ease of use of the product; it can be easily used by analysts and IT alike, and has some great job monitoring capabilities.
For the chief marketing officer (CMO), SAS has expanded its Customer Intelligence Suite since my colleague Richard Snow assessed it last year. The expanded capabilities address a broad set of management and operational needs for marketing. SAS provides not just analytics but also campaign management, real-time decision management and personalization, which together help ensure the best possible interaction and experience. Though it is not always seen as a key provider of marketing applications, SAS has been steadily expanding its offerings organically and through acquisitions, and now, with a unified approach and user experience, is ready to show its depth and sophistication, especially for B2C organizations.
SAS demonstrated a portfolio that engages everyone from the CMO to the analysts, managers and teams responsible for marketing activities that span from strategy and planning to interactions and ensuring great customer experience. An upcoming release expected in Q2 provides a new generation of user experience and integration that I have not seen in other offerings in the market. This sophisticated advancement in customer analytics aligns with my colleague Richard Snow’s view on the next generation of customer analytics that can leverage big data to meet forward-looking needs of organizations.
SAS sees the value in cloud computing and now has its own global hosted technology infrastructure; it can also help customers set up a private cloud for its technology. SAS has expanded rapidly worldwide to support the cloud since our last assessment. I especially like its management of users, applications and technology, and the ease of working across deployments and upgrades. SAS will soon also provide a platform for assembling applications for a range of business needs. This step forward, expected later in the year, is significant: SAS is not known for fostering the development of applications, but it has had this capability in its portfolio and is now making it simpler and accessible in the cloud. Our research finds that the on-demand model, and even software hosted by the supplier, plays a growing role not just for analytics but for a range of big data and mobile technology needs in over a third of organizations. For SAS, this capability goes well beyond providing analytics in the cloud, placing it in the market with companies such as Salesforce, which offers Force.com as its cloud-based application development environment and also provides information and analytics applications. SAS continues to expand its OnDemand offerings, which provide easier access to many of the sophisticated solutions in its portfolio.
SAS is also applying analytics to decision-making through a series of advancements with its Decision Management technology. As many organizations realize, the value of analytics lies in using them to take action. This is no easy task, as most analytics and their presentation are not designed for assessing, taking and monitoring actions. SAS has developed a suite of capabilities and tools to help in the preparation of data, modeling, optimization, workflow and rules, monitoring and reporting, along with supporting case management. After a close look at the product I found it to be well designed with an easy-to-use interface, especially the decision flow builder, which business analysts can use to design processes and analytics. I especially like the SAS Scenario Manager, which allows side-by-side examination of decisions to determine how to optimize activities. SAS' full suite of integrated decision management capabilities is expected to be available in the second quarter of this year, and SAS has an aggressive roadmap for continuous improvement. Only IBM in this market has a comparable area of focus and integrated approach with a portfolio of tools for decision management across any industry. SAS is making a smart step forward and will need to elevate the visibility of this offering in its portfolio to ensure it gets the proper level of consideration.
SAS also provides software for risk management, governance, risk and compliance (GRC) and fraud prevention to handle the most sophisticated challenges facing organizations and lower their risk. Our research into GRC finds that 79 percent of organizations are looking to identify and manage risks faster, and more than half (59%) need to improve their control environment. I will leave further analysis of SAS' portfolio in this area to my colleague.
SAS continues to build out its partner ecosystem. It has made strides to expand into other companies' technology ecosystems, including Teradata and EMC, and works with system integrators such as Accenture, Capgemini and Deloitte. SAS had a great customer panel at the analyst event, and while I am under NDA and cannot name the customers, they represented some of the largest brands in the world and use SAS to meet a variety of analytics and real-time operational needs.
The company has been steadily advancing, as we found in our Value Index for Business Intelligence last year. However, as SAS is adopted by more analysts, it will face the same issues I cited for business intelligence overall, which has not adapted well to business needs.
SAS has great potential with its approach of doing more than just analyzing the past: it also predicts and optimizes future business activities using applications and tools built on its analytics backbone. SAS believes that its ability to perform proactive, forward-looking analysis on the largest of big data sets, along with its use of in-memory technology and the ease of trying its software, distinguishes it from other software providers. Its ability to apply mathematics and embed predictive analytics into its offerings makes it a unique application provider. I could not cover all the key advancements in its portfolio, but anyone who spends a little time examining it will realize there is a lot more to SAS than most realize. Its broadening and deepening of its portfolio puts it on the short list of companies to consider for bringing more sophistication and science to business analytics.
CEO & Chief Research Officer
Managing the access, storage and use of data effectively can provide businesses a competitive advantage. Last year I outlined what the big deal is in big data, as the initial focus on the volume, velocity and variety of data – what my colleague Tony Cosentino calls the three V's – is only one small piece of how organizations should evaluate this technology. The more balanced approach is to include what he calls the three W's – the what, so what and now what – which shifts the focus to an outcome-based view that can handle the time-to-value urgency found in business. Big data analytics can help assess the volume of data, while the velocity of data that is potentially in-motion is best handled by what we call operational intelligence. Beyond these, techniques and technology such as predictive analytics and visual discovery facilitate extracting more value from big data. Along with a wide variety of data, these tools help organizations focus on optimizing information assets. We will soon conduct benchmark research into information optimization to determine how organizations are dealing with their information today and what steps they are taking to improve. In-memory computing will surely be one of those steps, as it can significantly improve the time-to-insight equation.
Big data does not magically become valuable, nor is it easily implemented. Organizations must realize they still require good information management competencies that address the full lifecycle of access and integration across in-house and cloud computing resources. Our research into data in the cloud has found that most organizations are not prepared to handle this broad, distributed data dilemma, which has become very important. Technology to harvest and integrate data must become easier to use for both business and IT users, which means overcoming the prevailing reliance on spreadsheets, which our research has found to be a larger data challenge than most people realize.
Our research agenda for 2013 calls for us to examine not just the forms of big data technology, but also the impact and value of big data tools that organizations can use to maximize the value of their information and drive better insights. Realizing the vision of greater intelligence across processes and teams takes an investment of time, effort and money. Accomplishing such a feat requires focusing on information competencies to support big data effectively, which in turn requires an assessment of the processes that deliver data to the business. Our research into operational intelligence found that the use of events is a critical part of the big data environment. At the same time the skills of master data management (MDM) and data governance do not go away; in fact, they become more important to address the question of business accuracy that inevitably arises when more data becomes available to be utilized. Our research into product information management (PIM) has found that the drive for data quality is changing organizations' approaches, and that a comprehensive information strategy using MDM and PIM together across business and IT can yield significant benefits compared to an IT-only approach. Our Product Information Management Value Index found some startling results about which vendors really meet the business needs of organizations. Product information is one of many priorities, along with customer, employee and finance data, that need to be part of a big data effort and an information optimization set of processes.
To embrace big data and optimize the use of information across an entire enterprise requires not just competencies and methods to ensure effective deployment, but also the ability to understand the business cycles for information and design the technology accordingly. Our research this year will examine the varying types of big data technology, including data appliances, Hadoop, in-memory computing and RDBMSes, all of which our big data research in 2012 indicated have a nice growth pattern going into 2013. The number of Hadoop tools in particular has expanded dramatically in the last year, making it easier for existing staffers to utilize that technology without having to hire dedicated programmers, as was the case for early adopters.
In addition, we have seen that large-scale in-memory computing architectures can provide significant value when it comes to working with big data, and a new generation of data appliances arrived on the scene in 2012. Of course it is critical to explore new methods to provide analytics and gain insights from big data and not assume that your existing providers will give you a competitive edge with innovative approaches. Also, businesses must not limit themselves to structured data when they can also utilize varying forms of content to get a more integrated view of information. Big data can become strategic, but businesses need some technical acuity to design the best possible architecture for their needs.
It will be critical to understand best practices for big data in 2013 and not get caught in the never-ending cycle of evaluating technology. IT organizations need to deliver value to business iteratively in order to be seen as contributing to the value business expects from technology investments. It is also critical that IT and business analysts work together to find the right big data approach. They must reduce time spent on data-related tasks; our technology innovation benchmark research found that more than 40 percent of organizations currently spend too much time on them. Supporting a variety of data will be critical, as even location data and the resulting analytics, which we are researching, are a much larger priority in business than most people in IT realize. It might very well be that businesses must adopt a distributed set of big data technologies to meet their collective needs.
Being efficient in blending big data with existing applications requires good data integration. Our assessment in 2012 found a large selection of vendors in this area, but only some are exploiting the integration points of big data technology and where they might exist in cloud computing environments. Successful data integration might require a deeper examination of data virtualization, which our information management research found to be a growing priority. This year will see a new crop of information and business applications that exploit the value of big data and are aligned to specific line-of-business needs. At the same time the use of dedicated analytics designed for this approach, which we call big data analytics, will require examination in 2013; my colleague Tony Cosentino has plans for new research on lessons learned and the technology providers in this area, as he outlined in his agenda for 2013.
Businesses must make sure to examine new methods of assembling and harvesting information from existing applications and systems for business, including those that might not be classified as big data but can deliver the value required for business needs. What’s old can still be new, and the role of data integration has never been more important to automate the flow of data in the enterprise. We will research further into data integration in 2013 to determine best practices and build upon our existing Value Index on Data Integration with a new report that highlights the expansion of data integration vendors’ support for big data and cloud computing.
As information becomes potentially easier to manage, organizations need to expand upon traditional business intelligence and look at the role of predictive analytics, which our research has found is essential for optimizing business processes and aiding critical business decisions. We have found the largest obstacle to predictive analytics is difficulty integrating new tools into existing information architectures. For many organizations, our research has found that getting the basics of business analytics right requires a dedicated approach. With the commoditization of hardware and memory and the concomitant increase in computing potential, as well as the emergence of cloud computing options, these new methods for taking advantage of big data become more cost-effective for a spectrum of small and big businesses.
All of these are exciting advancements in the science of information management, and CIOs should make a point of learning about and investing in all of them. Being more intelligent with big data is the mantra for 2013. Organizations that heed lessons learned and research the right path forward will reduce their risk of not delivering the value their businesses demand. While a big portion of the technology sector attaches itself to big data, being pragmatic and assessing the right path forward will be the most important best practice for 2013.