January 27, 2014 in Big Data, Business Analytics, Business Collaboration, Business Intelligence (BI), Business Performance Management (BPM), Cloud Computing, Customer Performance Management (CPM), Financial Performance Management (FPM), Governance, Risk & Compliance (GRC), Information Applications (IA), Information Management (IM), Location Intelligence, Operational Intelligence, Operational Performance Management (OPM), Sales Performance Management (SPM), Social Media, Supply Chain Performance Management (SCPM), Workforce Performance Management (WPM) | Tags: Big Data, CIO, Information Optimization | by Mark Smith
Businesses are always looking for ways to grow and to streamline their operations. These two goals can come into conflict because as organizations become larger it becomes more complicated to stay agile and efficient. To help them understand and modify their processes, businesses can derive insights from analytics applied to their data. Today that data is available not only in enterprise and cloud computing environments but also from the Internet. Collecting, processing and analyzing it all is a challenge, one that an increasing number of organizations are meeting through the use of big data technologies. The resulting insights can help them make strategic business decisions such as where to focus efforts and how to engage with customers. At Ventana Research we have been working hard to understand the advancing technology that supports big data and its value through information optimization, and to bring clarity to the industry through our research and analysis of trends and products. There are many opinions about big data, often fixated on its attributes through the V's (volume, variety and velocity) and on how to use it, and often biased toward one technology or vendor; our research and analysis of the entire market cuts through the noise to provide not just facts but insights on best practices and methods for applying this technology to business problems.
There’s no doubt that big data can help organizations turn their information assets into insights that are critical for achieving growth and interacting more successfully with customers. It can help them access and integrate information for business use in ways that were not possible before. Among these new methods are simpler ways to access and consume information, including search based on natural language and cognitive computing, which brings advanced science to the processing of information. Big data also enables more effective visualization to support discovery and exploratory analytics. Machine data can be used to gain insights into the workings of technology that directly impacts the operations of the organization. We assert that big data drives operational efficiency through effective processing technology that expands the use of information through analytics and these innovative computing methods. The importance of these advances is shown in our recent benchmark research on information optimization, in which two-thirds (67%) of organizations said that improving operational efficiency is the leading reason to change how they make information available.
In 2013, large steps forward were made in big data technology. We saw the beginning of convergence among technological approaches as Hadoop, in-memory processing and data appliances intersect with specialized and traditional database systems. Users are learning how to gain value through insights from these technology investments. While we saw advances in visualization, as shown in our predictive analytics research, using most of these tools requires advanced skills to ensure that the data is interpreted properly to support actions, let alone decisions. Many organizations lack these skills in-house, our research shows.
Thus one challenge for 2014 is to acquire the competencies needed to get the best possible information from big data. Another is to improve processes for information optimization so that data, even about real-time events, can flow to business users and reduce the time it takes for them to use it effectively. New platforms and services can help make more types of information easier to understand and interact with. This is complemented by collaboration tools that can operate across mobile devices and get more information to more people wherever they are. We note also that unstructured data such as documents, images and text is now part of the information requirements for more than half of organizations that could use big data technology.
All of these tools and efforts toward information optimization can be useful as organizations try to improve the consistency and governance of their business-related information assets in key areas such as product information management, reduce duplication and conflicts in information about customers and employees, and maintain a focus on governance, risk and compliance (GRC). Failing to address these issues can lead to lost revenue, dissatisfied customers and decreased efficiency, all of which impact profitability. Our focus on product information management will continue as it is in high demand: We will release a new Value Index on the topic in 2014 to assess vendors and products and guide potential purchasers. There also have been advances in applications to help businesses manage information; for example, the use of master data management and data governance can help increase accuracy and improve the outcomes of related business processes. In 2014 we will assess the current market for master data management as it impacts both business and IT through new benchmark research and continued coverage of technology developments. Because data no longer resides only in the enterprise, we will continue to track advances in using cloud computing for business applications and in accessing and integrating the large amounts of data there. Cloud data is now a major factor in many organizations, and we will reassess this area in our data in the cloud benchmark research to determine where investments and processes have progressed and where they still need to be improved.
Mainstream use of big data is leading organizations to invest in creating a new stream of information processes that can meet a variety of business needs. We have a whole line of new next-generation analytics research in specific lines of business; the first cover finance, customers and human capital, and other areas planned for 2014 are sales and marketing, which are still in the early adoption phase of big data but have generated significant interest in its potential.
Organizations have several options in the types of big data technology they choose. Some are using Hadoop and commercial versions of it from a variety of providers, others are using big data appliances, and still others are using in-memory computing and specialized databases. Our latest research finds that in 2014 more organizations plan to use Hadoop (24%), in-memory tools (23%) and appliances (15%) than will use an RDBMS (10%). This variety of technologies is generating new best practices in big data and in how to use the resulting information. Big data is an innovative technology that is now important to 41 percent of organizations. Please consult my colleague Tony Cosentino’s research agenda in business analytics and big data, which outlines new methods to discover and explore big data effectively; he also has new research coming on big data analytics.
It is now possible for organizations to learn and apply best practices in using big data effectively across business and IT. Once that is accomplished, users can examine how to make information flow easily into applications and optimize their use, too. At this point, however, many have yet to focus on integration of data and sources to ensure an efficient flow both in the enterprise and throughout cloud computing. This challenge we call big data integration, and we are conducting new research to identify the issues, methods and best practices here. Our research shows that for more than four out of five organizations that have 16 or more sources of data, it is important to simplify access to and use of it all, and integration will help address this challenge. This year we will also produce our annual Value Index on vendors and products in data integration and will expand it to include the requirements of big data and cloud computing. The availability of technology for data integration is fueling new efforts in what I call integration optimization, which even includes handling of real-time event streams in complex event processing to produce operational intelligence; we have research on this and an upcoming Value Index. Use of this real-time information and analytics is helping organizations respond to issues more quickly than waiting for historical analysis to take place.
We expect significant advances in big data and information optimization in 2014. In this market, however, most organizations lack experience and skilled resources. We caution them against investigating big data just because “everyone is doing it”; that is not necessarily true, so we recommend first reviewing the processes in your organization where it can provide value, and then looking at the architectural and technological approaches that best fit your needs and available resources.
CEO & Chief Research Officer
January 13, 2014 in Big Data, Business Analytics, Business Collaboration, Business Intelligence (BI), Business Performance Management (BPM), Cloud Computing, Customer Performance Management (CPM), Financial Performance Management (FPM), Governance, Risk & Compliance (GRC), Information Applications (IA), Information Management (IM), IT Performance Management (ITPM), Location Intelligence, Operational Intelligence, Operational Performance Management (OPM), Sales Performance Management (SPM), Social Media, Supply Chain Performance Management (SCPM), Workforce Performance Management (WPM) | Tags: Analytics, Big Data, CIO, Cognitive Computing, Discovery, Exploration, IBM Watson | by Mark Smith
With much fanfare and a rarely seen introduction by CEO Ginni Rometty, IBM launched IBM Watson as a new business unit focused on cognitive computing technology and solutions, now led by Senior Vice President Mike Rhodin. Until now IBM Watson was important but had neither this stature in IBM’s organizational structure nor enough investment to support what the company proclaims is the third phase of computing. As IBM tells it, computing paradigms began with the century-old era of tabulating machines, followed by the age of programmatic computing, in which IBM developed many products and advancements. The third phase is cognitive computing, an area in which the company has invested significantly to advance its technology. IBM has been on this journey for some time, long before the IBM Watson system beat humans on Jeopardy!. Its machine-learning efforts started with the IBM 704 and computer checkers in the 1950s, followed by decades of utilizing the computing power of the IBM 360 mainframe, the IBM AS/400, the IBM RS/6000 and even IBM XT computers in the 1980s. Now IBM Watson is focused on reaching the full potential of cognitive computing.
If IBM is right that cognitive computing will be the next wave of innovation in the industry and a new phase of computing, it has placed itself at the center of a substantial new market opportunity. At the most basic level, IBM Watson simplifies the process of making information more available, which our information optimization research finds is very important to 65 percent of organizations. The company says it has invested $1 billion and placed up to two thousand skilled people in the new business unit. It also is spending $100 million on the incubation of companies that are building on the Watson platform, and has unveiled a new, dedicated building in Lower Manhattan, known as Silicon Alley, as its focal point for cognitive computing ideas. There is no question that IBM Watson is innovative, as we recognized in 2012 when it received our overall Technology Innovation Award.
These recent actions build on IBM’s announcement last fall that it is commercializing IBM Watson to enable developers and partners to innovate on the platform. Its launch of the IBM Watson Developers Cloud marketplace introduces new offerings and content essential to building its ecosystem of resources to meet existing and future demand for applications of cognitive computing. The step is essential in that it will maximize the number of products using IBM Watson and provide IBM with a springboard to grow its efforts exponentially. At the same time, IBM is working with academic institutions.
IBM’s announcements included new products to complement the IBM Watson portfolio and give it a broader footprint and value to customers. The first new product announced is IBM Watson Discovery Advisor, a tool that helps pharmaceutical companies plow through massive volumes of big data. This is a good place to start, as harvesting the right information for specific roles and purposes is the foundation of cognitive computing, enabling organizations not just to access information but to synthesize it.
The next announcement was IBM Watson Analytics, a product previously known as Project Neo and introduced to the market last fall, which my colleague Tony Cosentino covered. Incubated in IBM’s business analytics group and using a spectrum of analytic and discovery technology, the product and the people who worked on it and other efforts are being transferred to the IBM Watson business unit. Though it was not initially built for IBM Watson, today the discovery and exploratory technology integrates the pillars of analytics, facilitating a knowledge discovery process in which you can explore data through natural language and discover new insights. The move to shift IBM Watson Analytics was unexpected and introduces new pressure to market and sell the product. It has growing potential for line-of-business analysts, who will want to examine this and other tools from IBM’s business analytics group. Only time will tell if IBM will be able to fully monetize the product’s potential through its IBM Watson effort, but the move could be its short-term method to gain customers and revenue. It definitely will be a complement when it interfaces to IBM Watson and utilizes the knowledge that Watson creates.
The next major product announcement was IBM Watson Explorer, a big data analytics tool that enables collaborative discovery, navigation and search across information in applications. Both analytics tools advance the science of big data technologies but focus on more than just the mechanics of what big data does, as described by the “four V’s”: volume, variety, velocity and veracity. Rather, they address the value of what is possible through the so-called W’s, focusing on the who, what, where, when and why. This is what we call information optimization, facilitating the business potential not just of big data but of cognitive computing. For its part, IBM is applying its big data and information management efforts to IBM Watson, categorizing them as IBM Watson Foundations. This is critical, as our information optimization research finds that the inability to integrate and normalize information from disparate sources is the largest technology shortcoming, cited by 45 percent of organizations. By integrating and utilizing these big data and business analytics tools as part of IBM Watson, cognitive computing will be more advanced from a competency perspective even if these products are not directly needed to enable IBM Watson.
With this much at stake IBM was not going to leave customer endorsements to chance, and while it has taken some criticism that customer commitment might not be as high as it has claimed, that question was answered at the IBM Watson event. For one, the medical and healthcare industry was front and center to validate its commitment to IBM, represented by organizations such as the Cleveland Clinic and Memorial Sloan-Kettering Cancer Center. Most interesting was an early peek at the potential mass consumerization of IBM Watson. The first example was presented by e-commerce facilitator Fluid: Its Fluid XPS is focused on changing the digital experience of consumers by gaining access to information about their needs for products and services in a holistic manner. In the example it showed, asking a question through a front end to the North Face website surfaced recommendations for cold-weather camping gear. A second example was the potential to have IBM Watson be the natural language interface for finding a vacation destination, specifying criteria such as class, price, type and climate of location; today this requires repetitive tasks such as filling out forms and making your own comparisons to determine where you want to go and for what price. The concept was presented by Terry Jones, the former CEO of Sabre and Travelocity and chairman of Kayak. He has more than 40 years of experience in the travel industry and now consults about business innovation through his company, called Essential Ideas. IBM also demonstrated how cognitive computing can power the next generation of marketing, synthesizing the interactions and psychology of individuals to market to them more effectively. These examples point to the potential of natural-language recognition technology to discover relevant responses that guide users’ actions and decisions.
As part of my analysis over the past couple of years, I’ve been following this step forward and wrote about the new category of cognitive computing. In 2013, IBM brought forward IBM Watson Engagement Advisor, focused on smarter customer service through a simpler engagement approach to improving the customer experience, a topic my colleague Richard Snow has assessed. This effort aligns with our customer engagement research, which finds that 74 percent of organizations are focused on improving the customer experience. I have also seen IBM demonstrate similar solutions for employees and managers as part of human capital management. More important, these solutions embrace mobile computing, whereby devices can be used as input and response tools anywhere, at any time; our information optimization research finds smartphones available for such use in 73 percent of organizations. Intellectually, IBM continues to advance its research and scientific developments to ensure that it can transition its work into products that customers can use. At the launch of the IBM Watson business unit, IBM Research’s Dr. Guruduth Banavar brought forward some of the latest thinking on cognitive science, the ability to teach machines to reason and what is called neurosynaptics, a discussion that is available on IBM Research’s cognitive computing page. The material is fascinating; it provides insight on the future of computing and how it will impact roles and businesses in the next decade.
As it begins to scale its offering, part of IBM’s challenge is to manage the continuous information feeds that effectively make IBM Watson smarter. While IBM does not talk much about the content aspects of what is required, it is clearly more than just loading files, and these efforts are as important as librarians are to libraries: librarians are not just stewards of a collection of books but ensure the value and improvement of the library. There is still a level of mystery about the technical mechanics and readiness of the platform that the company needs to address before the natural-language interface is ready to work its magic. In addition, IBM is still using a natural-language form of text and working through how it can make voice mainstream with IBM Watson, as Apple, Google and others have done. IBM has worked on speech in research for some time and more recently with Nuance, with which it announced a partnership back in 2011, but it has yet to demonstrate this capability to the mainstream public, which indicates hesitation about how fast it plans to use voice and speech as the interface. While IBM was not able to fully monetize its early efforts in speech technology, speech is now becoming mainstream in the consumer market but has yet to evolve significantly in business markets as part of enterprise software. I am looking forward to seeing more of what it can do in terms of voice and speech input, with Watson talking interactively to help expedite what is truly natural language for humans.
IBM does not often create new business units and elevate them to this level of commitment and investment for the future. While the business goals for IBM Watson are lofty from both revenue and computing perspectives, no other company – not even Microsoft, Oracle or SAP – has both the established technology foundation and the people and financial resources that IBM has to make this a reality. IBM should be congratulated for making the investment in cognitive computing and helping create new jobs and opportunity that will incubate not just in Silicon Alley but across the globe as others realize the full potential. Our technology innovation research finds that increasing the value of an organization is very important to over half (56%) of organizations, which is exactly the outcome IBM hopes will expand its business opportunity. If you want to catch up on the dialogue and resources related to this topic, you can search #IBMWatson on Twitter and follow @IBMWatson. To learn more about IBM Watson and cognitive computing, go to www.ibm.com/watson, where you will find more information about the technology, along with our research on the topic and the value of this new computing paradigm.
CEO & Chief Research Officer