Data is an essential ingredient for every aspect of business, and those that use it well are likely to gain advantages over competitors that do not. Our benchmark research on information optimization reveals a variety of drivers for deploying information, most commonly analytics, information access, decision-making, process improvements and customer experience and satisfaction. To accomplish any of these purposes requires that data be prepared through a sequence of steps: accessing, searching, aggregating, enriching, transforming and cleaning data from different sources to create a single uniform data set. To prepare data properly, businesses need flexible tools that enable them to enrich the context of data drawn from multiple sources, collaborate on its preparation to serve business needs and govern the process of preparation to ensure security and consistency. Users of these tools range from analysts to operations professionals in the lines of business.
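To make that sequence concrete, here is a minimal, purely illustrative Python sketch of those steps – aggregating, enriching, transforming and cleaning records from two sources into one uniform data set. The field names, records and rules are invented for the example; real preparation tools do far more.

```python
# Illustrative only: a toy version of the prepare-data sequence described above,
# applied to hypothetical customer records from two sources.
crm_records = [
    {"name": "Acme Corp ", "region": "west", "revenue": "1200"},
    {"name": "Beta LLC", "region": "EAST", "revenue": None},
]
erp_records = [
    {"name": "acme corp", "region": "West", "revenue": "1250"},
]

def clean(record):
    """Transform and clean one record into a uniform shape."""
    return {
        "name": record["name"].strip().title(),          # normalize text
        "region": record["region"].strip().capitalize(), # consistent casing
        "revenue": float(record["revenue"] or 0),        # enforce numeric type
    }

# Aggregate the sources, clean each record, and deduplicate by name,
# keeping the record with the larger revenue (a simple enrichment rule).
uniform = {}
for record in map(clean, crm_records + erp_records):
    key = record["name"]
    if key not in uniform or record["revenue"] > uniform[key]["revenue"]:
        uniform[key] = record

dataset = list(uniform.values())
```

Even this toy pipeline shows why dedicated tooling matters: every step – type coercion, casing, deduplication rules – is a decision that must be made consistently across sources.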
Data preparation efforts often encounter challenges created by the use of tools not designed for these tasks. Many of today’s analytics and business intelligence products do not provide enough flexibility, and data management tools for data integration are too complicated for analysts who need to interact with data ad hoc. Depending on IT staff to fill ad hoc requests takes far too long for the rapid pace of today’s business. Even worse, many organizations use spreadsheets because they are familiar and easy to work with. However, when it comes to data preparation, spreadsheets are awkward and time-consuming and require coding expertise to perform these tasks. They also incur risks of errors in the data and inconsistencies among disparate versions stored on individual desktops.
In effect, inadequate tools waste analysts’ time, which is a scarce resource in many organizations, and can squander market opportunities through delays in preparation and unreliable data quality. Our information optimization research shows that most analysts spend the majority of their time not on actual analysis but on readying the data for it: More than 45 percent of their time goes to preparing data for analysis or reviewing the quality and consistency of data.
Businesses need technology tools capable of handling data preparation tasks quickly and dependably so users can be sure of data quality and concentrate on the value-adding aspects of their jobs. More than a dozen such tools designed for these tasks are on the market. The best among them are easy for analysts to use, which our research shows is critical: More than half (58%) of participants said that usability is a very important evaluation criterion, more than any other, in software for optimizing information. These tools also deal with the large numbers and types of sources organizations have accumulated: 92 percent of those in our research have 16 to 20 data sources, and 80 percent have more than 20 sources. Complicating the issue further, these sources are not all inside the enterprise; they also are found on the Internet and in cloud-based environments where data may be in applications or in big data stores.
Organizations can’t make business use of their data until it is ready, so simplifying and enhancing the data preparation process can enable analysts to begin analysis sooner and thus be more productive. Our analysis of time spent on data preparation finds that when it is done right, significant amounts of time can be shifted to tasks that contribute to achieving business goals. Assuming analysts spend 20 hours a week working on analytics, we conclude that most spend six hours preparing data, another six reviewing data for quality and consistency issues, three assembling information, two waiting for data from IT and one presenting information for review; this leaves only two hours for performing the analysis itself.
Dedicated data preparation tools provide support for key tasks that our research and experience find are done manually by about one-third of organizations. These tasks include search, aggregation, reduction, lineage tracking, metrics definition and collaboration. If an organization can reduce the 14 hours previously mentioned spent on data-related tasks (preparing data, reviewing data and waiting for data from IT) by one-third, it gains an extra four hours a week for analysis – that’s 10 percent of a 40-hour work week. Multiply this time by the number of individual analysts and it becomes significant. The proper tools can enable such a reallocation of time to better use the professional expertise of these employees.
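The time arithmetic above can be checked directly (the hour values are those cited in this post; one-third of 14 hours is about 4.7, which the post rounds to four):

```python
# Checking the time arithmetic for a 20-hour analytics work week,
# using the hour figures cited in this post.
prep, review, assemble, wait, present = 6, 6, 3, 2, 1
analysis = 20 - (prep + review + assemble + wait + present)
assert analysis == 2  # only two hours left for the analysis itself

# Data-related tasks: preparing, reviewing and waiting for data from IT.
data_tasks = prep + review + wait   # 14 hours
recovered = data_tasks / 3          # cutting them by one-third
print(round(recovered, 1))          # 4.7 hours, roughly the "extra four hours"
```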
These savings can apply in any line of business. For example, our research into next-generation finance analytics shows that more than two-thirds (68%) of finance organizations spend most of their analytics time on data-related tasks. Further analysis shows that only 36 percent of finance organizations that spend the most time on data-related tasks can produce metrics within a week, compared to more than half (56%) of those that spend more time on analytic tasks. This difference is important to finance organizations seeking to take a more active role in corporate decision-making.
Another example is found in big data. The flood of business data has created even more challenges as the types of sources have expanded beyond the RDBMS and data appliances; Hadoop, in-memory and NoSQL big data sources exist in at least 25 percent of organizations, according to our big data integration research. Our projections of growth, based on what companies are planning, indicate that Hadoop, in-memory and NoSQL sources will increase significantly. Each of these types involves systems from various providers, each with specific interfaces for accessing data, let alone loading it. Our research in big data finds similar results regarding data preparation: The tasks that consume the most time are reviewing data for quality and consistency (52%) and preparing data (46%). Without automating data preparation for accessing data and streamlining its loading, big data can present an insurmountable task for companies seeking efficiency in their deployments.
A third example is in the critical area of customer analytics. Customer data is used across many departments, especially marketing, sales and customer service. Our research again finds similar issues regarding time lost to data preparation tasks. In our next-generation customer analytics benchmark research, preparing data is the most time-consuming task (in 47% of organizations), followed closely by reviewing data (43%). The research also finds that data not being readily available is the most common point of dissatisfaction with customer analytics (in 63% of organizations). Our research finds other examples, too, in human resources, sales, manufacturing and the supply chain.
The good news is that these business-focused data preparation tools offer usability in the form of spreadsheet-like interfaces and include analytic workflows that simplify and enhance data preparation. When searching for and profiling data and examining fields, the use of color can help highlight patterns in the data. Capabilities for addressing duplicate and incorrect data about, for example, companies, addresses, products and locations are built in for simplicity of access and use. In addition, data preparation is entering a new stage in which machine learning and pattern recognition, along with predictive analytics techniques, can help guide individuals to issues and focus their efforts on looking forward. Tools also are advancing in collaboration, helping teams of analysts work together to save time and take advantage of colleagues’ expertise and knowledge of the data, along with interfacing to IT and data management professionals. In our information optimization research, collaboration is a critical technology innovation according to more than half (51%) of organizations. They desire several collaborative capabilities, ranging from discussion forums to knowledge sharing to requests on activity streams.
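As a purely illustrative sketch of what such duplicate handling involves under the hood, the following normalizes hypothetical company records so that near-duplicates collapse to a single key. The records and rules are invented for the example; commercial tools use far more sophisticated matching.

```python
# Illustrative sketch of duplicate handling: normalize hypothetical company
# records so near-duplicates collapse to one canonical key.
import re

def normalize(name, address):
    """Canonical key: lowercase, strip punctuation and common suffixes."""
    key = f"{name} {address}".lower()
    key = re.sub(r"[.,']", "", key)                    # drop punctuation
    key = re.sub(r"\b(inc|corp|llc|ltd)\b", "", key)   # drop legal suffixes
    return re.sub(r"\s+", " ", key).strip()            # collapse whitespace

records = [
    ("Acme, Inc.", "12 Main St"),
    ("ACME Inc", "12 Main St"),     # near-duplicate of the first
    ("Globex Corp", "99 Oak Ave"),
]

deduped = {normalize(n, a): (n, a) for n, a in records}
print(len(deduped))  # 2 distinct companies
```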
This data preparation technology supports ad hoc and other agile approaches to working with data that map to how businesses actually operate. Taking a dedicated approach can help simplify and speed data preparation and add value by enabling users to perform analysis sooner and allocate more time to it. If you have not looked at how data preparation can improve analytics and operational processes, I recommend that you start now. Organizations are saving time and becoming more effective by focusing more on business value-adding tasks.
CEO and Chief Research Officer
At its annual MicroStrategy World conference, this provider of analytics and business intelligence systems for business and IT introduced a new version of its flagship product, MicroStrategy 9s. Among many advances, it adds enterprise-grade security with MicroStrategy Usher as part of the maintenance update to its 9.4.1 release. Security is increasingly critical for analytics and BI. Technologies that work intensively with data, including reporting, business intelligence, analytics and data preparation, have access to a range of applications and databases and could leave gaps in access controls and in the security of essential business data. Already in 2015 the data breach at Anthem put more than 80 million medical records at risk. Our benchmark research in big data analytics finds that integration into security and user access frameworks is a very important capability to 37 percent of organizations.
MicroStrategy has invested years and significant resources in MicroStrategy Usher. On the company’s mobile platform, Usher provides multifactor authentication that includes fingerprinting for biometrics, unique identification through smartphone verification and a mobile device pass code to ensure secure access. Further identification security is offered through geo-fencing, which requires a user to be within range of a specified access point such as a building or a computer, by permitting access only during a specific time window or by using QR code scanning to rapidly match the individual with authorized access. At the conference MicroStrategy demonstrated these security techniques along with AES-128 GCM encryption of data on devices and in transit.
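To illustrate the general idea of geo-fencing combined with a time window – this is a hypothetical sketch, not Usher’s actual implementation – an access check can combine a distance test against a fence center with a time-of-day test:

```python
# Hypothetical illustration of a geo-fence plus time-window access check.
# Coordinates, radius and hours below are invented for the example.
from datetime import time
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 6_371_000 * 2 * asin(sqrt(a))

def access_allowed(user_pos, fence_center, radius_m, now, window):
    """Allow access only inside the fence and during the time window."""
    inside = haversine_m(*user_pos, *fence_center) <= radius_m
    start, end = window
    return inside and start <= now <= end

# A user about 50 m from a hypothetical office, during business hours:
ok = access_allowed((40.7128, -74.0060), (40.7131, -74.0065), 500,
                    time(10, 30), (time(9, 0), time(17, 0)))
print(ok)  # True
```

The same check fails outside the window or outside the radius, which is the essence of pairing location with time as a second access factor.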
Ventana Research recognized MicroStrategy Usher with a Technology Innovation Award in 2014 for using mobile technology to provide secure access to applications, information and even physical locations; in the last case it has partnered with building companies such as Honeywell. Organizations should realize the risk of having only single sign-on through insecure passwords, which can expose access to all their systems. Usher is available as part of MicroStrategy 9s or as a stand-alone product.
On another advanced technology front, MicroStrategy has advanced its cloud computing offering to embrace Amazon Web Services (AWS). This partnership can provide flexibility for its customers while reducing the effort it must put into building out its own data centers to support its products. Here also the company is addressing security of information and systems. MicroStrategy Secure Cloud includes enterprise security capabilities to ensure authenticated access. This is an important feature: Our big data integration benchmark research finds that security concerns are the most common barrier (in 54% of organizations) to using big data through cloud computing. Many buyers want to reduce the amount of software they have to license and install, but they must have confidence that their data is safe in on-demand cloud systems. Keeping it simple but secure for cloud computing is critical, and our research finds room for improvement here across the technology industry.
Regarding big data, MicroStrategy has advanced its computing power through PRIME, its in-memory and parallel processing technology. Our research in big data analytics finds that in-memory systems are the most sought-after innovation in half of organizations seeking to enhance analysis. That research also shows that the number and dispersal of data sources is a major issue: More than one-third (36%) of organizations have six to 15 sources, and one-fourth (26%) have 16 or more information sources that need to be integrated for optimum access. Blending data from so many sources raises security concerns; in our research in big data integration, security is the fourth-most important data activity, cited by 61 percent of organizations. MicroStrategy continues its efforts to help organizations get access to any data source. For example, it now supports Oracle Database 12c, including its in-memory option, and has embedded its BI software in Oracle appliances that become big data machines. In addition MicroStrategy has embraced Apache Spark to maximize compute potential across Hadoop, NoSQL and RDBMS sources and expands its database support every quarter. Earlier, in 2014, MicroStrategy announced support for MarkLogic 7 NoSQL technology, which is becoming more relevant in the analytics and BI market. MicroStrategy also has continued its efforts to support data preparation tasks performed by analysts rather than administrators, as some other vendors do. Our big data analytics research finds that in creating and deploying information and using analytics, almost half (47%) of organizations spend the largest portion of their time on data preparation; this is true across customer service, finance, HR and other operational areas.
Like many others, MicroStrategy discovered painfully in 2014 that the overall IT market for purchasing software to deploy on-premises has slowed down and, specifically, that the BI market has reached a saturation point in organizations that have more than US$10 billion in revenue. Even so the company exceeded $579 million in total revenue for 2014. Its investments in cloud computing over the last several years have helped it through this transition. To adjust to the market changes it has brought in new executives, including CTO Tim Lang, who has decades of experience dating back to Holistic Systems and Crystal Software, and CMO Mark Gambill, who has more than 18 years of experience leading marketing teams; these two are streamlining their respective areas to make them more efficient. MicroStrategy continues to get brand-name customers on board with its mobile and analytic offerings, and its focus on security in 2015 should help in these efforts. In the past several years demand has been strong for visualization-related discovery and analytics tools among the lines of business and analysts, and MicroStrategy has now caught up and matched these capabilities. Visualization and presentation of analytics and data remain very important, according to our research: Half (49%) of organizations said it helps those who have the right skills perform analytics faster. Our research also finds that the majority of businesses want analytics to be simpler, easier to understand and available quickly on their mobile devices.
In our 2014 Value Index on Mobile BI, MicroStrategy earned the Hot Vendor rating for its mobile business intelligence technology across device platforms, primarily for Apple and Android support. But MicroStrategy will have to ante up its support for Microsoft, which is making inroads with smartphones and even more so with its Surface 3 tablets and new generations of notebooks using Windows Touch, which enables the use of gestures in Windows; Microsoft is finding opportunities in businesses that are replacing older desktop and laptop computers with new ones running Windows Touch technology. In 2014 MicroStrategy simplified the packaging and pricing of MicroStrategy Mobile and its Web products that operate on its MicroStrategy Server platform. MicroStrategy is building on a solid year of technology advancement, as I pointed out in 2014, and is poised for innovation. At MicroStrategy World it provided a sneak peek of MicroStrategy 10 (now in beta and early release), which will up the ante with the next generation of in-memory analytics and simpler administration. Developments include further support of Apple Mac OS and iOS and advances in visualization and the sophistication of analytics, which may be available fairly early in 2015.
I believe that MicroStrategy should do more with search and collaborative capabilities, which our research finds to be a priority for business professionals (less so for IT). All BI vendors must be careful to listen to all audiences and stop paying attention only to IT. They will need to make sure that those in the lines of business understand how they can use analytics in the context of solving problems and pursuing opportunities, and make it as easy as possible for them to do so on their own. Vendors also must understand their purchasers’ own business priorities. For instance, our next-generation customer analytics benchmark research shows that improving the customer experience and customer service strategy are the top drivers in this analytics segment. The release of MicroStrategy 9s and the upcoming MicroStrategy 10 continue this company’s prominence as an enterprise-class provider. We advise organizations to assess its evolving portfolio to identify missing elements in their existing BI and analytics deployments, with an eye on mobile platforms and security of data.
CEO and Chief Research Officer