Datawatch Bolsters Data Preparation for all Information Assets

The need for businesses to process and analyze data has grown in intensity along with the volumes of data they are amassing. Our benchmark research consistently shows that preparing data is the most widespread impediment to analytic and operational efficiency. In our recent research on data and analytics in the cloud, more than half (55%) of organizations said that preparing data for analysis is a major impediment, followed by other preparatory tasks: reviewing data for quality and consistency (48%) and waiting for data and information (28%). Organizations that want to apply analytics to make more effective decisions and take prompt actions need to find ways to shorten the work that comes before it. Conventional analytics and business intelligence tools are not designed for data preparation, but new software tools can enable business users, independently or in concert with IT, to perform the tasks needed.

Datawatch offers one such tool, which I have described as doing advanced information optimization. The company has helped organizations get more out of their information assets for years. One could argue that Datawatch has been doing data preparation longer than anyone else in the market; its Monarch product dates back to the 1990s and specializes in extracting and manipulating data from a range of documents and systems. The latest release, version 13, takes significant steps forward with a new user environment and functionality designed for analysts and users in operational positions who need a more intuitive approach to data preparation. Its drag-and-drop approach helps make data preparation doable for most business users.

Datawatch has continued to expand its data access support for various systems including salesforce.com and Hadoop for big data, providing support for both cloud-based and on-premises systems. It already has significant support for the PDF, JSON and XML standards, as well as text, invoices, documents and log files. It also enables getting data from and putting data back into Microsoft Excel spreadsheets. Datawatch takes an “inspect and recommend” approach to data preparation, which can be especially useful with unstructured data in documents and files. Monarch can identify the data’s format and structure, present what can be extracted and enable setup of a template to use with similar data. A capability the company calls reliable redaction helps in sharing information: after data is extracted, it applies masking for privacy purposes to ensure that no regulations or compliance policies are violated. Monarch also supports preparing datasets for analytic tools such as Tableau, where users often struggle to access, blend, enrich and use data. Datawatch also has a visual analytics tool it acquired that complements its data preparation.
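Masking of the kind described above can be sketched in a few lines. The field names and the keep-last-four rule below are illustrative assumptions for the example, not a description of Datawatch’s actual implementation:

```python
def mask_record(record, sensitive_fields=("ssn", "email")):
    """Return a copy of a record with sensitive fields masked before sharing."""
    masked = dict(record)
    for field in sensitive_fields:
        if field in masked and masked[field]:
            value = str(masked[field])
            # Keep the last 4 characters for traceability; mask the rest.
            masked[field] = "*" * max(len(value) - 4, 0) + value[-4:]
    return masked

row = {"name": "Pat", "ssn": "123-45-6789", "amount": 250.0}
print(mask_record(row))  # the ssn field comes back as *******6789
```

A repeatable template would apply a rule like this to every extracted record, so the masking policy travels with the data preparation step rather than being applied ad hoc.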

As organizations examine Datawatch’s tool, they should be aware that it is for more than one-off data preparation, providing a repeatable approach that can save time not only for individuals but for entire teams. Datawatch enables many individuals to share data and collaborate on work. This helps analysts and operations professionals work together, as well as involving IT in complex data-related processes.

Datawatch continues to grow its customer base in data preparation. A prominent example is Marbridge Foundation, which we recognized with our 2015 Leadership Award in Information Optimization for its work to make content and data from its systems comply with the Affordable Care Act – a project driven by its CFO. Marbridge took advantage of the flexibility in Datawatch’s software to extract and process data from Adobe Acrobat files in PDF format along with other applications to gain confidence that it is in compliance with this critical legislation. Data preparation provides the most value in lines of business such as finance, human resources, operations, marketing, sales, customer service and supply chain. Having a tool for data preparation that connects to business processes can produce significant impacts.

Data preparation software serves a growing market of business users who need to handle data but are not adept enough to do it with general analytics and business intelligence tools or to use data integration tools intended for IT professionals. Software providers that still use administrative or other IT-centric interfaces fall short of offering an independent and universal approach to data preparation. In addition to supporting conventional data sources and systems Datawatch handles content and documents to help establish a consistent approach to preparing all data needed in business with one tool.

With its heritage in data preparation Datawatch is well positioned to be on the short list of tools that help organizations realize the full value of information assets from inside or outside of the business. We recommend that organizations consider Datawatch and its advances in Monarch 13 for meeting a broader set of data preparation needs.

Regards,

Mark Smith

CEO and Chief Research Officer

Big Data Research Agenda and Trends are Bolder in 2015

Big data has become a big deal as the technology industry has invested tens of billions of dollars to create the next generation of databases and data processing. After the accompanying flood of new categories and marketing terminology from vendors, most in the IT community are now beginning to understand the potential of big data. Ventana Research thoroughly covered the evolving state of the big data and information optimization sector in 2014 and will continue this research in 2015 and beyond. As the sector progresses, making big data systems interoperate with existing enterprise and information architecture, along with digital transformation strategies, becomes critical. Done properly, companies can take advantage of big data innovations to optimize their established business processes and execute new business strategies. But deploying big data and applying analytics to understand it is just the beginning. Innovative organizations must go beyond the usual exploratory and root-cause analyses through applied analytic discovery and other techniques. This of course requires them to develop competencies in information management for big data.

Among big data technologies, the open source Hadoop has been commercialized by now established providers including Cloudera, Hortonworks and MapR and made available in the cloud through platforms such as Qubole, which received a Ventana Research Technology Innovation Award in 2014. Other big data technologies are growing as well; for example, use of in-memory and specialized databases, like Hadoop, is growing in more than 40 percent of organizations, according to our big data integration benchmark research. These technologies have been integrated into databases or what I call hybrid big data appliances, like those from IBM, Oracle, SAP and Teradata, that bring the power of Hadoop to the RDBMS and exploit in-memory processing to perform ever faster computing. When placed into hosted and cloud environments these appliances can virtualize big data processing. Another new provider, Splice Machine, brings the power of SQL processing in a scalable approach that uses Hadoop in the cloud; it received a Ventana Research Technology Leadership Award last year. Likewise, advances in NoSQL approaches help organizations process and utilize semistructured information along with other information and blend it with analytics, as Datawatch does. These examples show that disruptive technologies still have the potential to revolutionize our approaches to managing information.

Our firm also explores what we call information optimization, which assesses techniques for gaining full value from business information. Big data is one of these when used effectively in an enterprise information architecture. In this context the “data lake” analogy is not helpful in representing the full scope of big data, suggesting simply a container like a data mart or data warehouse. With big data, taking an architectural approach is critical. This viewpoint is evident in our 2014 Ventana Research Technology Innovation Award in Information Management to Teradata for its Unified Data Architecture. Another award winner, Software AG, blends big data and information optimization using its real-time and in-memory processing technologies.

Businesses need to process data in rapid cycles, many in real time, through what we call operational intelligence, which utilizes events and streams and provides the ability to sense and respond immediately to issues and opportunities in organizations that adapt to a data-driven culture. Our operational intelligence research finds that monitoring, alerting and notification are the top use cases for deployment, in more than half of organizations. Also, machine data can help businesses optimize not just IT processes but business processes that help govern and control the security of data in the enterprise. This imperative is evident in the dramatic growth of suppliers such as Splunk, Sumo Logic and Savi Technology, all of which won Ventana Research Technology Innovation awards for how they process machine and business data in large volumes at rapid velocity.

Another increasing trend in big data is presenting it in ways that ordinary users can understand quickly. Discovery and advanced visualization are not enough for business users who are not trained to interpret these presentations. Some vendors can present location and geospatial data on maps that are easier to understand. At the other end of the user spectrum, data scientists and analysts need more robust analytic and discovery tools, including predictive analytics, which is a priority for many organizations, according to our big data analytics research. In 2015 we will examine the next generation of predictive analytics in new benchmark research. But there is more work to do to present insights from information that are easy to understand. Some analytics vendors are telling stories by linking pages of content, but these narratives don’t as yet help individuals assess and act. Most analytics tools can’t match the simple functionality of Microsoft PowerPoint: placing descriptive titles, bullets and recommendations on a page with a graphic that represents something important to the business professional who reads it. Deeper insights may come from advances in machine learning and cognitive computing that have arrived on the market and bring more science to analytics.

So we see strong potential for the outputs of big data, but they don’t arrive just by loading data into these new computing environments. Pragmatic and experienced professionals realize that information management processes do not disappear. A key one in this area is data preparation, which helps ready data sets for processing in big data environments. Preparing data is the second-most important task for 46 percent of organizations in our big data integration research. Another is data integration, which some new tools can automate. This can enable lines of business and IT to work together on big data integration, as 41 percent of organizations in our research are planning to do. To address this need a new generation of technologies came into their own in 2014, including those that received Ventana Research Technology Innovation Awards, like Paxata and Tamr, but also Trifacta.
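As a toy illustration of what this kind of preparation involves (trimming whitespace, normalizing casing and coercing types before data is loaded), here is a minimal sketch using only the standard library; the sample fields are invented for the example and imply nothing about any vendor’s tooling:

```python
import csv
import io

# Messy input of the kind data preparation must clean up: inconsistent
# spacing in headers and values, mixed casing, numbers stored as text.
RAW = """customer, revenue ,region
Acme , 1200,EAST
 Beta,980, east
"""

def prepare(text):
    """Trim whitespace, normalize casing and coerce types row by row."""
    reader = csv.DictReader(io.StringIO(text))
    rows = []
    for row in reader:
        clean = {(k or "").strip(): (v or "").strip() for k, v in row.items()}
        clean["region"] = clean["region"].title()   # EAST / east -> East
        clean["revenue"] = float(clean["revenue"])  # string -> number
        rows.append(clean)
    return rows

for record in prepare(RAW):
    print(record)
```

The value of packaging steps like these as a reusable routine, rather than fixing each file by hand, is exactly the repeatability argument made above.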

Yet another area to watch is the convergence of big data and cloud computing. The proliferation of data sources in the cloud forces organizations to manage and integrate data from a variety of cloud and Internet sources, hence the rise of information as a service for business needs. Ventana Research Technology Innovation Award winner DataSift provides information as a service to blend social media data with other big data and analytics. Such techniques require more flexible environments for integration that can operate anywhere at any time. Dell Boomi, MuleSoft, SnapLogic and others now challenge established data integration providers such as Informatica and others including IBM, Oracle and SAP. Advances in master data management, data governance, data quality and integration backbones from providers such as Informatica and Information Builders help provide better consistency of any type of big data for any business purpose. In addition, our research finds that data security is critical for big data in 61 percent of organizations; only 14 percent said it is very adequate in their organization.

There is no doubt that big data is now widespread; almost 80 percent of organizations in our information optimization research, for example, will be using it in some form by the end of 2015. This is partly due to increased use across the lines of business; our research on next-generation customer analytics in 2014 shows that it is important to improving understanding of customers in 60 percent of organizations, is being used in one-fifth of organizations and will be in 46 percent by the end of this year. Similarly, our next-generation finance analytics research in 2014 finds big data important to 37 percent of organizations, with 13 percent using it today and 42 percent planning to by the end of 2015. And we have already measured how it will impact human capital management and HR and where organizations are leveraging it in this area of importance.

I invite you to download and peruse our big data agenda for 2015. We will examine how organizations can instrument information optimization processes that use big data and pass this guidance along. We will explore big data’s role in sales and product areas and produce new research on data and analytics in the cloud. Our research will uncover best practices that innovative organizations use not only to prepare and integrate big data but also more tightly unify it with analytics and operations across enterprise and cloud computing environments. For many organizations taking on this challenge and seeking its benefits will require new information platforms and methods to access and provide information as part of their big data deployments. (Getting consistent information across the enterprise is the top benefit of big data integration according to 39 percent of organizations.) We expect 2015 to be a big year for big data and information optimization. I look forward to providing more insights and information about big data and helping everyone get the most from their time and investments in it.

Regards,

Mark Smith

CEO and Chief Research Officer