Big data has become a big deal as the technology industry has invested tens of billions of dollars to create the next generation of databases and data processing. After the accompanying flood of new categories and marketing terminology from vendors, most in the IT community are now beginning to understand the potential of big data. Ventana Research thoroughly covered the evolving state of the big data and information optimization sector in 2014 and will continue this research in 2015 and beyond. As it progresses, the importance of making big data systems interoperate with existing enterprise and information architecture, along with digital transformation strategies, becomes critical. Done properly, companies can take advantage of big data innovations to optimize their established business processes and execute new business strategies. But deploying big data and applying analytics to understand it is just the beginning. Innovative organizations must go beyond the usual exploratory and root-cause analyses through applied analytic discovery and other techniques. This of course requires them to develop competencies in information management for big data.

Among big data technologies, open source Hadoop has been commercialized by now established providers including Cloudera, Hortonworks and MapR and made available in the cloud through platforms such as Qubole, which received a Ventana Research Technology Innovation Award in 2014. Other big data technologies are growing as well; for example, use of in-memory and specialized databases, like Hadoop, is growing in more than 40 percent of organizations, according to our big data integration benchmark research. These technologies have been integrated into databases or what I call hybrid big data appliances like those from IBM, Oracle, SAP and Teradata that bring the power of Hadoop to the RDBMS and exploit in-memory processing to perform ever faster computing. When placed into hosted and cloud environments these appliances can virtualize big data processing. Another new provider, Splice Machine, brings the power of SQL processing in a scalable, cloud-based approach that uses Hadoop; it received a Ventana Research Technology Leadership Award last year. Likewise advances in NoSQL approaches help organizations process and utilize semistructured information, blending it with other information and analytics as Datawatch does. These examples show that disruptive technologies still have the potential to revolutionize our approaches to managing information.

Our firm also explores what we call information optimization, which assesses techniques for gaining full value from business information. Big data is one of these when used effectively in an enterprise information architecture. In this context the “data lake” analogy is not helpful in representing the full scope of big data, suggesting simply a container like a data mart or data warehouse. With big data, taking an architectural approach is critical. This viewpoint is evident in our 2014 Ventana Research Technology Innovation Award in Information Management to Teradata for its Unified Data Architecture. Another award winner, Software AG, blends big data and information optimization using its real-time and in-memory processing technologies.

Businesses need to process data in rapid cycles, many in real time, through what we call operational intelligence, which utilizes events and streams and provides the ability to sense and respond immediately to issues and opportunities in organizations that adapt to a data-driven culture. Our operational intelligence research finds that monitoring, alerting and notification are the top use cases for deployment, in more than half of organizations. Also machine data can help businesses optimize not just IT processes but business processes that help govern and control the security of data in the enterprise. This imperative is evident in the dramatic growth of suppliers such as Splunk, Sumo Logic and Savi Technology, all of which won Ventana Research Technology Innovation awards for how they process machine and business data in large volumes at rapid velocity.
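The monitoring-and-alerting use case at the heart of operational intelligence can be illustrated with a minimal sketch. The event fields, metric names and thresholds below are hypothetical examples, not drawn from any vendor's product:

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str
    metric: str
    value: float

# Hypothetical thresholds; a real deployment would load these from configuration.
THRESHOLDS = {"cpu_load": 0.90, "error_rate": 0.05}

def monitor(events):
    """Scan a stream of events and emit an alert for each threshold breach."""
    alerts = []
    for event in events:
        limit = THRESHOLDS.get(event.metric)
        if limit is not None and event.value > limit:
            alerts.append(f"ALERT {event.source}: {event.metric}={event.value} exceeds {limit}")
    return alerts

stream = [
    Event("web-01", "cpu_load", 0.95),
    Event("web-02", "error_rate", 0.01),
]
print(monitor(stream))
```

Production systems apply this same sense-and-respond pattern continuously over event streams rather than over a finished list.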

Another increasing trend in big data is presenting it in ways that ordinary users can understand quickly. Discovery and advanced visualization are not enough for business users who are not trained to interpret these presentations. Some vendors can present location and geospatial data on maps that are easier to understand. At the other end of the user spectrum, data scientists and analysts need more robust analytic and discovery tools, including predictive analytics, which is a priority for many organizations, according to our big data analytics research. In 2015 we will examine the next generation of predictive analytics in new benchmark research. But there is more work to do to present insights from information in ways that are easy to understand. Some analytics vendors are telling stories by linking pages of content, but these narratives don’t as yet help individuals assess and act. Most analytics tools can’t match the simple functionality of Microsoft PowerPoint, placing descriptive titles, bullets and recommendations on a page with a graphic that represents something important to the business professionals who read it. Deeper insights may come from advances in machine learning and cognitive computing that have arrived on the market and bring more science to analytics.

So we see strong potential for the outputs of big data, but they don’t arrive just by loading data into these new computing environments. Pragmatic and experienced professionals realize that information management processes do not disappear. A key one in this area is data preparation, which helps ready data sets for processing into big data environments. Preparing data is the second-most important task for 46 percent of organizations in our big data integration research. Another is data integration, which some new tools can automate. This can enable lines of business and IT to work together on big data integration, as 41 percent of organizations in our research are planning to do. To address this need a new generation of technologies came into their own in 2014, including those that received Ventana Research Technology Innovation Awards like Paxata and Tamr but also Trifacta.
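The kind of preparation work these tools automate can be sketched with a few common steps: trimming whitespace, normalizing case, dropping incomplete rows and removing duplicates. The sample data is invented for illustration:

```python
import csv
import io

# Hypothetical raw extract; in practice this comes from source systems.
raw = """customer,region,revenue
Acme , east,1200
acme,EAST,1200
Globex,west,
"""

def prepare(text):
    """Typical preparation steps: trim, normalize case, drop incomplete rows, dedupe."""
    rows, seen = [], set()
    for row in csv.DictReader(io.StringIO(text)):
        cleaned = {k: v.strip().lower() for k, v in row.items()}
        if not all(cleaned.values()):   # drop rows with missing fields
            continue
        key = tuple(cleaned.values())
        if key in seen:                 # drop exact duplicates
            continue
        seen.add(key)
        rows.append(cleaned)
    return rows

print(prepare(raw))
```

Commercial preparation tools add profiling, suggestion and lineage on top of steps like these, but the underlying transformations are of this shape.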

Yet another area to watch is the convergence of big data and cloud computing. The proliferation of data sources in the cloud forces organizations to manage and integrate data from a variety of cloud and Internet sources, hence the rise of information as a service for business needs. Ventana Research Technology Innovation Award winner DataSift provides information as a service to blend social media data with other big data and analytics. Such techniques require more flexible environments for integration that can operate anywhere at any time. Dell Boomi, MuleSoft, SnapLogic and others now challenge established data integration providers such as Informatica as well as IBM, Oracle and SAP. Advances in master data management, data governance, data quality and integration backbones from providers such as Informatica and Information Builders help provide better consistency of any type of big data for any business purpose. In addition our research finds that data security is critical for big data in 61 percent of organizations; only 14 percent said it is very adequate in their organization.

There is no doubt that big data is now widespread; almost 80 percent of organizations in our information optimization research, for example, will be using it in some form by the end of 2015. This is partly due to increased use across the lines of business; our research on next-generation customer analytics in 2014 shows that it is important to improving understanding of customers in 60 percent of organizations, is being used in one-fifth of organizations and will be in 46 percent by the end of this year. Similarly our next-generation finance analytics research in 2014 finds big data important to 37 percent of organizations, with 13 percent using it today and 42 percent planning to by the end of 2015. And we have already measured how it will impact human capital management and HR and where organizations are leveraging it in this area of importance.

I invite you to download and peruse our big data agenda for 2015. We will examine how organizations can instrument information optimization processes that use big data and pass this guidance along. We will explore big data’s role in sales and product areas and produce new research on data and analytics in the cloud. Our research will uncover best practices that innovative organizations use not only to prepare and integrate big data but also more tightly unify it with analytics and operations across enterprise and cloud computing environments. For many organizations taking on this challenge and seeking its benefits will require new information platforms and methods to access and provide information as part of their big data deployments. (Getting consistent information across the enterprise is the top benefit of big data integration according to 39 percent of organizations.) We expect 2015 to be a big year for big data and information optimization. I look forward to providing more insights and information about big data and helping everyone get the most from their time and investments in it.

Regards,

Mark Smith

CEO and Chief Research Officer

At the Informatica World 2014 conference, the company known for its data integration software unveiled the Intelligent Data Platform. In the last three years Informatica has expanded beyond data integration and now has a broad software portfolio that facilitates information management within the enterprise and through cloud computing. The Intelligent Data Platform forms a framework for its portfolio. This expression of broad potential is important for Informatica, which has been slow to position its products as capable of more than data integration. A large part of the value it provides lies in what its products can do to help organizations strengthen their enterprise architectures for managing applications and data. We see Informatica’s sweet spot in facilitating efficient use of data for business and IT purposes; we call this information optimization.

Informatica’s Intelligent Data Platform is built in three layers. The bottom layer is Informatica Vibe, the virtual data machine that I covered at its launch last year. Informatica Vibe won our Ventana Research 2013 Technology Innovation Award for information optimization. It virtualizes information management technology to operate on any platform whether on-premises or in any form of cloud computing.

Above Informatica Vibe in the platform is a data infrastructure layer, which contains all the technologies that act upon data, from integration through archiving, masking, mastering, quality assurance, security, streaming and other tasks. At the core of this second layer is Informatica PowerCenter, which provides data integration and other capabilities central to processing of data into information. PowerCenter provides parsing, profiling, joining and filtering but also is integral for data services through Informatica’s Data Integration Hub that operates in a publish-and-subscribe model. The latest PowerCenter release, version 9.6, focuses on providing agility in development and provides a series of packaged editions that provide certain levels of functionality; users choose among them to fit their requirements. This developer support includes advances in test data management and data masking for enterprise-class needs. There are editions for Informatica Data Quality, too. The latest release of Informatica MDM, 9.7, improves the user experience for data stewards along with enhanced performance and governance. Not much was mentioned at the conference about Informatica’s Product Information Management (PIM) offering that our most recent Value Index vendor and product assessment rated Hot.
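The publish-and-subscribe model behind a data hub can be sketched in a few lines. This is an illustrative pattern only; the class and topic names are hypothetical and do not correspond to Informatica APIs:

```python
from collections import defaultdict

class DataHub:
    """Minimal publish-and-subscribe hub: publishers push to topics,
    subscribers register callbacks and receive each published payload."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self._subscribers[topic]:
            callback(payload)

hub = DataHub()
received = []
hub.subscribe("customer_updates", received.append)
hub.publish("customer_updates", {"id": 42, "status": "active"})
print(received)
```

The value of this pattern for data services is decoupling: producers publish once, and any number of downstream consumers subscribe without the producer knowing about them.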

The third layer is data intelligence. Here Informatica has added capabilities to organize, infer and recommend action from data and to provision and map data to business needs. In addition Informatica’s Business Glossary and Metadata Manager help establish consistent definitions and use of data for operational or analytical tasks. Informatica RulePoint, a product that also was not mentioned much at the conference, processes events through workflow in a continuous, rule-based manner; depending on how processing occurs, its function is to support complex event processing or event streaming.
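Continuous rule-based event processing of this kind pairs conditions with actions and runs every event through the rule set. The sketch below is a generic illustration of the technique, not RulePoint's actual engine; the rule and event fields are invented:

```python
def make_engine(rules):
    """Build a processor from (condition, action) pairs: every event is
    checked against every rule, and matching rules fire their action."""
    def process(events):
        actions = []
        for event in events:
            for condition, action in rules:
                if condition(event):
                    actions.append(action(event))
        return actions
    return process

# Hypothetical rule: three or more failed logins triggers an account lock.
rules = [
    (lambda e: e["type"] == "login_failure" and e["count"] >= 3,
     lambda e: f"lock account {e['user']}"),
]
engine = make_engine(rules)
print(engine([
    {"type": "login_failure", "count": 3, "user": "alice"},
    {"type": "login_failure", "count": 1, "user": "bob"},
]))
```

A full complex event processing engine adds time windows and correlation across events, but the condition-action core is the same.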

On top of the Intelligent Data Platform, Informatica has added a couple of new innovations. Project Springbok, which is being developed in its Innovation division and is not yet released, is a tool for preparing data for analytics and operations. This new product will use Informatica’s expertise in providing access to and integration of data sources, which according to our information optimization benchmark research is the top analyst requirement in 39 percent of organizations. Despite data warehouse efforts, analysts and business users still have to access many data sources. Simplifying information is critical for nearly all organizations that have more than 16 data sources. Demonstrations showed that Springbok can dynamically create and automate the transformations that run in PowerCenter. It also offers access to a master reference to ensure that data is processed in a consistent manner. IT professionals gain visibility into what business units are doing to show how they can help in provisioning data. Even in beta release Springbok has significant potential to address the range of data issues analysts face and reduce the time they spend on data-related tasks. Our research has shown for several years that this data challenge presses organizations to diversify the tools they use, and software vendors in this market have responded. Informatica will have to compete with more than a dozen others and demonstrate its superiority for integration. Our research finds that the lines of business and IT now share responsibility for information availability in 42 percent of organizations. Informatica will have to demonstrate its value to line-of-business analysts who are evaluating a new generation of tools for data and analytics.

A second innovation is a new data security product called Secure@Source, also being developed in the Innovation unit, which is designed to protect data assets where they are stored and processed. This product moves Informatica into the information security market segment. Secure@Source helps users discover, detect, assess and protect data assets in their persistent locations and during consumption by applications or Internet services. The question is whether Informatica can convince current customers to examine it or will have to approach information security professionals who are not users of Informatica. Security of data is among the top five required data activities according to our research and a key part of the manageability requirements that organizations find important in considering products. Informatica has an opportunity to insert itself into the dialogue in this area if it properly presents the new product to IT and business people alike.

In big data Informatica has made steady progress, but reaching its potential in this segment will require more investment in mixed big data environments, not just Hadoop. As our research has shown for three years, customers want big data to distribute processing and integration of data across sources. Our recent research on big data analytics finds that three out of four (76%) define big data analytics as being about accessing and analyzing all sources of data. This poses a challenge for data integration, and our new research on big data integration finds that most have a long way to go in accessibility and mastering of data. Informatica begins to address this and has an opportunity to help develop a new generation of data architecture.

In cloud computing, the company has consolidated its efforts to ensure that the cloud is part of its core technology. It released new versions of its cloud-based integration, quality, master and real-time data management products; these begin to address the challenge of process and application integration, which are important considerations for businesses in determining whether to integrate or replace point cloud solutions to improve the efficiency of tasks and business processes. Informatica has continued to focus on integrating mostly with the large cloud computing providers and has yet to invest in streamlining processes in particular lines of business. This has left openings for other cloud integration providers to compete, making it harder than expected for Informatica to dominate in this segment. The next step here is up to Informatica.

I believe that one of the highest-potential opportunities for Informatica is in the application architectures of organizations whose business processes have been distributed through a collection of cloud-based applications that lack interconnectivity and integration. For example, finance departments often have software from different providers for budgeting and planning, consolidation and reporting, accounting and payroll management. When these applications are spread across the cloud, connecting them is a real challenge, let alone trying to get information from sales force automation and customer service applications. The implications of this are shown in our finance analytics research: data-related tasks consume the most time and impede the efficiency of financial processes, as they do in all other line-of-business areas that we have researched. Similar situations exist in customer-related areas (marketing, sales and customer service) and employee management processes (recruiting, onboarding, performance, compensation and learning). Informatica has made progress with Informatica Cloud Extend for interconnecting tasks across applications, which can help streamline processes. While perhaps not obvious to data integration specialists, this level of process automation and integration is essential to the future of cloud computing. Informatica also announced it will offer master data management in the cloud; this should help it not just to place a data hub in the cloud but to help companies interoperate separate cloud applications more efficiently.

Overall the Informatica Intelligent Data Platform is a good reference model for tasks related to turning data into information assets. But it could be more distinctive in how its automation accelerates the processing of data and helps specific roles work faster and smarter. This platform does not provide a context for enterprise architectures that are stretched between on-premises and various cloud deployments. Organizations will have to determine whether Informatica’s approach fits their future data and architectural needs. As Informatica pushes its platform approach, it has to ensure it is seen as a leader in big data integration, helping business analysts with data, supporting a larger number of application sources and connecting cloud computing through unifying business applications. This won’t be easy to accomplish, as Informatica has not been as progressive in the broader approach to big data and its use across operations and analytics.

Informatica has been growing substantially and is getting close to US$1 billion in annual software revenue. We have recognized its success by rating it a Hot vendor in our Data Integration Value Index and naming one of its customers, the CIO of UMass Memorial Health Care, the Chief Information Officer in our 2013 Leadership Awards. Informatica has been continuing substantial investment in R&D. Its acquisitions of data-related software companies have helped it grow, and Informatica has invested to integrate those products with PowerCenter. With almost half (49%) of organizations planning to change their information availability processes, the opportunity for Informatica is significant; its challenge is to gain the confidence and recognition of business customers, who now play a larger role in the selection and purchasing of software. This will require Informatica to speak their language, addressing not just technology but the business processes for which they are held accountable. Informatica is a major player in information management; now it must become as significant a choice for streamlining business processes and the use of applications and data across the enterprise and cloud computing to enable information optimization.

Regards,

Mark Smith

CEO & Chief Research Officer
