
At the Informatica World 2014 conference, the company known for its data integration software unveiled the Intelligent Data Platform. In the last three years Informatica has expanded beyond data integration and now has a broad software portfolio that facilitates information management within the enterprise and through cloud computing. The Intelligent Data Platform forms a framework for its portfolio. This expression of broad potential is important for Informatica, which has been slow to position its products as capable of more than data integration. A large part of the value it provides lies in what its products can do to help organizations strengthen their enterprise architectures for managing applications and data. We see Informatica’s sweet spot in facilitating efficient use of data for business and IT purposes; we call this information optimization.

Informatica’s Intelligent Data Platform is built in three layers. The bottom layer is Informatica Vibe, the virtual data machine that I covered at its launch last year. Informatica Vibe won our Ventana Research 2013 Technology Innovation Award for information optimization. It virtualizes information management technology to operate on any platform, whether on-premises or in any form of cloud computing.

Above Informatica Vibe in the platform is a data infrastructure layer, which contains all the technologies that act upon data, from integration through archiving, masking, mastering, quality assurance, security, streaming and other tasks. At the core of this second layer is Informatica PowerCenter, which provides data integration and other capabilities central to processing data into information. PowerCenter provides parsing, profiling, joining and filtering and is also integral to data services through Informatica’s Data Integration Hub, which operates in a publish-and-subscribe model. The latest PowerCenter release, version 9.6, focuses on agility in development and offers a series of packaged editions with different levels of functionality; users choose among them to fit their requirements. This developer support includes advances in test data management and data masking for enterprise-class needs. There are editions for Informatica Data Quality, too. The latest release of Informatica MDM, 9.7, improves the user experience for data stewards along with enhanced performance and governance. Not much was mentioned at the conference about Informatica’s Product Information Management (PIM) offering, which our most recent Value Index vendor and product assessment rated Hot.
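To make the publish-and-subscribe model concrete, here is a minimal, generic sketch of the pattern a data hub of this kind follows; the class and method names are hypothetical illustrations, not Informatica APIs.

```python
# Minimal, generic sketch of the publish-and-subscribe pattern a data hub follows.
# The class and method names are illustrative only, not Informatica APIs.
from collections import defaultdict
from typing import Callable, Dict, List

class DataHub:
    """Topics decouple the systems that publish data from the systems that consume it."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # A consuming system registers interest in a topic, not in a specific source.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, record: dict) -> None:
        # A source system publishes a record once; the hub fans it out to all subscribers.
        for handler in self._subscribers[topic]:
            handler(record)

hub = DataHub()
hub.subscribe("customer_master", lambda rec: print("warehouse load:", rec))
hub.subscribe("customer_master", lambda rec: print("CRM sync:", rec))
hub.publish("customer_master", {"id": 42, "name": "Acme Corp"})
```

The point of the pattern is that producers and consumers of data never need to know about each other; the hub mediates delivery, which is what allows many downstream systems to share one publication of the data.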

The third layer is data intelligence. Here Informatica has added capabilities to organize, infer and recommend action from data and to provision and map data to business needs. In addition Informatica’s Business Glossary and Metadata Manager help establish consistent definitions and use of data for operational or analytical tasks. Informatica RulePoint, another product that was not mentioned much at the conference, processes events through workflow in a continuous, rule-based manner; depending on how processing is configured, it supports complex event processing or event streaming.
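For readers unfamiliar with the technique, the sketch below illustrates in generic terms what continuous rule-based event processing means: every arriving event is checked against a set of condition/action rules. The structure is purely illustrative and makes no assumptions about how RulePoint itself expresses rules.

```python
# A small, generic sketch of continuous rule-based event processing, shown only to
# illustrate the pattern; the rule structure is hypothetical, not RulePoint syntax.
from typing import Callable, Iterable, List, Tuple

Rule = Tuple[Callable[[dict], bool], Callable[[dict], None]]

def process(events: Iterable[dict], rules: List[Rule]) -> None:
    # Each arriving event is evaluated against every rule; matching rules trigger an action.
    for event in events:
        for condition, action in rules:
            if condition(event):
                action(event)

rules: List[Rule] = [
    (lambda e: e.get("type") == "trade" and e.get("amount", 0) > 10_000,
     lambda e: print("alert: large trade", e)),
]

stream = [
    {"type": "trade", "amount": 500},
    {"type": "trade", "amount": 25_000},  # triggers the alert
]
process(stream, rules)
```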

On top of the Intelligent Data Platform, Informatica has added a couple of new innovations. Project Springbok, being developed in its Innovation division and not yet released, is a tool for preparing data for analytics and operations. This new product will use Informatica’s expertise in providing access to and integration of data sources, which according to our information optimization benchmark research is the top analyst requirement in 39 percent of organizations. Despite data warehouse efforts, analysts and business users still have to access many data sources. Simplifying information is critical for nearly all organizations that have more than 16 data sources. Demonstrations showed that Springbok can dynamically create and automate the transformations that run in PowerCenter. It also offers access to a master reference to ensure that data is processed in a consistent manner. IT professionals gain visibility into what business units are doing so they can see how to help in provisioning data. Even in beta release Springbok has significant potential to address the range of data issues analysts face and reduce the time they spend on data-related tasks. Our research has shown for several years that this data challenge presses organizations to diversify the tools they use, and software vendors in this market have responded. Informatica will have to compete with more than a dozen others and demonstrate its superiority in integration. Our research finds that the lines of business and IT now share responsibility for information availability in 42 percent of organizations. Informatica will have to demonstrate its value to line-of-business analysts who are evaluating a new generation of tools for data and analytics.

A second innovation, a new data security product called Secure@Source that is also being developed in the Innovation unit, is designed to protect data assets where they are stored and processed. This product moves Informatica into the information security market segment. Secure@Source helps users discover, detect, assess and protect data assets in their persistent locations and during consumption by applications or Internet services. The question is whether Informatica can convince current customers to examine it or will have to approach information security professionals who are not users of Informatica. Security of data is among the top five required data activities according to our research and a key part of the manageability requirements that organizations find important in considering products. Informatica has an opportunity to insert itself into the dialogue in this area if it properly presents the new product to IT and business people alike.

In big data Informatica has made steady progress, but to reach its potential in this segment it will need to invest more in mixed big data environments, not just Hadoop. As our research has shown for three years, customers want big data to distribute processing and integration of data across sources. Our recent research on big data analytics finds that three out of four (76%) define big data analytics as accessing and analyzing all sources of data. This poses a challenge for data integration, and our new research on big data integration finds that most organizations have a long way to go in accessibility and mastering of data. Informatica is beginning to address this and has an opportunity to help develop a new generation of data architecture.

In cloud computing, the company has consolidated its efforts to ensure that the cloud is part of its core technology. It released new versions of its cloud-based integration, quality, master and real-time data management products; these begin to address the challenge of process and application integration, which are important considerations for businesses in determining whether to integrate or replace point cloud solutions to improve the efficiency of tasks and business processes. Informatica has continued to focus on integrating mostly with the large cloud computing providers and has yet to invest in streamlining processes in particular lines of business. This has left openings for other cloud integration providers to compete, making it harder than expected for Informatica to dominate in this segment. The next step here is up to Informatica.

I believe that one of the highest-potential opportunities for Informatica is in the application architectures of organizations whose business processes have been distributed across a collection of cloud-based applications that lack interconnectivity and integration. For example, finance departments often have software from different providers for budgeting and planning, consolidation and reporting, accounting and payroll management. When these applications are spread across the cloud, connecting them is a real challenge, let alone trying to get information from sales force automation and customer service applications. The implications of this are shown in our finance analytics research: data-related tasks consume the most time and impede the efficiency of financial processes, as they do in all other line-of-business areas that we have researched. Similar situations exist in customer-related areas (marketing, sales and customer service) and employee management processes (recruiting, onboarding, performance, compensation and learning). Informatica has made progress with Informatica Cloud Extend for interconnecting tasks across applications, which can help streamline processes. While perhaps not obvious to data integration specialists, this level of process automation and integration is essential to the future of cloud computing. Informatica also announced it will offer master data management in the cloud; this should help it not just to place a data hub in the cloud but also to help companies make separate cloud applications interoperate more efficiently.

Overall the Informatica Intelligent Data Platform is a good reference model for tasks related to turning data into information assets. But it could be more distinct in how its automation accelerates the processing of data and helps specific roles work faster and smarter. The platform also does not provide a context for enterprise architectures that are stretched between on-premises and various cloud deployments. Organizations will have to determine whether Informatica’s approach fits their future data and architectural needs. As Informatica pushes its platform approach, it has to ensure it is seen as a leader in big data integration, helping business analysts with data, supporting a larger number of application sources and connecting cloud computing by unifying business applications. This won’t be easy to accomplish, as Informatica has not been as progressive in the broader approach to big data and its use across operations and analytics.

Informatica has been growing substantially and is getting close to US$1 billion in annual software revenue. We have recognized its success by rating it a Hot vendor in our Data Integration Value Index and naming one of its customers, the CIO of UMass Memorial Health Care, the Chief Information Officer in our 2013 Leadership Awards. Informatica has been continuing substantial investment in R&D. Its acquisitions of data-related software companies have helped it grow, and Informatica has invested to integrate those products with PowerCenter. With almost half (49%) of organizations planning to change their information availability processes, the opportunity for Informatica is significant; its challenge is to gain the confidence and recognition of business customers, who now play a larger role in the selection and purchasing of software. This will require Informatica to speak the language of business, addressing not just technology but the business processes for which its customers are held accountable. Informatica is a major player in information management; now it must become as significant a choice for streamlining business processes and the use of applications and data across the enterprise and cloud computing to enable information optimization.

Regards,

Mark Smith

CEO & Chief Research Officer

I recently attended the 2014 global analyst summit in San Francisco hosted by Pitney Bowes, an old technology company (now in business for 94 years) that has a new focus in its software along with an entirely new executive team. These leaders unveiled a business and technology strategy meant to demonstrate the company’s commitment to software. For many years it has been known mostly for mail services and postage metering, but Pitney Bowes also has made investments in software that can help companies change their business processes by optimizing their information assets. Over the past few years the company has had its ups and downs with regard to its corporate mission, as I wrote in 2012. Most of the turmoil was due to conflicting agendas from past management, but other factors were that the company was not clear in communicating the value of its combined software portfolio and did not capitalize on the demand in lines of business and IT for information management and analytics software.

Pitney Bowes has several important assets. Its location intelligence software can provide consumers with accurate information about and directions to locations, enable businesses to target customers more accurately and help businesses be more responsive by adding a geographic context to customer information. Pitney Bowes also has advanced its efforts in information management to integrate and enrich data through spatial processing, with a specialization in customer information. My analysis of Spectrum, its information management software suite, found innovation in its support for analytical and operational use. Organizations are changing how they make information available as lines of business and IT share responsibility for improving information availability, which they do in 42 percent of organizations participating in our information optimization research; the most common reasons for doing so are to improve operational efficiency (cited by 67%) and to gain a competitive advantage (63%).

The latest version of the Spectrum Technology Platform includes data enrichment and quality in its data integration offering. It has advanced in search, query design, in-memory caching and support for Hadoop. The platform also is the foundation for a master information hub that can build relationship-based maps that Pitney Bowes calls knowledge graphs. These maps are more powerful than conventional data-relationship models, which can’t map complex relationships and present them visually. In IT domains this is called master data management, but it goes beyond the usual entity relationship modeling to visualize and manage customer information from a business perspective; this can help bring business users into the process. It also can discover the locations of any information, which our information optimization research finds is important to more than one-fifth (22%) of companies. Having consistent information, particularly about customers, is critical for interaction across a business in providing the best possible customer experience. The Spectrum Platform also can process data in real time, which is attractive to the company’s new customers, including large retailers and social media companies like Facebook. We find that only one-fourth (26%) of organizations are happy with their current technology used to provide information, which indicates an opportunity for Pitney Bowes to take a more aggressive position in the market.
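As a rough illustration of what a relationship-based map adds over a flat record, the sketch below models a few customer relationships as a small graph and walks the direct neighbors of one entity; the entities and relationship names are invented for illustration and are not Spectrum constructs.

```python
# Illustrative sketch of the idea behind a relationship-based "knowledge graph" of
# customer information, using plain Python structures; the entities and relationship
# names are made up, not Spectrum constructs.
from collections import defaultdict
from typing import Dict, List, Tuple

graph: Dict[str, List[Tuple[str, str]]] = defaultdict(list)

def relate(subject: str, predicate: str, obj: str) -> None:
    # Record a directed, named relationship between two entities.
    graph[subject].append((predicate, obj))

relate("Jane Doe", "lives_at", "12 Main St, Springfield")
relate("Jane Doe", "member_of", "Household 981")
relate("Household 981", "holds", "Checking Account 7734")
relate("Jane Doe", "spouse_of", "John Doe")

def neighbors(entity: str) -> List[Tuple[str, str]]:
    """Everything directly related to an entity, e.g. for a data steward's 360-degree view."""
    return graph.get(entity, [])

for predicate, obj in neighbors("Jane Doe"):
    print(f"Jane Doe --{predicate}--> {obj}")
```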

In addition Pitney Bowes has customer and marketing analytics software called Portrait Analytics, which offers an engaging way to visualize and interact with customer information. It also can predict potential results through the Portrait Miner application. Another product, Portrait Uplift, helps companies apply adaptive learning from customer interactions to model how customers will behave and where changes might be needed to stimulate purchasing and reduce churn. These applications have significant potential for marketing and other customer-focused functions because they do not require a data scientist to use them. In our next-generation customer analytics research more than half (59%) of participants said it is very important to improve customer analytics, while only 15 percent are satisfied with their current efforts. Predictive analytics is the type of advanced analytics most important to 69 percent of organizations, and only one-fifth (22%) are happy with their current software. Pitney Bowes is in position to pursue opportunities here as long as it focuses its efforts on the top drivers: improving the customer experience (63%) and improving the customer service strategy (57%).

The company has not lost its focus on location intelligence software. Its flagship product here remains MapInfo, but it has made investments to highlight its potential for enterprise location intelligence by supporting it in social media and Internet location-based services. A series of new MapInfo releases beyond the current version 12 is coming in 2014. The company says that enhancements will include a refined user experience, a more contextual interface, 64-bit processing, a layout designer, contextual menus and display management. To take advantage of multiple-core processing, Pitney Bowes has segmented processes to accelerate performance. Our latest research in location analytics found that reliability, which includes performance and scalability, is the third-most important software evaluation criterion. The updates also will expand mapping from vector-based to grid-based analysis. In addition mobile technology is a priority for MapInfo, as it should be: this is the second-most important technology innovation according to our location analytics research. MapInfo Stratus is a cloud-based extension of MapInfo Pro. Some support for mobile devices is available today, and more improvements to the experience are coming. This and other advances address the innovations that are changing computing and users’ expectations and are critical to keeping MapInfo relevant.

Pitney Bowes built its position in location intelligence on geocoding, integrating data sets across on-premises and cloud systems for access from a range of applications. Pitney Bowes includes 122 countries in its geocoding, and its software can provide multiple levels of accuracy based on what is available, in a cascading approach. It also provides reverse geocoding, which identifies locations from longitude and latitude; this is used by major social media. Pitney Bowes has advanced the capabilities so users can type ahead to find locations in proximity; this is especially critical for mobile application support. The ubiquity of mobile devices and Internet use and the growth of complex event processing and visualization are bringing new opportunities for Pitney Bowes’ geocoding and location processing. Our location analytics research finds that many organizations view current methods as unreliable, resource-intensive and too slow. Using a dedicated approach can deliver business value such as improved customer experience (20%) and competitive advantage (17%).
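As a conceptual aside, the sketch below shows the core idea behind reverse geocoding: finding the nearest known place to a pair of coordinates, here using the haversine distance formula. The reference places are made up, and a production geocoder such as Pitney Bowes’ works against full reference data with cascading accuracy levels rather than a tiny lookup table.

```python
# Minimal sketch of what reverse geocoding does conceptually: find the nearest known
# place to a latitude/longitude pair using the haversine formula. The reference places
# below are made up; real geocoders work against full reference data for many countries.
from math import radians, sin, cos, asin, sqrt

PLACES = {
    "Stamford, CT": (41.0534, -73.5387),
    "San Francisco, CA": (37.7749, -122.4194),
    "Troy, NY": (42.7284, -73.6918),
}

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def reverse_geocode(lat: float, lon: float) -> str:
    """Return the closest known place name to the given coordinates."""
    return min(PLACES, key=lambda name: haversine_km(lat, lon, *PLACES[name]))

print(reverse_geocode(41.05, -73.54))  # -> Stamford, CT
```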

Pitney Bowes has announced the release of Spectrum Spatial, a location intelligence platform built on MapInfo. Additionally, Spectrum Spatial Analyst is an interactive tool for examining the spatial and location attributes of data. The Spectrum Spatial technology and Spectrum Spatial for BI are the basis for Pitney Bowes establishing new partnerships with IBM and SAP. Having location analytics with business intelligence is still rare; it has potential to enhance business analysis, as found in almost a third (30%) of organizations, while using a dedicated approach with GIS and location analytics provides satisfaction in almost half (49%) of organizations.

Our big data research finds that only 16 percent of organizations are using geospatial analysis in big data analytics; we think this is an overlooked opportunity given the value of location in enhancing information assets. Pitney Bowes has focused on providing location intelligence to suppliers of standard RDBMSs and data warehousing, which are only a subset of the big data environment. But the company realizes the importance of big data and has announced expanded support for it in Spectrum to ensure it can rapidly access these new sources. This includes new support for in-memory computing with SAP HANA (important to 50% of organizations) and for Hadoop (42%), the two types of big data support our research finds most important.

Overall Pitney Bowes software combines data management and analytics with its specialization in location to meet today’s need to optimize information in real time to support both consumers and business decision-makers. In the past year it has gained traction with social media companies and other types of businesses through direct approaches. Its main challenges are that its brand is not known for solving these types of problems and that it has not been able to assert its presence through marketing. Our research on buyers and users of software for information management and big data analytics has identified demand for these capabilities; Pitney Bowes should use its own software to market and sell its products as well as the technology deserves. Its new executives have made the strategy clearer; now it is time for the organization to execute it through more and better marketing to ensure that potential customers consider its products that deliver business insights.

Regards,

Mark Smith

CEO & Chief Research Officer
