In recent years line-of-business applications for accounting, human resources, manufacturing, sales and customer service have appeared in the cloud. Cloud-based software as a service (SaaS) has replaced on-premises applications that were previously part of ERP and CRM environments. These applications have helped companies become more efficient but have also introduced interoperability challenges between business processes. Their advantage is that cloud software can be rented, configured and put to use within a day or a week. The disadvantage is that they do not always connect with one another as seamlessly as their on-premises predecessors did, and when they are managed by a third party there is limited connectivity for integrating them.
Smooth interoperation is critical for business processes that use ERP. My colleague Robert Kugel assessed the hybrid computing approach to ERP and identified the challenges in these early approaches to ERP in the cloud. As on-premises applications, ERP systems were fully connected and integrated, with process and data integration wired together through additional technologies. The configurability of ERP has been a large challenge and has led to failed implementations, as Robert noted. Understanding the complexities of ERP today is key to coming up with a solution.
One of the most fundamental business processes in organizations is tracking an order through to fulfillment. That process needs to be connected without manual intervention, both for transactional efficiency and to ensure that data, analytics and planning are available. In fact, our next-generation finance analytics benchmark research finds that ERP is the second-most important source of data being analyzed, after spreadsheets, and is the first-ranked source in almost one-third (32%) of organizations. The transition to cloud computing has made this more complex. Applications that support the order-to-fulfillment process, for example, are increasingly used individually in cloud computing, and many are mingled with on-premises applications, requiring integration and automation to avoid manual intervention.
To start, orders are created by sales: opportunities are created and closed, with products and services purchased by contract or electronically. These must be placed in an order management system through which the record can be used by a multitude of systems and processes. Purchasing and fulfillment details are moved into an invoice as part of a billing application, then entered into accounting and accounts receivable, where the order is managed through to payment. Finance takes orders and consolidates them into reports, typically in a financial performance management system. The order is then provided to manufacturing or fulfillment applications that build and deliver products, and fulfilled through warehouse and distribution management. The customer name and order number are also placed into an application that handles support calls or deploys services.
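The handoffs just described can be sketched as a simple pipeline in which one order record flows through every system without re-entry. This is a minimal illustration only; all function names, fields and status values here are hypothetical stand-ins for the separate applications involved.

```python
# A minimal sketch of the order-to-fulfillment flow described above.
# Each function stands in for a separate application; every field and
# status value is hypothetical.

def create_order(opportunity):
    """Sales closes an opportunity and creates the order record."""
    return {"order_id": opportunity["id"], "customer": opportunity["customer"],
            "items": opportunity["items"], "status": "ordered"}

def bill_order(order):
    """Billing turns purchasing details into an invoice for receivables."""
    order["invoice_total"] = sum(i["price"] * i["qty"] for i in order["items"])
    order["status"] = "invoiced"
    return order

def fulfill_order(order):
    """Fulfillment builds and ships; customer support gets a reference."""
    order["status"] = "fulfilled"
    order["support_reference"] = f"CASE-{order['order_id']}"
    return order

# The point of automation: the same record moves through every stage
# with no manual re-entry or copy and paste.
opportunity = {"id": 1001, "customer": "Acme",
               "items": [{"sku": "W-1", "price": 25.0, "qty": 4}]}
order = fulfill_order(bill_order(create_order(opportunity)))
```

When any of these handoffs is a different vendor's cloud application, the connective code above is exactly what is missing and must be supplied by integration.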
This is the reality of business today. Many departments and individuals, with different applications, are needed to support orders, fulfillment and service. If these applications are not connected, organizations have to intervene manually, re-enter data or copy and paste, which not only wastes time and resources but can introduce errors. More than half (56%) of organizations do the integration through spreadsheets or by exporting data, with custom coding second-most popular (39%), according to our business data in the cloud research. (I will leave HR applications and the supporting people component out of this process, though they play a critical role as an enterprise resource supporting order to fulfillment and are part of many ERP deployments.) In addition, performing any level of business planning is not simple, as data is needed to determine past performance and plan for the future.
There is no simple way to make all this efficient. It has historically been managed by ERP, usually through a single vendor's application suite. Now businesses want it done in the cloud, integrated with on-premises systems or with other cloud applications. Proper attention must be paid to the needs and competencies of the departments and business processes involved. Thus migration and transition to ERP are not simple, nor is building an application architecture that supports process and data efficiently; assessment and planning are necessary to ensure that risks are addressed. Switching to a new ERP system in the cloud will still require an application architecture that maintains proper operations across departments and existing applications, whether they are on-premises, in a private cloud, in the public cloud or in a hybrid combination. Integration among sales force automation, customer service and other systems is outside the scope of most cloud ERP deployments, which have not provided robust integration points for applications and data.
Robert Kugel writes that ERP must take a giant leap in order to operate in the cloud. I agree. Our firm often gets requests for assistance in finding the right approach to ERP and business process. While midsize companies find ERP in the cloud increasingly attractive, there are significant challenges to adapting and integrating such applications as part of business processes, which many customers overlook in their desire for a cloud-based solution. The majority of cloud ERP vendors have not provided integration and workflow of information from their applications to others in an open and seamless manner, complicating deployments and adding unexpected costs for customers.
ERP suppliers moving from on-premises to cloud computing have acknowledged the complexities. Many legacy ERP vendors have struggled to enable application interoperability and to meet the differing management requirements of IT and the business. This struggle has undermined the confidence of organizations wishing to migrate to ERP in the cloud and makes them wonder whether to consider an alternative approach: individual applications with integration of processes and data across them to support business processes. Automation is another major concern; the lack of it across business processes has impeded finance groups in closing the books efficiently, and our research on the fast, clean close shows many organizations losing four or more days.
In the meantime, advances in integration technologies that operate in cloud computing environments can interconnect with on-premises systems. The ability to simultaneously distribute transactions and data across systems makes architecting business processes and workflows a reality. Some organizations are adopting business process management and other integration technologies. The new technologies can blend into many applications, so that users do not know they are working with applications from different vendors, nor do they need to know. These advances enable application architectures to interoperate and automate the flow of data between on-premises and cloud computing environments, providing new opportunities to interoperate from ERP or other applications. Many organizations are doing this today, and more than one-third of companies are planning or need to move data from cloud to cloud, cloud to on-premises or on-premises to cloud, according to our business data in the cloud research.
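The "distribute transactions across systems" capability mentioned above amounts to a fan-out pattern: one transaction is pushed to adapters for several systems at once, on-premises or cloud. The sketch below is a generic illustration under assumed names; real middleware adds queuing, retries, transformation and security.

```python
# A generic fan-out sketch: one transaction delivered to several connected
# systems at once. Endpoint names are hypothetical placeholders.

class Endpoint:
    """Stands in for an adapter to an on-premises or cloud application."""
    def __init__(self, name):
        self.name = name
        self.received = []   # records delivered to this system

    def deliver(self, record):
        self.received.append(record)

def distribute(record, endpoints):
    """Push the same transaction to every connected system."""
    for ep in endpoints:
        ep.deliver(record)
    return [ep.name for ep in endpoints]

erp = Endpoint("on_premises_erp")
crm = Endpoint("cloud_crm")
wms = Endpoint("cloud_warehouse")
delivered_to = distribute({"order_id": 42, "amount": 99.5}, [erp, crm, wms])
```

Because each endpoint receives its own copy of the record, no user has to re-enter the transaction in any of the downstream applications.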
No matter whether a company is in manufacturing or services, it must address these integration complexities to gain efficiency and support growth without adding more resources. While many new cloud ERP providers are taking simpler approaches, from the application interface to how transactions are processed, most have not learned the importance of integration with other systems and the need to access and integrate data for transactions and analytics. Inconsistency of data across applications and systems is a major obstacle in more than one-fifth (21%) of organizations and a minor obstacle in two-fifths (43%), and they find similar difficulties in the complexities of accessing data, according to our business data in the cloud research. In addition these providers lack effective planning (which of course is the P in ERP), and their reporting is less than sufficient and often must be complemented by third-party tools.
All this introduces yet more complexity for business and IT in determining how to move forward with ERP in the cloud, adapting existing and new applications to interoperate. The outlook for ERP in the cloud is thus uncertain. If these vendors do not adapt to the reality of what their customers want (as opposed to what the vendors want their customers to do), it will remain cloudy. By responding to pressure to take an open approach that is easy to integrate, however, ERP providers could see a sunny forecast.
CEO & Chief Research Officer
At the Informatica World 2014 conference, the company known for its data integration software unveiled the Intelligent Data Platform. In the last three years Informatica has expanded beyond data integration and now has a broad software portfolio that facilitates information management within the enterprise and through cloud computing. The Intelligent Data Platform forms a framework for its portfolio. This expression of broad potential is important for Informatica, which has been slow to position its products as capable of more than data integration. A large part of the value it provides lies in what its products can do to help organizations strengthen their enterprise architectures for managing applications and data. We see Informatica’s sweet spot in facilitating efficient use of data for business and IT purposes; we call this information optimization.
Informatica’s Intelligent Data Platform is built in three layers. The bottom layer is Informatica Vibe, the virtual data machine that I covered at its launch last year; it won our Ventana Research 2013 Technology Innovation Award for information optimization. Vibe virtualizes information management technology to operate on any platform, whether on-premises or in any form of cloud computing.
Above Vibe in the platform is a data infrastructure layer, which contains all the technologies that act on data, from integration through archiving, masking, mastering, quality assurance, security, streaming and other tasks. At the core of this second layer is Informatica PowerCenter, which provides data integration and other capabilities central to processing data into information. PowerCenter provides parsing, profiling, joining and filtering, and it is also integral to data services through Informatica’s Data Integration Hub, which operates in a publish-and-subscribe model. The latest PowerCenter release, version 9.6, focuses on agility in development and offers a series of packaged editions with set levels of functionality; users choose among them to fit their requirements. This developer support includes advances in test data management and data masking for enterprise-class needs. There are editions of Informatica Data Quality, too. The latest release of Informatica MDM, 9.7, improves the user experience for data stewards along with enhanced performance and governance. Not much was said at the conference about Informatica’s product information management (PIM) offering, which our most recent Value Index vendor and product assessment rated Hot.
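The publish-and-subscribe model behind a data hub can be illustrated generically: producers publish records to a topic, and every subscribing system receives its own copy. This sketch shows only the model, not Informatica's actual API; the topic and subscriber names are hypothetical.

```python
# A generic publish-and-subscribe sketch of a data hub. Illustrates the
# model only; a real hub like Informatica's Data Integration Hub adds
# persistence, delivery guarantees and managed subscriptions.

from collections import defaultdict

class DataHub:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a consuming system's handler for a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, record):
        """Deliver the record to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(record)

hub = DataHub()
finance_copy, analytics_copy = [], []
hub.subscribe("customers", finance_copy.append)
hub.subscribe("customers", analytics_copy.append)
hub.publish("customers", {"id": 7, "name": "Acme"})
```

The design point of the model is decoupling: the publishing system does not need to know which or how many systems consume the record.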
The third layer is data intelligence. Here Informatica has added capabilities to organize, infer and recommend action from data and to provision and map data to business needs. In addition, Informatica’s Business Glossary and Metadata Manager help establish consistent definitions and use of data for operational or analytical tasks. Informatica RulePoint, another product that was not mentioned much at the conference, processes events through workflow in a continuous, rule-based manner; depending on how processing occurs, it supports complex event processing or event streaming.
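Continuous rule-based event processing of this kind can be illustrated generically: each rule is a predicate plus an action, and every event is evaluated against the rules as it arrives. The sketch below is illustrative only and does not reflect RulePoint's actual interface; the rule and event fields are hypothetical.

```python
# A generic sketch of continuous rule-based event processing: rules are
# (name, predicate, action) triples applied to each event in the stream.
# Illustrative only; not RulePoint's API.

def run_rules(events, rules):
    """Evaluate every rule against each event; fire actions on matches."""
    fired = []
    for event in events:
        for name, predicate, action in rules:
            if predicate(event):
                action(event)
                fired.append(name)
    return fired

alerts = []
rules = [
    ("large_order",
     lambda e: e.get("amount", 0) > 1000,
     lambda e: alerts.append(f"review order {e['id']}")),
]
events = [{"id": 1, "amount": 500}, {"id": 2, "amount": 2500}]
fired = run_rules(events, rules)
```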
On top of the Intelligent Data Platform, Informatica has added a couple of innovations. Project Springbok, developed in its Innovation division and not yet released, is a tool for preparing data for analytics and operations. This new product will draw on Informatica’s expertise in providing access to and integration of data sources, which according to our information optimization benchmark research is the top analyst requirement in 39 percent of organizations. Despite data warehouse efforts, analysts and business users still have to access many data sources; simplifying information is critical for nearly all organizations that have more than 16 data sources. Demonstrations showed that Springbok can dynamically create and automate the transformations that run in PowerCenter. It also offers access to a master reference to ensure that data is processed consistently. IT professionals gain visibility into what business units are doing, showing how IT can help in provisioning data. Even in beta, Springbok has significant potential to address the range of data issues analysts face and reduce the time they spend on data-related tasks. Our research has shown for several years that this data challenge presses organizations to diversify the tools they use, and software vendors in this market have responded. Informatica will have to compete with more than a dozen others and demonstrate its superiority in integration. Our research finds that lines of business and IT now share responsibility for information availability in 42 percent of organizations, so Informatica will have to demonstrate its value to line-of-business analysts who are evaluating a new generation of tools for data and analytics.
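The kind of preparation step such tools automate, profiling a source to expose inconsistencies and then applying one consistent transformation from a master reference, can be sketched generically. The rules and data below are hypothetical and do not represent how Springbok actually works.

```python
# A generic sketch of analyst-facing data preparation: profile a column to
# expose inconsistent values, then standardize them with one mapping, as a
# master reference would supply. Hypothetical logic, not Springbok's.

def profile(records, column):
    """Report distinct raw values so inconsistencies become visible."""
    return sorted({r[column] for r in records})

def standardize(records, column, mapping):
    """Apply one consistent mapping across all records."""
    return [{**r, column: mapping.get(r[column], r[column])} for r in records]

raw = [{"id": 1, "state": "CA"},
       {"id": 2, "state": "Calif."},
       {"id": 3, "state": "California"}]
distinct = profile(raw, "state")             # three spellings of one state
mapping = {"Calif.": "CA", "California": "CA"}
clean = standardize(raw, "state", mapping)
```

Done by hand in spreadsheets, this profile-then-standardize loop is exactly the data-related work that consumes analysts' time; automating it is the value such tools promise.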
A second innovation, a new data security product called Secure@Source also being developed in the Innovation unit, is designed to protect data assets where they are stored and processed. This product moves Informatica into the information security market segment. Secure@Source helps users discover, detect, assess and protect data assets in their persistent locations and during consumption by applications or Internet services. The question is whether Informatica can convince current customers to examine it or will have to approach information security professionals who are not Informatica users. Security of data is among the top five required data activities according to our research and a key part of the manageability requirements that organizations consider important in evaluating products. Informatica has an opportunity to insert itself into the dialogue in this area if it presents the new product properly to IT and business people alike.
In big data Informatica has made steady progress, but reaching its potential in this segment will require more investment in mixed big data environments, not just Hadoop. As our research has shown for three years, customers want big data to distribute processing and integration of data across sources. Our recent research on big data analytics finds that three out of four organizations (76%) define big data analytics as accessing and analyzing all sources of data. This poses a challenge for data integration, and our new research on big data integration finds that most organizations have a long way to go in accessibility and mastering of data. Informatica has begun to address this and has an opportunity to help develop a new generation of data architecture.
In cloud computing, the company has consolidated its efforts to ensure that the cloud is part of its core technology. It released new versions of its cloud-based integration, quality, master and real-time data management products; these begin to address the challenge of process and application integration, which are important considerations for businesses in determining whether to integrate or replace point cloud solutions to improve the efficiency of tasks and business processes. Informatica has continued to focus on integrating mostly with the large cloud computing providers and has yet to invest in streamlining processes in particular lines of business. This has left openings for other cloud integration providers to compete, making it harder than expected for Informatica to dominate this segment. The next step here is up to Informatica.
I believe that one of the highest-potential opportunities for Informatica is in the application architectures of organizations whose business processes have been distributed across a collection of cloud-based applications that lack interconnectivity and integration. For example, finance departments often have software from different providers for budgeting and planning, consolidation and reporting, accounting and payroll management. When these applications are spread across the cloud, connecting them is a real challenge, let alone trying to get information from sales force automation and customer service applications. The implications of this are shown in our finance analytics research: data-related tasks consume the most time and impede the efficiency of financial processes, as they do in every other line-of-business area we have researched. Similar situations exist in customer-related areas (marketing, sales and customer service) and employee management processes (recruiting, onboarding, performance, compensation and learning). Informatica has made progress with Informatica Cloud Extend for interconnecting tasks across applications, which can help streamline processes. While perhaps not obvious to data integration specialists, this level of process automation and integration is essential to the future of cloud computing. Informatica also announced it will offer master data management in the cloud; this should help it not just place a data hub in the cloud but help companies interoperate separate cloud applications more efficiently.
Overall the Informatica Intelligent Data Platform is a good reference model for tasks related to turning data into information assets. But it could be more distinctive in how its automation accelerates the processing of data and helps specific roles work faster and smarter. The platform does not yet provide a context for enterprise architectures that are stretched between on-premises and various cloud deployments, so organizations will have to determine whether Informatica’s approach fits their future data and architectural needs. As Informatica pushes its platform approach, it must ensure it is seen as a leader in big data integration, helping business analysts with data, supporting a larger number of application sources and connecting cloud computing by unifying business applications. This won’t be easy to accomplish, as Informatica has not been as progressive in the broader approach to big data and its use across operations and analytics.
Informatica has been growing substantially and is approaching US$1 billion in annual software revenue. We have recognized its success by rating it a Hot vendor in our Data Integration Value Index and naming one of its customers, the CIO of UMass Memorial Health Care, the Chief Information Officer in our 2013 Leadership Awards. Informatica continues to invest substantially in R&D. Its acquisitions of data-related software companies have helped it grow, and it has invested to integrate those products with PowerCenter. With almost half (49%) of organizations planning to change their information availability processes, the opportunity for Informatica is significant; its challenge is to gain the confidence and recognition of business customers, who now play a larger role in the selection and purchase of software. This will require Informatica to speak the language of business, addressing not just technology but the business processes for which these customers are held accountable. Informatica is a major player in information management; now it must become as significant a choice for streamlining business processes and the use of applications and data across the enterprise and cloud computing to enable information optimization.
CEO & Chief Research Officer