In recent years line-of-business applications including accounting, human resources, manufacturing, sales and customer service have appeared in the cloud. Cloud-based software as a service (SaaS) has replaced on-premises applications that were previously part of ERP and CRM environments. These applications have helped companies become more efficient but have also introduced interoperability challenges between business processes. Their advantage is that cloud software can be rented, configured and put to use within a day or a week. The disadvantage is that they do not always connect with one another as seamlessly as on-premises suites did, and when they are managed by a third party there is limited connectivity available to integrate them.

Smooth interoperation is critical for business processes that use ERP. The hybrid computing approach to ERP was assessed by my colleague Robert Kugel, who identified the challenges in these early approaches to ERP in the cloud. As on-premises software, these ERP applications were fully connected and integrated, with process and data integration wired together through additional technologies. This configurability of ERP has been a large challenge and has led to failed implementations, as Robert noted. Understanding the complexities of ERP today is key to coming up with a solution.

One of the most fundamental business processes in organizations is tracking an order through to fulfillment. That process needs to be connected without manual intervention for transactional efficiency and to ensure data, analytics and planning are available. In fact, our next-generation finance analytics benchmark research finds that ERP is the second-most important source of data being analyzed, after spreadsheets, and is the first-ranked source in almost one-third (32%) of organizations. In the transition to cloud computing this has become more complex. Applications that support the order-to-fulfillment process, for example, are increasingly being used individually in cloud computing, and many are being mingled with on-premises applications that require integration and automation to avoid manual intervention.

To start, orders are created by sales: opportunities are created and closed, with products and services purchased by contract or electronically. These orders must be placed in an order management system so the record can be used by a multitude of systems and processes. Purchasing and fulfillment details are moved into an invoice as part of a billing application, then entered into accounting and accounts receivable, where they are managed through to payment. Finance consolidates orders into reports, typically in a financial performance management system. The order is then provided to manufacturing or fulfillment applications, which build and deliver products, and is fulfilled through warehouse and distribution management. The customer name and order number are also placed into an application that handles support calls or deploys services.
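
To make those handoffs concrete, here is a minimal sketch in Python of an order record passing from one system to the next; the system names, fields and functions are hypothetical illustrations, not any vendor's API, and the point is simply that every missing handoff is where manual re-entry would otherwise occur.

```python
# Minimal sketch of an order record passing through connected systems.
# All system names, fields and functions are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Order:
    order_number: str
    customer: str
    items: list
    status: str = "created"
    history: list = field(default_factory=list)

def hand_off(order: Order, system: str, new_status: str) -> Order:
    """Record each system that touches the order; a missing handoff is
    where manual re-entry (and errors) would creep in."""
    order.history.append(system)
    order.status = new_status
    return order

order = Order("SO-1001", "Acme Corp", ["widget"])
for system, status in [("order_management", "booked"),
                       ("billing", "invoiced"),
                       ("accounts_receivable", "awaiting_payment"),
                       ("fulfillment", "shipped"),
                       ("customer_service", "active")]:
    order = hand_off(order, system, status)

print(order.order_number, order.status, "->", " -> ".join(order.history))
```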

This is the reality of business today. Many departments and individuals, with different applications, are needed to support orders, fulfillment and service. If these applications are not connected, organizations have to perform manual intervention, re-enter data, or copy and paste, which not only wastes time and resources but can introduce errors. More than half (56%) of organizations do the integration through spreadsheets or exporting data, with custom coding second-most popular (for 39%), according to our business data in the cloud research. (I will leave HR applications and the supporting people component out of this process, though they play a critical role as an enterprise resource supporting the order-to-fulfillment process and are part of many ERP deployments.) In addition, performing any level of business planning is not simple, as data is needed to determine past performance and plan for the future.

There is no simple way to make all this efficient. It has historically been managed by ERP, usually through a single vendor's application suite. Now businesses want it done in the cloud, integrated with on-premises systems or with other cloud applications. Proper attention must be paid to the needs and competencies of the departments and business processes involved. Thus migration and transition to ERP are not simple, nor is building an application architecture that supports processes and data efficiently. Assessment and planning are necessary to ensure that risks are addressed. Switching to a new ERP system in the cloud will still require an application architecture that maintains proper operations across departments and existing applications, whether they are on-premises, in a private cloud, in the public cloud or in a hybrid combination. Integration with sales force automation, customer service and other systems is outside the scope of most cloud ERP deployments, whose vendors have not provided robust integration points for the applications and data.

Robert Kugel writes that ERP must take a giant leap in order to operate in the cloud. I agree. Our firm often gets requests for assistance in finding the right approach to ERP and business process. While midsize companies find ERP in the cloud increasingly attractive, there are significant challenges to adapting and integrating such applications as part of business processes, which many customers overlook in their desire for a cloud-based solution. The majority of cloud ERP vendors have not provided integration and workflow of information from their applications to others in an open and seamless manner, complicating deployments and adding unexpected costs for customers.

ERP suppliers moving from on-premises to cloud computing have acknowledged these complexities. Many of the legacy ERP vendors have struggled to enable application interoperability and to meet the differing management requirements of IT and business. This struggle has resulted in a lack of confidence among organizations wishing to migrate to ERP in the cloud and leads them to wonder whether to consider alternative approaches that integrate individual applications and their data to support business processes. Automation is another major concern. The lack of it across business processes has impeded finance groups in closing the books efficiently; our research on the fast, clean close shows many organizations losing four or more days.

In the meantime, advances in integration technologies that operate in cloud computing environments make it possible to interconnect with on-premises systems. The ability to distribute transactions and data across systems simultaneously makes architecting business processes and workflows a reality. Some organizations are adopting business process management and other integration technologies for this purpose. The new technologies are able to blend into many applications, so that users do not know they are working on applications from different vendors, nor do they need to know. These advances enable an application architecture that interoperates and automates the flow of data between on-premises and cloud computing environments, providing new opportunities to interoperate from ERP or other applications. Many organizations are doing this today, and more than one-third of companies are planning or need to move data from cloud to cloud, cloud to on-premises or on-premises to cloud, according to our business data in the cloud research.
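
As a rough illustration of the pattern these integration technologies enable, the Python sketch below fans a single transaction out to both a cloud application and an on-premises application through a tiny publish/subscribe layer. It is a conceptual sketch only; the bus, topics and handlers are hypothetical and stand in for whatever middleware or BPM tool an organization actually uses.

```python
# Conceptual sketch of an integration layer fanning one transaction out to
# cloud and on-premises applications; endpoints and payloads are hypothetical.
from typing import Callable, Dict, List

class IntegrationBus:
    def __init__(self) -> None:
        self.subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        # Every subscribed application receives the same record, so no one
        # has to re-key the transaction by hand.
        for handler in self.subscribers.get(topic, []):
            handler(payload)

bus = IntegrationBus()
bus.subscribe("order.created", lambda o: print("cloud ERP booked", o["id"]))
bus.subscribe("order.created", lambda o: print("on-premises warehouse queued", o["id"]))
bus.publish("order.created", {"id": "SO-1001", "customer": "Acme Corp"})
```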

Whether a company is in manufacturing or services, it must address these integration complexities to gain efficiency and support growth without adding more resources. While many new cloud ERP providers are taking simpler approaches, from the application interface to how transactions are processed, most have not learned the importance of integration with other systems and the need for accessing and integrating data for transactions and analytics. Maintaining consistency of data across applications and systems is a major obstacle in more than one-fifth (21%) of organizations and a minor obstacle in more than two-fifths (43%), and they find similar difficulties in the complexities of accessing data, according to our business data in the cloud research. In addition, these providers lack effective planning capabilities (which of course is the P in ERP), and their reporting is less than sufficient and often must be complemented by third-party tools.

All this introduces yet more complexity for business and IT in determining how they can move forward with ERP in the cloud and adapt existing and new applications to interoperate. The outlook for ERP in the cloud is thus uncertain. If these vendors do not adapt to the reality of what their customers want (as opposed to what the vendors want their customers to do), it will remain cloudy. By responding to pressure to take an open approach that is easy to integrate, however, ERP providers could see a sunny forecast.

Regards,

Mark Smith

CEO & Chief Research Officer

Teradata continues to expand its information management and analytics technology for big data to meet growing demand. My analysis last year discussed Teradata's approach to big data in the context of its distributed computing and data architecture. I recently got an update on the company's strategy and products at the annual Teradata analyst summit. Our big data analytics research finds that a broad approach to big data is wise: three-quarters of organizations want analytics to access data from all sources, not just one specific to big data. This inclusive philosophy is how Teradata has designed its architectural and technological approach to managing the access, storage and use of data and analytics.

Teradata has advanced its data warehouse appliance and database technologies to unify in-memory and distributed computing with Hadoop, other databases and NoSQL in one architecture; this enables it to move to center stage of the big data market. Teradata Intelligent Memory provides optimal accessibility to data based on usage characteristics for DBAs, analysts and business users consuming data from Teradata's Unified Data Architecture (UDA). Teradata also introduced QueryGrid technology, which virtualizes distributed access to and processing of data across many sources, including the Teradata range of appliances, Teradata Aster technology, Hadoop through its SQL-H, other databases including Oracle's, and languages including SAS, Perl, Python and even R. Teradata can push processing down so that data and analytics, including data from Hadoop, are handled through parallel execution in its UDA. The QueryGrid data virtualization layer can dynamically access data and compute analytics as needed, making it versatile enough to meet a broadening scope of big data needs.
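
For readers less familiar with data virtualization, the short Python sketch below shows the general idea: one logical query is routed to several back ends and the results are merged. This is only a conceptual illustration with stand-in functions; it is not Teradata's API, QueryGrid syntax or an actual federated engine.

```python
# Conceptual illustration of a data virtualization layer: one logical query
# is routed to multiple back ends and the results are combined.
# Not Teradata's API or QueryGrid syntax; back ends here are stand-ins.
from typing import Callable, Dict, List

def query_warehouse(sql: str) -> List[dict]:
    # Stand-in for a relational warehouse; a real layer would run the SQL there.
    return [{"source": "warehouse", "rows": 3}]

def query_hadoop(sql: str) -> List[dict]:
    # Stand-in for a Hadoop or SQL-on-Hadoop back end.
    return [{"source": "hadoop", "rows": 5}]

BACKENDS: Dict[str, Callable[[str], List[dict]]] = {
    "warehouse": query_warehouse,
    "hadoop": query_hadoop,
}

def virtual_query(sql: str, sources: List[str]) -> List[dict]:
    """Fan one logical query out to each requested source and merge the
    results, as a federated or virtualization layer would."""
    results: List[dict] = []
    for name in sources:
        results.extend(BACKENDS[name](sql))
    return results

print(virtual_query("SELECT customer, SUM(amount) FROM orders GROUP BY customer",
                    ["warehouse", "hadoop"]))
```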

Teradata has embraced Hadoop through a strategic relationship with Hortonworks. Its commercial distribution, Teradata Open Distribution for Hadoop (TDH) 2.1, originates from Hortonworks. It recently announced Teradata Portfolio for Hadoop 2, which has many components. There is also a new Teradata Appliance for Hadoop; this fourth-generation machine includes preintegrated and preconfigured software along with the hardware and services. Teradata has integrated Hadoop into its UDA to ensure it is a unified part of the product portfolio, which is essential because Hadoop is still maturing and is not yet ready to operate as a fully managed and scalable environment on its own.

Teradata has also enhanced its existing portfolio of workload-specific appliances. It includes the Integrated Big Data Platform 1700, which handles up to 234 petabytes; the Integrated Data Warehouse 2750, for scalable data warehousing up to 21 petabytes; and the 6750, for balanced active data warehousing. Each appliance is configured for enterprise-class needs, works in a multisystem environment and supports balancing and shifting of workloads with high availability and disaster recovery. They are available in a variety of configurations of disks, arrays and nodes, which makes them well suited for enterprise use. The appliances run version 15 of the Teradata database with Teradata Intelligent Memory and interoperate through integrated workload management. In a virtual data warehouse the appliances can provide maximum compute power, capacity and concurrent-user potential for heavy work such as connecting to Hadoop and Teradata Aster. UDA enables distributed management and operations of workload-specific platforms to use data assets efficiently. Teradata Unity is now more robust in moving and loading data, and Ecosystem Manager now supports monitoring of Aster and Hadoop systems across the entire range of data managed by Teradata.

Teradata is entering the market for legacy SAP applications with Teradata Analytics for SAP, which provides integration and data models across lines of business to use logical data from SAP applications more efficiently. Teradata acquired this product from a small company last year; it uses an approach common among data integration technologies today and can make data readily available through new access points to SAP HANA. The product can help organizations that have not committed to SAP and its technology roadmap, which proposes using SAP HANA to streamline processing of data and analytics from business applications such as CRM and ERP. For others that are moving to SAP, Teradata Analytics for SAP can provide interim support for existing SAP applications.

Teradata continues to advance JavaScript Object Notation (JSON) integration to support document-oriented databases that are schemaless and semistructured. JSON has become a critical tool as more applications need to store and access data efficiently. NoSQL databases have become more popular recently: 25 percent of organizations in our big data analytics research are using them today, 20 percent plan to use them within two years, and another 23 percent are evaluating NoSQL. With this focus Teradata provides its customers application and operational support, beyond supporting data for analytic purposes alone.
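
To show why this matters for application developers, here is a small, generic Python example of keeping semistructured JSON documents alongside relational data. It uses SQLite only because it ships with Python and needs no setup; it illustrates the schemaless pattern the research describes, not Teradata's JSON data type or syntax, and the table and fields are hypothetical.

```python
# Generic illustration of storing semistructured JSON next to relational data.
# SQLite is used only because it ships with Python; this is not Teradata's
# JSON data type or syntax, just the schemaless pattern described above.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, doc TEXT)")

event = {"customer": "Acme Corp", "action": "order.created",
         "details": {"order_number": "SO-1001", "amount": 125.0}}
conn.execute("INSERT INTO events (doc) VALUES (?)", (json.dumps(event),))

# Documents with different shapes can live in the same column; the schema
# is interpreted on read rather than enforced on write.
for (doc,) in conn.execute("SELECT doc FROM events"):
    record = json.loads(doc)
    print(record["customer"], record["details"]["order_number"])
```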

Teradata continues to expand its Aster Discovery Platform to process analytics for discovery and exploration and also advances visualization and interactivity with analytics, which could encroach on partners that provide advanced analytics capabilities such as discovery and exploration. Organizations looking for analytic discovery tools should consider this technology overlap. Teradata provides a broad and integrated big data platform and architecture with advanced resource management to process data and analytics efficiently. In addition it provides archiving, auditing and compliance support for enterprises. It can support a range of data-refining tasks, including fast data landing and staging, lower workload concurrency, and multistructured and file-based data.

Teradata's efforts also extend to what I call big data or data warehouse as a service, which it calls Teradata Cloud. This approach can operate across and be accessed from a multitenant environment, making the portfolio of Teradata, Aster and Hadoop available in what the company calls cloud compute units. It can be used in a variety of cloud computing approaches, including public, private and hybrid, and for backup and discovery needs. Teradata has gained brand-name customers such as BevMo and Netflix, which have been public references for Teradata Cloud. This cloud computing approach eliminates the need to place Teradata appliances in the data center while providing maximum value from the technology. Teradata's advancements in cloud computing come at a perfect time: our information optimization research finds that a quarter of organizations now prefer a cloud computing approach, with eight percent preferring it to be hosted by a supplier in a private cloud.

What makes Teradata's direction unique is moving beyond its own appliances to embrace the enterprise architecture and existing data sources; this makes it more inclusive in access than other big data approaches, such as those from Hadoop providers and in-memory approaches that focus more on themselves than on their customers' actual needs. Data architectures have become more complex, with Hadoop, in-memory, NoSQL and appliances all in the mix. Teradata has gathered this broad range of database technology into a unified approach while integrating its products directly with those of other vendors. This inclusive approach is timely as organizations are changing how they make information available; our information optimization benchmark research finds improving operational efficiency (for 67%) and gaining a competitive advantage (63%) to be the top two reasons for doing so. Teradata's approach to big data helps broaden data architectures, which will help organizations in the long run. If you have not considered Teradata and its UDA and new QueryGrid technologies for your enterprise architecture, I recommend looking at them.

Regards,

Mark Smith

CEO & Chief Research Officer
