
I recently attended the annual SAS analyst summit to hear the latest company, product and customer growth news from the multi-billion-dollar analytics software provider. This global giant continues to grow its business and its solutions for fraud prevention, marketing and risk, letting users apply its analytic and statistical technology in practical business applications. SAS can meet midsized businesses' demand with packaging and pricing that ensure it is not seen as affordable only to Global 2000 companies. SAS' growth in analytics should be no surprise, as our research finds analytics to be the top-ranked priority among technologies for innovating business.

SAS’ largest area of growth is in its business analytics and business intelligence tools. Its new SAS Visual Analytics product appeals to a broad range of business and analyst needs. The upcoming version blends data and visual discovery with powerful analytics. SAS is also addressing usability, the most important technology consideration according to 63 percent of organizations in our research. SAS uses in-memory computing against big data to help meet the advanced needs of organizations. Its eventual intent is to make Visual Analytics the focal point of its business intelligence product direction. SAS Visual Analytics 6.2 is expected to be generally available in the second quarter of this year; a trial of the product is already available. At the event the company demonstrated its capabilities on tablets such as the Apple iPad. After seeing the demo my only recommendation is that SAS provide more collaborative capabilities and ensure that analysts can make observations and annotations, which is a challenge with most of the business intelligence and analytics offerings in the market today.

SAS’ view of big data echoes our view that it is part of a larger portfolio of business technology for storing and loading data. SAS has invested significantly in its high-performance analytics (HPA) architecture, which enables its analytics to run in parallel or embedded within database technology. SAS has focused on efficient processing for applying mathematics and has devised multiple architectural approaches to adapt to existing technology, including Hadoop, and to ensure it can operate in the most efficient manner. Our big data research finds that a third of organizations plan to evaluate and adopt a range of appliances, in-memory and specialized databases and Hadoop in 2013. SAS’ approach is to embrace and integrate with a range of big data approaches.
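
To make the push-processing-to-the-data pattern concrete, here is a minimal sketch using the open source PySpark library against a Hive table on Hadoop. This is not SAS' HPA engine, just a generic illustration of the same idea, and the table and column names (clickstream, customer_id, spend) are hypothetical.

```python
# Generic sketch of the "push computation to the data" pattern: the aggregation
# runs in parallel on the cluster nodes that hold the data, and only the small
# summarized result is returned to the driver. Not SAS HPA; names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("in-cluster-aggregation")
         .enableHiveSupport()
         .getOrCreate())

summary = (spark.table("clickstream")          # hypothetical Hive table on Hadoop
           .groupBy("customer_id")
           .agg(F.count("*").alias("events"),
                F.sum("spend").alias("total_spend")))

summary.show(10)
spark.stop()
```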

For information management, SAS has consolidated the former DataFlux brand into the SAS organization and unified its product offerings. This is an important move, as parallel brands can confuse potential customers, even though everything was already available from SAS. Beneath the marketing is a solid product line that provides more than just data integration, though that is where we rated SAS a Hot Vendor in our 2012 Value Index for Data Integration based on a methodical assessment. Unlike other analyst approaches that scratch the surface with 2×2 assessments, we examine manageability, usability, reliability and other categories that span a range of data-related areas. Not yet as well known for its integration with Hadoop and even SAP HANA, SAS is addressing challenges in big data integration that we are researching in more depth for 2013. SAS’ overall approach to data management aligns well with our information management research. I was impressed with SAS’ support for process and data orchestration and the overall ease of use of the product; it can be used easily by analysts and IT, and it has some great job-monitoring capabilities.
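
As a rough illustration of what process-and-data orchestration with job monitoring involves, the sketch below runs a few dependency-ordered jobs and logs their outcomes. The job names and dependencies are hypothetical, and this toy runner stands in for the kind of production scheduler a real integration product provides.

```python
# Minimal sketch of dependency-ordered job orchestration with basic monitoring.
# Job names and dependencies are hypothetical placeholders.
import logging
from graphlib import TopologicalSorter  # Python 3.9+

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def extract(): logging.info("extracted 10,000 source rows")
def cleanse(): logging.info("standardized addresses, deduplicated records")
def load():    logging.info("loaded warehouse table")

jobs = {"extract": extract, "cleanse": cleanse, "load": load}
dependencies = {"cleanse": {"extract"}, "load": {"cleanse"}}

for name in TopologicalSorter(dependencies).static_order():
    try:
        jobs[name]()
        logging.info("job %s succeeded", name)
    except Exception:
        logging.exception("job %s failed; halting downstream jobs", name)
        break
```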

For the chief marketing officer (CMO), SAS has expanded its Customer Intelligence Suite since my colleague Richard Snow assessed it last year. The expanded capabilities address a broad set of management and operational needs for marketing. SAS provides not just the analytics but also campaign management, real-time decision management and personalization that help ensure the best possible interaction and experience. Though it is not always seen as a key provider of applications for marketing, SAS has been steadily expanding its offerings organically and through acquisitions, and now, with a unified approach and user experience, is ready to strut its depth and sophistication, especially for B2C organizations.

SAS demonstrated a portfolio that engages everyone from the CMO to the analysts, managers and teams responsible for marketing activities that span from strategy and planning to interactions and ensuring great customer experience. An upcoming release expected in Q2 provides a new generation of user experience and integration that I have not seen in other offerings in the market. This sophisticated advancement in customer analytics aligns with my colleague Richard Snow’s view on the next generation of customer analytics that can leverage big data to meet forward-looking needs of organizations.

SAS sees the value in cloud computing and now has its own global hosted technology infrastructure; it can help customers set up a private cloud for its technology. SAS has expanded rapidly around the globe to support the cloud since our last assessment. I especially like its management of users, applications and technology and the ease of working across deployments and upgrades. SAS will soon also provide a platform for assembling applications for a range of business needs. This step forward, expected later in the year, is significant: SAS is not known for fostering the development of applications, but it has had this capability in its portfolio and is now making it simpler and accessible in the cloud. Our research finds that the on-demand model, and even software hosted by the supplier, plays a growing role not just for analytics but for a range of big data and mobile technology needs in more than a third of organizations. For SAS, this capability goes well beyond providing analytics in the cloud and places it in the market with companies such as Salesforce, which provides Force.com as its cloud-based application development environment along with information and analytics applications. SAS continues to expand its OnDemand offerings, which provide easier access to many of the sophisticated solutions in its portfolio.

SAS is also applying analytics to decision-making through a series of advancements in its Decision Management technology. As many organizations realize, the value of analytics lies in using them to take action. This is no easy task, as most analytics and their presentation are not designed for assessing, taking and monitoring actions. SAS has developed a suite of capabilities and tools to help with data preparation, modeling, optimization, workflow and rules, monitoring and reporting, along with supporting case management. After a close look at the product I found it well designed, with an easy-to-use interface, especially the decision flow builder, which business analysts can use to design processes and analytics. I especially like SAS Scenario Manager, which allows side-by-side examination of decisions to determine how to optimize activities. SAS’ full suite of integrated decision management capabilities is expected to be available in the second quarter of this year, and SAS has an aggressive roadmap for continuous improvement. Only IBM has a comparable focus and integrated approach, with a portfolio of decision management tools that span industries. SAS is making a smart step forward and will need to raise the visibility of this offering in its portfolio to ensure it gets the proper level of consideration.
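
For readers who want a concrete feel for the pattern, here is a deliberately simplified Python sketch of a decision flow that layers a business rule on top of a model score and then compares two scenarios side by side. The scoring formula, thresholds and field names are hypothetical illustrations, not SAS' implementation.

```python
# Simplified decision flow: model score + business rule + scenario comparison.
# All numbers and field names below are hypothetical.
from dataclasses import dataclass

@dataclass
class Case:
    amount: float
    prior_defaults: int

def score(case: Case) -> float:
    # Stand-in for a predictive model: higher score means higher risk.
    return min(1.0, 0.1 * case.prior_defaults + case.amount / 100_000)

def decide(case: Case, approve_below: float) -> str:
    # Business rule layered on top of the model score.
    return "approve" if score(case) < approve_below else "refer"

case = Case(amount=25_000, prior_defaults=1)

# Side-by-side scenario comparison: how does the decision change if the
# approval threshold is tightened?
for threshold in (0.5, 0.3):
    print(f"threshold={threshold}: decision={decide(case, threshold)}, score={score(case):.2f}")
```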

SAS also provides software for risk management, governance, risk and compliance (GRC) and fraud to handle the most sophisticated challenges facing organizations and to lower their risk. Our GRC research finds that 79 percent of organizations are looking to identify and manage risks faster, and more than half (59%) need to improve their control environment. I will let my colleague provide further analysis of SAS’ portfolio in this area in the future.

SAS continues to build out its partner ecosystem. It has made strides in expanding into other companies’ technology ecosystems, including Teradata and EMC, and it works with system integrators such as Accenture, Capgemini and Deloitte. SAS had a great customer panel at the analyst event, and while I am under NDA and cannot name the customers, they represented some of the largest brands in the world, and they use SAS to meet a variety of analytics and real-time operational needs.

The company has been steadily advancing, as we found in our Value Index for Business Intelligence last year. However, as SAS is adopted by more analysts, it will face the same issues I cited for business intelligence, which has not adapted well to business needs.

SAS has great potential with its approach to do more than just analyze the past: it can also predict and optimize future business activities through applications and tools that use its analytics backbone. SAS believes its ability to handle proactive, forward-looking analysis on the largest volumes of big data, along with its use of in-memory technology and the ease of trying its software, distinguishes it from other software providers. SAS’ ability to apply mathematics and embed predictive analytics into its offerings makes it a unique application provider. I could not cover all the key advancements in its portfolio here, but anyone who spends a little time examining it will find there is far more to SAS than most realize. The broadening and deepening of its portfolio puts it on the short list of companies to consider for bringing more sophistication and science to business analytics.

Regards,

Mark Smith

CEO & Chief Research Officer

I recently attended the annual Informatica analyst summit to get the latest on that company’s strategy and plans. The data integration provider offers a portfolio of information management software that supports today’s big data and information optimization needs. Informatica is busy making changes in its presentation to the market and its marketing and sales efforts. New executives, including new CMO Marge Breya, are working to communicate what is possible with Informatica’s product portfolio, and it’s more than just data integration.

Big data and cloud computing have placed challenges on IT in its roles as both a facilitator and a provider of governance and compliance with policies and regulations, including access and security. Our governance, risk and compliance research finds that IT compliance costs are increasing at 53 percent of heavily regulated organizations and even at 17 percent of those subject to little or no regulation. CIOs should examine Informatica’s product portfolio to see how to increase efficiency in the access, governance and integration of data in IT systems for more effective business processes, including those that are GRC-related.

Governance over transactional, interaction and analytical systems is a complex task. Late last year I wrote about Informatica’s latest efforts in big data and cloud computing; the company is now shipping its PowerCenter Big Data Edition, which facilitates integration with Hadoop. I have written about how integration with big data is broken today, as organizations struggle not just with Hadoop but also with other big data technologies. Informatica provides tools to parse data so it can be profiled and processed efficiently. For example, Informatica can perform natural language processing to extract entities from text within unstructured data, which can help in a range of tasks, including IT’s need to perform reviews of data.
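
To show the kind of entity extraction described above in the simplest possible terms, here is a sketch that uses the open source spaCy library rather than Informatica's tools; it assumes the small English model has been installed separately (python -m spacy download en_core_web_sm), and the sample text is made up.

```python
# Minimal entity extraction from unstructured text; the extracted labels
# (ORG, MONEY, GPE, DATE) could feed profiling or data quality rules.
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("Acme Corp signed a $2.4 million agreement with Globex GmbH "
        "in Frankfurt on March 12, 2013.")

doc = nlp(text)
for ent in doc.ents:
    print(ent.text, ent.label_)
```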

With its latest tools, Informatica has stepped beyond the Informatica Cloud Winter 2013 release, which started the software down the path of bringing master data management (MDM) and data governance into the cloud. The Cloud Spring 2013 release, expected in April, is about providing enterprise capabilities in the cloud. The new Cloud Data Masking can help secure sensitive or confidential data; our data in the cloud research found that data security is the number one concern, cited by 63 percent of organizations. A data loader for Salesforce makes a bulk read and write license available; I have written about why data plumbing should be Salesforce’s business, as Salesforce has failed to meet customers’ needs in this area.
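
As a generic illustration of the data masking technique (not Informatica's Cloud Data Masking itself), the sketch below deterministically tokenizes sensitive fields so that masked copies of data stay consistent across tables; the field names and salt are placeholders.

```python
# Deterministic masking for non-production data: the same input always maps to
# the same token, preserving joins while hiding real values. Names are placeholders.
import hashlib

SENSITIVE_FIELDS = {"email", "ssn"}

def mask_value(value: str, salt: str = "dev-refresh-2013") -> str:
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]  # shortened token

def mask_record(record: dict) -> dict:
    return {k: mask_value(v) if k in SENSITIVE_FIELDS else v for k, v in record.items()}

print(mask_record({"name": "Pat Jones", "email": "pat@example.com", "ssn": "123-45-6789"}))
```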

Informatica last month acquired Active Endpoints, whose Cloud Extend applies cloud-based workflow services to what would otherwise be just state-based applications, such as Salesforce applications for SFA. Cloud Extend lets managers map out the steps that should be taken in an application and prompts users for action. The product, which is designed for line-of-business users and analysts, can provide value for both business and centralized IT. Informatica is making it more efficient to set up and establish integration across the cloud, and its ability to subset data and support sandbox environments helps its customers reduce costs and the time it takes to get up and running.

Informatica has announced prepackaged integration with NetSuite and Workday applications that operate in the cloud, available in its Cloud Connector Marketplace Mall. This is a welcome step, but Informatica needs to invest further in cloud connectors for the larger group of cloud computing applications in use today; it has many more to address before it reaches critical mass or universal connectivity. The good news is that many software organizations that operate in the cloud, including MicroStrategy, Ultimate Software and Xactly, are embedding Informatica to improve their efficiency with data and support customers’ needs. In its Spring release Informatica will also provide connectivity to Amazon Redshift, Oracle CRM On Demand and Microsoft Dynamics AX. The announced support for Amazon Redshift is important as more organizations look to embrace cloud computing for their data storage and processing needs.
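
For a sense of what Redshift connectivity involves at the lowest level, here is a hedged sketch that bulk-loads a table using the standard PostgreSQL driver and Redshift's COPY command. The cluster endpoint, credentials, bucket, IAM role and table names are all placeholders; a packaged connector would of course hide these details.

```python
# Bulk-load a staging table in Amazon Redshift from files in S3.
# All identifiers and credentials below are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="***")

with conn, conn.cursor() as cur:
    cur.execute("""
        COPY staging.orders
        FROM 's3://example-bucket/orders/2013-04/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS CSV IGNOREHEADER 1;
    """)
conn.close()
```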

At the analyst summit Informatica presented its vision of the future of cloud as an IT-led activity, saying that the days when line of business owned and led cloud efforts are past. In this the company could not be more wrong: business subscription to and use of cloud applications and services continues to grow as business units’ needs increase and they get little to no support from IT. While IT might be getting engaged and starting to leverage this utility of computing, it is in no way leading or controlling what the business is doing. We continue to see this in sales, marketing, customer service, operations, human resources and even finance. In the end, business is held accountable for business processes and outcomes, and I see no research that indicates this will change in the near future. What is needed is a more adaptive environment in which analysts and business can facilitate more interactions through data requests and tasks, not just stewardship and improving the quality of the data that exists, which is only part of the bottleneck.

Informatica also provided more insight into its Virtual Data Machine, which lets Informatica products operate across platforms and environments while remaining insulated from their differences. I expect to hear more from the company on the role this can play in cloud and hosted environments as much as in on-premises environments. Ultimately this technology should be able to support more integration points and partners, as it has done with Teradata; Informatica recently announced further support for Teradata Unified Data Architecture, where it can streamline data integration from within the Informatica Virtual Data Machine to environments like Teradata.

Informatica also continues its strategic partnership with Heiler, which it is in the process of acquiring; the deal is expected to close by year end pending German regulatory review. Since my analysis of the announcement last fall the companies have been working to integrate MDM with product information management (PIM). Informatica has come to recognize that PIM is not MDM; they have different business and IT requirements, but together they can be a valuable combination. This simple position is not generally accepted by the majority of IT analysts, who have led many of the largest software companies down an IT-centric path that our PIM benchmark research has found to be wrong, and which led me to write a perspective on how PIM is for business. Heiler, which we rated Hot in our 2012 Value Index for Product Information Management, combined with Informatica, which was Hot in our 2012 Value Index for Data Integration, might be the next PIM powerhouse.

Informatica continues to expand its portfolio to support a range of real-time operational needs. It recently released a new version of Informatica Ultra Messaging that my colleague Robert Kugel assessed. Beyond the near-real-time features, Informatica’s products can handle complex event processing (CEP) and what we call operational intelligence. Unfortunately, with such a busy product portfolio, Informatica’s CEP and operational intelligence capabilities are rarely marketed and not very well known. Our benchmark research finds that activity or event monitoring is a top priority for 62 percent of organizations, and that is exactly what Informatica PowerCenter offers.
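
As a toy illustration of the event-monitoring pattern behind CEP and operational intelligence, the sketch below watches a stream of events and raises an alert when failures cluster inside a sliding time window; the window, threshold and event shapes are hypothetical.

```python
# Sliding-window event monitor: alert when too many failures land within a window.
from collections import deque

WINDOW_SECONDS = 60
FAILURE_THRESHOLD = 3

recent_failures = deque()

def on_event(timestamp: float, status: str) -> None:
    if status == "FAIL":
        recent_failures.append(timestamp)
    # Drop failures that have aged out of the window.
    while recent_failures and timestamp - recent_failures[0] > WINDOW_SECONDS:
        recent_failures.popleft()
    if len(recent_failures) >= FAILURE_THRESHOLD:
        print(f"ALERT at t={timestamp}: {len(recent_failures)} failures in the last minute")

# Simulated event stream: three failures within a minute trigger the alert.
for t, status in [(0, "OK"), (5, "FAIL"), (20, "FAIL"), (45, "FAIL"), (130, "OK")]:
    on_event(t, status)
```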

I expect to see more big steps forward from Informatica, as it has many still-confidential development initiatives that will continue its expansion as an information-centered software provider. As technology providers such as Informatica come under further pressure to demonstrate business value, we will see a further shift toward what we call information optimization, which in the end is what business needs on a more timely and consistent basis, as I have outlined in our research agenda.

Informatica finds its customers moving toward being stewards of business data, but they need to move further, to supporting analysts’ needs for data to perform analytics. Our latest research finds that 42 percent of organizations are still impeded by data-related tasks that prevent them from handling analytic ones. This has led to the startling reality, found in our latest research into spreadsheets, that spreadsheets are used 74 percent of the time for business intelligence tasks, despite the fact that they produce a high number of errors through manual copy, paste and calculation tasks. The need to remedy data-related problems should help Informatica bridge the data divide between business and IT. Informatica continues to be bullish on its growth opportunities, and it does not have to convince me, as our research for a decade has shown the need for rationalization to improve efficiency and profitability.

Regards,

Mark Smith
CEO & Chief Research Officer
