Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=228200231
What business leaders desperately want from IT is greater agility. They need systems that can be quickly adapted to support changing requirements--to get the most out of an acquisition, enter a new market, or find new customers. "Agile" has been a hot topic in software development for several years, and now it's the buzz in business intelligence.
You can see the need for agility in some of the most powerful trends shaping BI. The demand for quick access to new insights has spurred development of fast what-if planning tools and in-memory analysis capabilities. It has fueled interest in predictive analytics to get out ahead of emerging demand and risk. You might think mobile delivery, cloud computing, and Google-style querying also would figure heavily in the agility story, but our annual InformationWeek Analytics Business Intelligence and Information Management Survey shows that those are still emerging ideas, far short of widespread implementation.
While most businesspeople want BI agility, its foundation--sound information management--is something they don't want to hear about, much less spend money on. Information management includes data discovery, modeling, integration, and cleansing as well as database optimization--all tough, time-consuming challenges. And just when IT gets its arms around existing information, the company invariably hits the reset button by acquiring another company.
Looking at this year's survey, the wish lists of BI practitioners haven't changed all that much. Fast analysis, rapid deployment, and other attributes associated with agility are all in big demand. But there's a growing realization, or perhaps it's resigned acceptance, that better business intelligence goes hand in hand with the thankless task of better information management.
For the last two years, respondents to our survey have cited several information management-related problems among the top barriers to adopting BI tools company-wide. Data quality problems are cited most often, by 55% in both 2009 and this year, followed by ease-of-use challenges, and integration and compatibility with existing platforms. Among the people directly responsible for information management, the biggest impediments to success are accessing relevant, timely, reliable data (59%); cleansing, deduping, and ensuring consistent data (51%); and integrating data (49%).
Pfizer's Approach Passes M&A Test
Peter Green knows the agility problem well as director of business integration at pharmaceutical giant Pfizer. Green and his team have had a hand in executing Pfizer's acquisitions over the past decade, including blockbuster deals for Warner-Lambert in 2000, Pharmacia in 2003, and Wyeth in 2009. Much of the intellectual property of these acquired companies exists as data, particularly about the drugs in their research pipelines. In each case, that data had to be quickly integrated to take advantage of the investment.
Most drug companies have similar processes and use the same scientific terms to describe the compounds they develop. But each company tends to define its information in slightly different ways, so post-merger integration has tended to be "slow and painful," Green says. The data about a drug in a company's pipeline includes descriptions of the molecules, results of the many tests done, and its stage in the various regulatory processes, such as approval from the U.S. Food and Drug Administration.
Pfizer has learned that integration doesn't have to be a drawn-out process. Green spearheaded the development of Pfizer's OneSource platform, which now integrates R&D data across the company, giving executives detailed visibility into a drug-development pipeline worth billions. Most important, it's a flexible platform that can quickly incorporate new data.
Pfizer's old approach involved what Green calls "heavy-iron integration"--meaning conventional extract, transform, and load (ETL) batch processing into a central data warehouse. But over the last 18 months, the company has moved to a virtualized approach. Instead of lifting data into a data warehouse for analysis, Pfizer uses data federation technology from Composite Software to map directly to the data source, creating a virtual view of the data instead of building a data warehouse.
One big advantage of this approach is that it lets data managers agree on data definitions and metadata and then map, test, and deploy integrated views through Composite's abstraction layer. Virtualizing data does away with copying and merging, yet it provides an integrated view of all mapped data.
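To make the virtual-view idea concrete, here's a minimal sketch of federation using Python's built-in sqlite3 module. Composite's actual technology is proprietary and far more capable; the table and column names here are invented. The point is that one agreed-on view maps two differently named source schemas onto common definitions, with no copying or merging of rows into a warehouse.

```python
import sqlite3

# Two attached databases stand in for two independent source systems.
con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS src_a")
con.execute("ATTACH ':memory:' AS src_b")
con.execute("CREATE TABLE src_a.pipeline (compound TEXT, phase TEXT)")
con.execute("CREATE TABLE src_b.projects (molecule TEXT, stage TEXT)")
con.execute("INSERT INTO src_a.pipeline VALUES ('PF-001', 'Phase II')")
con.execute("INSERT INTO src_b.projects VALUES ('WY-9', 'Phase III')")

# The abstraction layer: a view that maps both sources onto agreed
# column names. Queries run against the sources; nothing is copied.
con.execute("""
    CREATE TEMP VIEW unified_pipeline AS
    SELECT compound, phase FROM src_a.pipeline
    UNION ALL
    SELECT molecule AS compound, stage AS phase FROM src_b.projects
""")
rows = con.execute(
    "SELECT compound, phase FROM unified_pipeline ORDER BY compound"
).fetchall()
print(rows)
```

Changing how a source maps into the integrated picture means redefining the view, not rebuilding ETL jobs and reloading a warehouse.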
Pfizer's approach proved itself in 2009 when the company integrated Wyeth's R&D pipeline data into OneSource less than a week after the deal was finalized. Top executives could then see the pipeline of most-promising drugs, as well as potential overlaps and synergies between the two companies, much more quickly than in the past. With the 2003 Pharmacia deal, it took several months to integrate similar data, Green says.
How? Through virtualization, the hardest parts of integration could be handled ahead of time. Pfizer and Wyeth teams couldn't share actual data before the merger was legally completed, but OneSource let them agree on definitions, reference terms, formats, and exact interpretations of key milestones in their respective drug pipelines. "We knew that the format of the data coming across from Wyeth on day one of the acquisition would fit into our data model, so we could slipstream it into our existing data feeds," Green says.
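The up-front agreement Green describes amounts to a translation layer worked out before any real data changes hands. Here's a toy sketch of that idea; every field name and stage code is hypothetical, and real pharma data models are vastly richer:

```python
# Per-source mappings onto the agreed canonical model, negotiated
# before the merger closes (all names are invented for illustration).
FIELD_MAP = {"molecule_id": "compound", "dev_stage": "phase"}
STAGE_MAP = {"P2": "Phase II", "P3": "Phase III"}

def to_canonical(record: dict) -> dict:
    """Rename fields and normalize vocabulary so incoming records
    slot straight into the existing data model on day one."""
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    out["phase"] = STAGE_MAP.get(out["phase"], out["phase"])
    return out

print(to_canonical({"molecule_id": "WY-9", "dev_stage": "P3"}))
```

Because the mapping was agreed in advance, day-one data needs only this mechanical translation, not months of schema negotiation.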
Having done all the information management heavy lifting, Pfizer built lightweight, Adobe Flex-based data visualizations directly on top of OneSource to view the data. Thus, Green sees little need for the heavy-duty data transformation and reporting infrastructure that BI suites would otherwise provide. That's not to say Pfizer is ditching BI suites entirely. OneSource is limited to sharing R&D insight; finance and marketing, for example, use BI suites including SAP BusinessObjects and IBM Cognos.
But the lightweight approach is used where speed is most vital in the pharma industry--in R&D. It takes years to develop a drug and get it approved, and aggressive generic drug developers are ready to pounce on expiring patents. That gives drug companies a brief opportunity to capitalize on their research investments. Delays can reduce the revenue potential of a single drug by hundreds of millions of dollars.
There are plenty of time-is-money scenarios in most industries, so agile, virtualized integration should have widespread appeal.
Modeling For Speed
While Pfizer has focused on streamlining information management to avoid a data warehouse for its R&D pipeline, many software vendors are focusing on streamlining the creation and ongoing maintenance of conventional data warehouses. That makes sense, given that data marts and data warehouses are used by 70% of our survey respondents.
One of the big complaints about data marts and data warehouses is that they require a long time to gather requirements and then design and deliver a functioning resource. And when new data sources emerge or requirements change, you have to go back and revise the data model, ETL, queries, and so on. All that information management work is done before you get to the downstream BI reports, metrics, and dashboards.
Companies are using automation tools to cut the time and manual work out of the data warehouse development process. Specialized vendors such as Balanced Insight, Kalido, and WhereScape, for example, offer automated modeling environments. These tools can generate data definitions, ETL scripts, SQL queries, and even common online analytical processing cube types and metadata or semantic layers used by the big-name BI platforms. Through faster data warehouse development and easier model changes, the goal is a faster route to new BI reports, dashboards, and applications running on top of those warehouses.
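The generation step these tools perform can be sketched in a few lines: given a machine-readable model, emit the artifacts. This toy example generates DDL from an invented star-schema description; commercial products such as Kalido and WhereScape work from far richer metadata and also emit ETL scripts and semantic layers, but the model-driven principle is the same.

```python
# A toy dimensional model (names and structure are illustrative only).
model = {
    "fact_sales": {
        "grain": ["date_key", "product_key"],          # dimension keys
        "measures": {"amount": "REAL", "qty": "INTEGER"},
    },
}

def generate_ddl(model: dict) -> list[str]:
    """Emit CREATE TABLE statements from the model definition."""
    stmts = []
    for table, spec in model.items():
        cols = [f"{key} INTEGER" for key in spec["grain"]]
        cols += [f"{name} {sqltype}" for name, sqltype in spec["measures"].items()]
        stmts.append(f"CREATE TABLE {table} ({', '.join(cols)})")
    return stmts

for stmt in generate_ddl(model):
    print(stmt)
```

When requirements change, you edit the model and regenerate, rather than hand-revising tables, ETL jobs, and queries one by one.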
BI vendors also have gotten the message about the need for speed and simpler information management. Open source BI vendor Pentaho, for example, introduced an integrated development environment in early 2010 to let analysts have a single place to create and revise ETL scripts, data models, and BI visualizations. The new IBM Cognos 10 modeling environment includes embedded wizards and tips to help analysts get their data models right the first time, to avoid the substandard query performance that comes from poor design.
If information management teams get the models right, BI analysts and power users can employ higher-level metadata layers to define and quickly redefine data dimensions, hierarchies, and views of data sources--without requiring information management pros to change the underlying data models. Oracle this summer introduced a common information model in Oracle Business Intelligence Enterprise Edition 11g to ease metadata modeling across Oracle warehouses, OLAP and BI tools, and applications. SAP's BusinessObjects Universe, due for release in 2011, promises data federation capabilities for easier data access. Metadata and semantic layers are also available from IBM Cognos, Information Builders, Microsoft, MicroStrategy, and SAS Institute. The aim of all these developments is to speed the hard-but-necessary information management work so users can get on with developing new BI reports, metrics, dashboards, and visualizations.
Another approach to rapid data warehouse development is using prebuilt data models specific to industries, such as banking, insurance, and health care. These models are available from prominent vendors including IBM and Oracle, and from modeling specialists, such as Embarcadero.
But beware. A lot of companies embrace agile BI ideas like prebuilt data models for the wrong reasons--because they "don't have the discipline to enforce BI development rigor," says information management guru Jill Dyché of Baseline Consulting.
Also, some of your source data "will need to be wrestled and coerced into the predefined model, and invariably, some data just won't fit," warns Margy Ross, a data warehousing specialist with Kimball Group. The best way to identify and understand requirements--and to ensure that they're reflected in the data model--is to spend time with business users to understand their requirements, Ross says. It's a timeless best practice that fosters grassroots support for data warehousing and BI initiatives.
Advanced Analytics For Agility
At the top of our survey respondents' BI wish lists--for two years running--is advanced analytics, averaging a 3.8 on our scale where 5 is extremely interested. One-third of all respondents rate it extremely interesting. This, too, is an agility-related subject, in that companies are pursuing advanced analytics to be proactive rather than reactive. They want to anticipate demand so they can price products, forecast manufacturing, and plan their supply chains accordingly. They want to anticipate risks and act before they become bad loans or inventory write-offs. They want to anticipate when a customer's about to drop their service.
Auto insurer Infinity Property & Casualty is using advanced analytics to profile claims as they're filed and send them to appropriate specialists, based on whether they're likely to be big or small, involve complications such as collecting from another insurer, and might deserve fraud investigation. It's modeling these traits using predictive analytics from SPSS (acquired by IBM). It took six months to develop the required models and rules to integrate predictions into Infinity's claim system. But now it takes about 48 hours to get a claim to the right specialist, compared with about 40 days before using predictive analytics, says Bill Dibble, Infinity's senior VP of claims.
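Infinity's models are built in SPSS and are far more sophisticated than anything that fits in a few lines, but the triage pattern itself--score a claim as it's filed, then route it to the right queue--can be sketched. Every field name and threshold below is invented for illustration:

```python
# Hypothetical rules-and-score triage of an incoming claim.
def triage(claim: dict) -> str:
    score = 0
    if claim["estimated_loss"] > 25_000:
        score += 2                      # likely a big, complex claim
    if claim.get("other_insurer"):
        score += 1                      # may involve collecting from another insurer
    if claim.get("fraud_flags", 0) >= 2:
        return "fraud_investigation"    # route straight to fraud specialists
    return "complex_desk" if score >= 2 else "fast_track"

print(triage({"estimated_loss": 40_000, "other_insurer": True}))
print(triage({"estimated_loss": 1_200, "fraud_flags": 3}))
```

In production, the score would come from a trained predictive model rather than hand-set thresholds; the payoff is the same--routing decided in hours instead of weeks.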
Speed isn't just about customer satisfaction and cost savings. In stopping fraud, "the evidence gets cold, and it gets harder to prove fraud the longer you wait," Dibble notes. Infinity's success rate in proving fraud has increased from 60% to 87% of suspected cases now that red flags are quickly recognized with the aid of predictive analytics.
Over the last few months, Infinity also has started using text analytics to examine notes and descriptions in claims, which can send up a red flag over keywords such as "medication" that might point to the condition of drivers involved in an accident. Location descriptions can be presented as GPS coordinates and matched against, for example, freeway off-ramps where accidents are frequently faked. The key to all these analyses is spotting potential problems at the start of the process and then channeling them to the right specialists to act on the information.
Mobile BI: The Performance Problem
Despite all the momentum in smartphone sales, our survey respondents don't put mobile delivery anywhere near the top of their companies' BI wish lists. Mobile rates eighth out of the 10 leading-edge BI technologies we asked about, with just 13% saying they're extremely interested.
One reason might be performance. There's a good case to be made that BI suites have become bloated because they try to make up for underlying information management deficiencies. That can cause particular problems when trying to bring BI to mobile devices.
MicroStrategy chief operating officer Sanju Bansal calls poor performance "the dirty secret in BI." Query times can be 20 to 45 seconds, he says, which is too long for a mobile device. MicroStrategy tried to crack that problem with a new platform this year for the iPhone, iPad, and BlackBerry, providing multilevel caching, an in-memory data structure, and a more efficient network interchange to speed up delivery of visualizations without giving up drill-down capabilities.
Other vendors have been highlighting iPhone- and iPad-based mobile BI apps, notably QlikTech with its QlikView for the iPad and Mellmo's Roambi, which is compatible with SAP BusinessObjects, IBM Cognos, Oracle Essbase, and Microsoft Analysis and Reporting Services.
Over at Pfizer, Green hasn't taken on the mobile problem yet. But for moving the R&D pipeline reporting to mobile devices, Pfizer's lightweight OneSource approach has a lot of advantages. Green says the next step for Pfizer is to provide a dozen or so core Adobe Flex-based R&D pipeline data visualizations on the iPhone and iPad. If Flex can't be used, due to Apple limiting certain Adobe technology on the iPhone, he says his team can just as easily develop in Apple Cocoa or Microsoft Silverlight. "Mobile delivery begs for a thin skin on the front end, so this is another argument for a lightweight approach to delivering BI," Green says.
Limited Role For Cloud, Google-Style Search
Another buzz-worthy category that's low on the wish lists of our survey respondents is cloud computing. Software-as-a-service and cloud computing-based BI/analytics rates ninth on our list of 10 BI technologies, with only 11% extremely interested. Companies are most interested in the lower IT support and rapid deployment of cloud options (each cited by about a third of respondents), but there are very high levels of concern about security and privacy (cited by 65%) and data integration (43%).
If there's one wished-for technology on our list that just won't die, it's the ability to do Internet-style querying of both structured and unstructured information--one simple, Google-like interface to explore databases and documents, Web pages, e-mail messages, blogs, and the like. Half of the people in our survey say they're interested or extremely interested in that capability. If users didn't have to deal with anything like SQL, data-source selection, or less-than-intuitive interfaces, the thinking goes, BI could break through and--finally--be used by the masses.
But there isn't that much progress on this front. Back in 2006, BusinessObjects, Cognos, Information Builders, Endeca, and Fast Search & Transfer all had combinations of search and BI, the first three in partnership with Google and its OneBox appliance. Not much came of most of those efforts. But two companies are still vigorously pursuing the dream: Endeca, with its recently introduced Latitude product, and Attivio, founded in 2007 by former Fast employees after the company was acquired by Microsoft.
What about exploiting social networking features--helping people collaborate around BI findings to improve intelligence? That's at an even earlier stage than mobile and search--10th on our list of sought-after features, with just 7% extremely interested. The reality is that when it comes to BI, a lot of companies aren't ready for these Enterprise 2.0 nice-to-haves.
They need a better information management foundation first. The BI projects getting funding are those that are delivering better insight into the big costs and risks to be avoided, and the big sales and profit opportunities to be seized. In a super-competitive and fast-changing market, that usually means BI projects that fit the overarching theme of helping the business become more agile.