TechWeb

4 Technologies That Are Reshaping Business Intelligence

Aug 28, 2009 (08:08 PM EDT)

Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=219500363


Past performance is no guarantee of future results. This investment-prospectus lingo has never been more apt for business in general than in this post-financial-meltdown, pre-recovery economy. Yet now more than ever, top executives, corporate directors, and financial markets want no surprises.

So it's pretty clear why business intelligence initiatives continue to top CIO priorities, as executives from the boardroom on down demand better visibility. The problem is that BI has often fallen short of that ideal, delivering insight into the past but not into up-to-the-moment performance or future prospects.


That's about to change. Next-generation BI has arrived, and three major factors are driving it: the spread of predictive analytics, more real-time performance monitoring, and much faster analysis, thanks to in-memory BI. A fourth factor, software as a service, promises to further alter the BI market by helping companies get these next-generation systems running more quickly.

Predictive analytics is a white-hot growth segment that got hotter with IBM's $1.2 billion deal to buy SPSS, whose software uses algorithms and statistical calculations to spot trends, risks, and opportunities in ways not possible with historical reporting.

Between the extremes of rearview-mirror reporting and advanced predictive analytics lies real-time monitoring. Front-line managers and executives increasingly want to know what's happening right now--as in this second, not yesterday or even 10 minutes ago. This is where stream processing technologies are moving beyond niche industry uses. Real-time monitoring detects events or patterns of events as data streams through transactional systems, networks, or communications buses. Proven on Wall Street and in other data-soaked industries, stream processing technologies deliver subsecond insight that conventional BI can't touch.

Forward-looking and real-time analysis aren't brand-new BI concepts, but in-memory processing is making them more practical. Until next-generation in-memory products emerged, you usually needed pre-built cubes, pre-defined queries, summarized data, and long-running queries for "what if" exploration. All those requirements killed spontaneous exploration. In-memory products, unlike tools that explore historical data on disk, load vast data sets into RAM so people can perform queries in seconds that would take minutes or even hours with conventional tools.

The fourth factor in the next generation of BI addresses another place where speed is needed: the deployment phase. With software-as-a-service options, BI doesn't always require the months-long distraction of building a data warehouse or a new data mart application, something particularly attractive for small IT shops (see story, "SaaS Makes Its Mark In Business Intelligence").

This next generation of BI technology is still evolving and comes with plenty of risk. Prediction typically requires statistical expertise that's scarce and pricey. Real-time monitoring with stream processing technology can be a lifesaver, but only if you can respond as quickly as you detect opportunity or risk. Fast in-memory analysis tools are selling briskly, but they may require companies to pony up for higher-performance 64-bit hardware. And if you're going to expose these powerful BI tools to new users, be mindful of misinterpretation.

Avoid these pitfalls, however, and there's no turning back to guesswork forecasting, weeks-old reports, and glacial querying.




Analytics and predictive capabilities have been around for decades, but interest has mushroomed in recent years thanks in no small measure to the 2007 bestseller Competing On Analytics, by Tom Davenport and Jeanne Harris, with its examples of companies profiting by peering into the future. (The book came a year after InformationWeek published an extensive cover story on the subject -- you can read that in "Businesses Mine Data To Predict What Happens Next".) BI vendors that lacked analytic tools have rushed to integrate them into their BI suites, with SAP BusinessObjects and IBM Cognos cutting integration deals with SPSS. In May, IBM launched an Analytics & Optimization practice, and then last month took the plunge with the SPSS deal.

With less fanfare, interest in analytics has also fueled popularity of the open source R language for statistical analysis, which proponents say is used by more than 250,000 programmers. For example, R serves as the foundation of an RStat predictive analytics module released in June by Information Builders.

One of the first beta customers for RStat is Dealer Services, which wants to use predictive analytics to screen potential customers. The company offers inventory financing for used-car dealerships. Of course, big banks and finance companies have long used statistical and predictive analytics in lending, "but the ratings and scores the credit card companies use have never worked for us," says Dealer Services CIO Chris Brady. "We're working on a model to score used-car dealers when they first apply for a loan."

With General Motors and Chrysler recently shedding thousands of dealers, plenty of former franchisees have become independents and are seeking third-party financing from companies like Dealer Services. Brady hopes her purpose-built model can predict the best loan prospects and eliminate up to 10 of the 15 hours it takes to review an application. If the model sees a sure bet, why have a highly paid credit analyst rubber-stamp every detail?

Surprisingly, Dealer Services already had SPSS software, but the lender uses Information Builders' WebFocus suite. Brady says integration of analytics and the BI environment was crucial. "The SPSS product itself is fine, but we had to pull data out of our transaction systems, reformat it, use the analytic tools to develop the model, and then run batch analysis on yet another server," she explains. With the integration of WebFocus and RStat, "once the model is finished, it's as easy as working with a report." SAP and IBM say they offer similarly tight links between SPSS analytic tools and their BI environments.

Integration also reduces the need for statisticians, who are in short supply and can command starting salaries of $125,000. The idea is that experts develop and deploy models while business users run analyses within a familiar interface and with few data preparation steps.

[Chart: How will you develop advanced analytics expertise?]

Pre-built applications are another option for getting predictive without a huge investment in analytic expertise. Software with built-in models for a specific industry or for a company function like marketing is the fastest-growing segment for SAS, the leader with 33% of the $1.5 billion analytics market in 2008, IDC estimates. The recession has "really put a focus on solving problems like credit risk and market risk in finance, fraud detection in banking, and price optimization in retail," says Jim Davis, chief marketing officer at SAS.

Brady's not so sure about the analytics-for-the-masses approach. She chooses the data dimensions to be considered herself, including dealer size and type, number of locations, payment patterns, histories of bounced checks, and inventory practices. To build the model, she's testing algorithms including neural networks. And models are never done, because they must be revalidated and updated as conditions change. "A savvy business user could play with the tools to test a few variables and hypotheses," Brady says, "but I wouldn't suggest they tackle more sophisticated analysis."
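To make the modeling step concrete, here's a minimal sketch in Python. It uses scikit-learn and logistic regression rather than the RStat/WebFocus stack or the neural networks Brady is testing, and the sample data, field encodings, and fast-track threshold are purely illustrative assumptions, not Dealer Services' actual model; the input fields simply echo the dimensions she describes.

```python
# A minimal, hypothetical dealer-scoring sketch. The input fields echo the
# dimensions Brady describes; the synthetic data, logistic regression model,
# and fast-track threshold are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Each row: [inventory_size, num_locations, avg_days_to_pay, bounced_checks_12mo]
X_train = np.array([
    [120, 3,  8, 0],
    [ 25, 1, 45, 4],
    [ 60, 2, 15, 1],
    [ 10, 1, 60, 6],
    [200, 5,  5, 0],
    [ 40, 1, 30, 2],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = loan performed, 0 = loan went bad

scaler = StandardScaler()
model = LogisticRegression()
model.fit(scaler.fit_transform(X_train), y_train)

# Score a new applicant; a high probability could skip most of the
# 15-hour manual review the article mentions.
applicant = np.array([[80, 2, 12, 0]])
score = model.predict_proba(scaler.transform(applicant))[0, 1]
print(f"Probability of a good loan: {score:.2f}")
print("Fast-track" if score > 0.9 else "Send to credit analyst for full review")
```

In practice such a model would be trained on years of loan outcomes and, as Brady notes, revalidated and updated as conditions change.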

Companies expect to be able to grow their own analytics expertise. Forty-eight percent of companies will do in-house training to groom BI experts and power users, while only 34% have these pros on staff, finds an InformationWeek Analytics/IntelligentEnterprise.com survey (see chart, "How will you develop advanced analytics expertise?").




Monitor And Analyze In Real Time

You hear "real time" a lot from BI vendors, but they seldom mean subsecond or even subminute response. You can use techniques such as trickle feeding or change data capture to get a conventional data warehouse down to subminute latency, but it might be more troublesome and expensive than stream processing alternatives, which are moving outside their finance niche.

Low-latency BI, faster business activity monitoring, and ultra-low-latency complex event processing are all examples of stream processing technologies. They typically include instant alerts so people can react when a particular threshold, event, or pattern is seen. But at these speeds--anywhere from a few seconds for low-latency BI to milliseconds for complex event processing--most companies also need to couple low-latency insight with automated response.
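As a rough illustration of what these tools do conceptually, here's a minimal Python sketch of threshold detection over an event stream: each event is evaluated as it arrives, and an alert fires the moment an error-rate threshold is crossed. The event fields, window size, and threshold are hypothetical; commercial low-latency BI and CEP engines express such rules declaratively and handle far higher volumes.

```python
# A toy sketch of stream-style monitoring: evaluate each event as it arrives
# and alert the instant a threshold is crossed, instead of reporting on it
# tomorrow. Event fields, window size, and threshold are hypothetical.
from collections import deque
import time

def monitor(events, window_size=100, error_rate_threshold=0.05):
    """Keep a sliding window of recent events and yield an alert on error spikes."""
    window = deque(maxlen=window_size)
    for event in events:
        window.append(event)
        if len(window) < window_size:
            continue
        rate = sum(1 for e in window if e["status"] == "error") / len(window)
        if rate > error_rate_threshold:
            yield {"alert": "error-rate threshold crossed",
                   "rate": round(rate, 3),
                   "at": event["timestamp"]}

def feed():
    # Simulated event source; in practice this would be a message bus or socket.
    for i in range(1000):
        yield {"timestamp": time.time(),
               "status": "error" if i % 12 == 0 else "ok"}

for i, alert in enumerate(monitor(feed())):
    print(alert)
    if i >= 2:  # show a few alerts, then stop the demo
        break
```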

At Insurance.com, keeping a high-traffic e-commerce site humming requires real-time monitoring of at least a dozen supporting business systems, from the e-commerce platform and customer-matching algorithms to Web servers and a rate-call engine that gets quotes from insurance carriers. The company built a monitoring application in 2004, but by early 2008 it was coming up short.

The breaking point came when Insurance.com decided to monitor rate calls by state, says Scott Noerr, director of IT services. Upgrading the in-house app to do that meant six to eight weeks' time for three developers.

A build-versus-buy analysis ended in March 2008 with the selection of IBM Cognos Now, a monitoring and dashboard appliance that fits in the low-latency BI category. It met the monitoring need while adding alerting, escalation, and custom-graphics interfaces that the homegrown app lacked. Insurance.com considered IT-specific tools for network monitoring, site monitoring, and performance monitoring, but that would have required a hodgepodge of tools that didn't render a holistic view from one interface. Like most BI products, IBM Cognos Now is designed to tap into a variety of source systems and data types. Insurance.com's deployment took about six weeks and required one full-time-equivalent staffer.

The alerting features were the first big improvement "because we no longer have to watch an interface to discover we have a problem," Noerr says. But the best hope for increasing revenue comes from automation and escalation features added last fall. One application monitors 15 variables to determine call-center agent capacity. When it spots excess capacity, the app automatically adjusts CRM software to push leads to agents more quickly--a great example of real-time insight tied to real-time response.
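The pattern is easier to see in miniature: detection feeding directly into an automated action. The sketch below, in Python, is a hypothetical stand-in; the 15 variables Insurance.com actually monitors and the way its CRM software is adjusted aren't detailed here.

```python
# Hypothetical sketch of detection coupled to automated response: when the
# monitor sees excess agent capacity, it tells the CRM to release leads
# faster. The capacity formula and CRM interface are stand-ins, not
# Insurance.com's actual rules or integration.
class FakeCrm:
    def set_lead_release_interval(self, seconds):
        print(f"CRM lead release interval set to {seconds}s")

def agent_surplus(metrics):
    idle_agents = metrics["agents_logged_in"] - metrics["agents_on_calls"]
    return idle_agents - metrics["leads_waiting_in_queue"]

def adjust_lead_flow(crm, metrics, surplus_threshold=3):
    if agent_surplus(metrics) > surplus_threshold:
        crm.set_lead_release_interval(seconds=10)   # push leads aggressively
    else:
        crm.set_lead_release_interval(seconds=60)   # normal pacing

adjust_lead_flow(FakeCrm(), {"agents_logged_in": 40,
                             "agents_on_calls": 28,
                             "leads_waiting_in_queue": 5})
```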

The second app monitors the customer lead-to-close process and sends an alert to the designated managers if it detects performance glitches. If the condition persists, alerts escalate to higher-level executives.

Complex event processing is a technology that companies are starting to use more broadly to make monitoring more real time. Sprung from the lab projects and custom deployments of fast-trading Wall Street brokerages in the 1990s, commercial off-the-shelf products have taken shape over the last five years. Mainstream uses have emerged in supply chains, shipping and logistics, retail, utilities, and other time-sensitive applications. Shipping giant UPS, for example, not only made stream processing vendor Truviso a company standard, it also invested in the startup.

[Chart: What's most important in BI software?]

UPS decided it needed to replace a legacy application that tracked and did load balancing for as many as 50 million transactions made by visitors to UPS.com, as well as shipping requests through UPS's PC-based WorldShip application. The old system did classic rearview-mirror reporting--it collected server log data each night, and reported on transaction attempts, successes, and failures by servers the next morning. "When problems used to crop up and people would call us to ask 'What do you see?' all we could tell them was 'We'll tell you tomorrow what we see,'" says Jim Saddel, a systems manager at UPS. "Now we can look at the dashboard and see right away whether it's an across-the-board problem or an isolated problem on a specific server."

UPS upgraded its Truviso deployment in April to add e-mail and text-based alerts. When managers see an alert about borderline performance, they can investigate and hopefully prevent an outage.

Lots of vendors talk a good game about moving BI into operational areas like the ones at Insurance.com and UPS. But slow, batch-oriented technologies are too often the norm, and they can't keep pace with decisions that have to be made in the moment. Stream processing technologies promise to make "real time" reports, dashboards, and decision-support applications a reality.




Commit To In-Memory

The third element poised to change BI is the much faster analysis that's possible using in-memory calculations. In-memory tools can quickly slice and dice large data sets without resorting to summarized data, pre-built cubes, or IT-intensive database tuning.

Products such as Spotfire (acquired by Tibco), Applix TM1 (acquired by IBM, now IBM Cognos TM1), and QlikTech's QlikView were pioneers in the category, and in recent months more vendors have joined the in-memory ranks, or laid out plans to do so. Microsoft, for example, plans to add in-memory analysis to next year's release of SQL Server 2008 R2. MicroStrategy added optional in-memory analysis capabilities to its BI suite in January.

The power and appeal of in-memory products have grown in recent years as multicore, multithreaded, and 64-bit server technologies have become more commonplace and affordable. These hardware advances enable in-memory products to analyze the equivalent of multiple data marts or even small data warehouses in RAM. The technology also eliminates, or at least minimizes, the need for extensive data prep and performance tuning by IT. For end users, that means faster self-service BI without waiting in the IT queue.
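The underlying idea can be sketched with nothing more exotic than SQLite's in-memory mode: load the detail rows into RAM once, then answer ad hoc questions directly against them, with no pre-built cubes or summary tables. The data and query below are made up, and SQLite is only a stand-in for the purpose-built in-memory engines discussed here.

```python
# Illustration of the in-memory idea using SQLite's ":memory:" mode as a
# stand-in for purpose-built engines: load detail rows into RAM once, then
# run ad hoc slice-and-dice queries with no cubes or summary tables.
# The sales data below is made up.
import random
import sqlite3

conn = sqlite3.connect(":memory:")  # the whole dataset lives in RAM
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

regions, products = ["NA", "EU", "APAC"], ["beverages", "bakery", "meats"]
rows = [(random.choice(regions), random.choice(products), random.uniform(5, 500))
        for _ in range(100_000)]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# An ad hoc question answered straight from the detail data.
query = ("SELECT region, product, ROUND(SUM(amount), 2) AS total "
         "FROM sales GROUP BY region, product ORDER BY total DESC")
for region, product, total in conn.execute(query):
    print(region, product, total)
```

The difference the article describes is one of scale: 64-bit servers with enough RAM let these engines hold what would otherwise be an entire data mart.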

SAP gave a jolt to in-memory approaches this spring with SAP BusinessObjects Explorer, which blends the Internet-search-style querying of its Polestar interface with the in-memory analysis of SAP's Business Warehouse Accelerator appliance. The product is available with or without the super-charging of in-memory 64-bit technology, but without it, it's an Internet-search-style querying tool. The big handicap: The in-memory version accesses data only in SAP Business Warehouse. An upgrade due this fall is expected to access myriad data sources.

Sara Lee is an Explorer beta-tester turned customer. Having completed a pilot test this spring, the food conglomerate bought the system with expectations that the speed will let it eventually open up BI to many more employees. A lot of people don't use BI now because "every time you ask a question, you can go get yourself a cup of coffee before you'll get an answer," says Vincent Vloemans, director of global information management at Sara Lee. "With this technology, you get answers in a second, and that implies you also start asking questions out of curiosity."

Sara Lee will test Explorer in two areas. First, its continuous improvement/lean group will use it to help optimize processes such as purchase-to-pay and order-to-cash. That requires country-by-country analysis to know which units perform best and worst, and why. "Answering those questions is easier if you can navigate data quickly," Vloemans says.

Second, its finance unit in Europe thinks faster answers will improve its standard BI reporting. "These people are constantly planning and reviewing the business, and they also get a lot of 'what if' questions thrown at them from senior management for which they don't have pre-defined reports," Vloemans says.

If those two deployments go well, he thinks the tool could be exposed company-wide. But that will require security controls and careful thought about the dangers of bad intelligence -- like assuming "sales" is measured the same in each business unit. Warns Vloemans, "That's a BI problem in general, but when you give a powerful tool to more users, you need to be even more mindful about how people will interpret the data."

Your employees want that speed--fast data query and analysis is cited more than any other feature as most important among BI buyers. Real-time insight and prediction fall lower on the list, though that's not surprising given they're unfamiliar capabilities for many BI practitioners. Query and analysis is as old as BI itself, and who doesn't want a faster and easier version of what you already use every day? Don't be lulled, though: While prediction and real-time insight are over-the-horizon capabilities for many, they'll be table stakes within a few short years.

Doug Henschen is editor in chief of IntelligentEnterprise.com.

Continue to the sidebar:
SaaS Makes Its Mark In Business Intelligence