At the Ogden Police Department in Utah, Chief Jon Greiner recently expanded his staff of crime analysts from one to 11 without hiring a single new officer.
Instead, Greiner equipped his existing force of eight lieutenants and two assistant chiefs with new, easy-to-use, Web-based business intelligence tools. The tools enable the police veterans to combine and manipulate data from arrest records, court documents, probation logs, jurisdictional maps and other sources, identifying patterns and pinpointing hot spots so they can stop crimes before they happen.
“My police officers – who are 30 years younger – are gamers, and I thought that if I could put something user-friendly in their hands, they could do great things as crime analysts,” explains Greiner. Today, the officers are using the new BI tools to perform geographic profiling of crimes and analysis of police data “in seconds,” he says. Before, it could take days for the department’s single crime analyst to fulfill a report request. An added bonus: veteran officers with extensive street experience are now able to apply their firsthand knowledge to crime analysis.
“You have practitioners asking the what-if questions, which has changed the way we police,” Greiner says.
Welcome to Business Intelligence 2.0, a world in which one of BI’s original big promises is finally being met, and a broader class of everyday business users – as opposed to statisticians or data analysts – are tapping into innovative technologies and Web-based BI capabilities. Police officers, physicians, accountants and salespeople are mashing up and analyzing structured and unstructured data from far-flung sources in the ways that make the most contextual sense to them.
“All of these new technologies are about making it easier to build and consume analytical applications,” says Gartner Inc. analyst Kurt Schlegel. Today, he notes, companies frequently cite a lack of both end-user and developer skills as a major barrier when deploying traditional BI applications. Indeed, anecdotal evidence suggests that no more than 20% of users in most organizations use reporting, ad hoc query and online analytical processing tools on a regular basis.
Instead, most companies rely on already overburdened IT departments or in-house teams of BI experts to fulfill users’ requests for reports, analyses and forecasts, a process that can take weeks or longer. Then, when decision-makers finally receive a report, they often discount or distrust it because the data is no longer relevant or timely.
[Image caption: Universal Mind’s SpatialKey heat map shows patterns of crime density to aid officers of the Ogden Police Department.]
However, that’s beginning to radically change, thanks to highly intuitive, easier-to-use Web-based user interfaces and better data management and access schemes, such as service-oriented data architectures, which enable users to mash up data in increasingly standardized formats from a variety of sources.
“We’re seeing mashups with GIS mapping technology as well as on-demand BI solutions that let users combine and display their own data with data from external sources,” says IDC analyst Dan Vesset. “The goal is to get IT out of development [of user interfaces and reports] and get them more involved in data quality and data integrations. That’s their highest value-add.” (See “BI 2.0 means change for IT.”)
“Another very big change is an awareness of BI’s potential at the business management layer in companies,” Vesset notes. “Business is seeing real value in analytics. Many organizations are starting information management groups and BI competency centers that sit on the business side.”
A New Way of Thinking
One example is the Massachusetts Housing Finance Agency. The BI team there has incorporated geographical mapping capabilities, including location intelligence features from Pitney Bowes Inc.’s MapInfo software, into its Cognos BI dashboard as a way to make information accessible in geographical form to users across the entire agency. Before, only 12 superusers had access to geographical tools. Now, all 300 of the agency’s workers can access and manipulate BI data in geographical form, says Carl Richardson, BI project manager.
“We anticipate that more people will do analysis,” says Richardson. “It will allow the average user to think geographically when it comes to data. They could create both thematic and point maps of data that is important to them in their individual reporting groups.”
One example would be to combine data on housing units, loans and public transportation so that it could then be analyzed and displayed in a map format to show how many of the agency’s housing units are located close to public transportation.
“Being able to work with that data in graphic form as opposed to putting that data in a database would allow us to react a lot quicker,” Richardson says.
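The transit-proximity analysis Richardson describes boils down to a spatial join: for each housing unit, check whether any transit stop falls within some radius. A minimal sketch of that calculation, using hypothetical sample coordinates and a half-mile radius (the article does not specify either), might look like this:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3959.0  # approximate Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical records; real data would come from the agency's housing
# and public-transportation data sources.
housing_units = [
    {"id": "H1", "lat": 42.3601, "lon": -71.0589},  # downtown
    {"id": "H2", "lat": 42.5000, "lon": -71.5000},  # outlying
]
transit_stops = [
    {"name": "Park Street", "lat": 42.3564, "lon": -71.0624},
]

def units_near_transit(units, stops, radius_miles=0.5):
    """Count housing units within radius_miles of any transit stop."""
    return sum(
        1 for u in units
        if any(haversine_miles(u["lat"], u["lon"], s["lat"], s["lon"]) <= radius_miles
               for s in stops)
    )

print(units_near_transit(housing_units, transit_stops))
```

The point of the BI mashup is that this logic runs behind a map display: the same per-unit distances that drive the count also drive the thematic or point map a user sees.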
For now, mapping tools are probably the most popular kind of BI mashups, also known as “bashups,” but experts say the possibilities are almost endless. Technologies such as integrated search and in-memory analytics will make it easier to index large amounts of structured data and build high-performance analytical applications against increasingly large data sets. They also promise to empower users to explore data and discover new insights in new ways.
At Excellus BlueCross BlueShield, a health insurer in Rochester, N.Y., enterprise architect Mike Axelrod is using JustSystems Inc.’s XFY software and experimenting with linking claims data and wellness program data so employers can analyze the cost effectiveness of different programs and benefits.
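XFY itself is proprietary, but the claims-to-wellness linkage Axelrod is experimenting with is, at its core, a join of two data sets on a member identifier followed by a per-program rollup. A generic sketch, with entirely hypothetical field names and figures, could look like this:

```python
from collections import defaultdict

# Hypothetical records standing in for extracts from the claims system
# and the wellness-program enrollment system.
claims = [
    {"member_id": "M1", "amount": 1200.0},
    {"member_id": "M1", "amount": 300.0},
    {"member_id": "M2", "amount": 4500.0},
    {"member_id": "M3", "amount": 800.0},
]
wellness = [
    {"member_id": "M1", "program": "gym-subsidy"},
    {"member_id": "M2", "program": "none"},
    {"member_id": "M3", "program": "gym-subsidy"},
]

def avg_cost_per_program(claims, wellness):
    """Join claims to wellness enrollment, then compute average claims
    cost per member for each program."""
    program_of = {w["member_id"]: w["program"] for w in wellness}
    totals, members = defaultdict(float), defaultdict(set)
    for c in claims:
        prog = program_of.get(c["member_id"], "unknown")
        totals[prog] += c["amount"]
        members[prog].add(c["member_id"])
    return {p: totals[p] / len(members[p]) for p in totals}

print(avg_cost_per_program(claims, wellness))
```

An employer comparing the resulting per-program averages could see, at least directionally, whether a benefit correlates with lower claims costs.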
Looking ahead, Axelrod says he foresees a scenario in which people who work out at a gym could have the exercise equipment they use upload data to a Web-based health log. Users could then combine that data with other information to analyze their overall health and progress toward their personal goals.
Mashups, Axelrod says, “solve the old-school problem of data isolation.”
Excellus is also using JustSystems’ mashup technology in its customer-service call centers to display data on a single screen, even though that data may reside in multiple systems – including green-screen and Web-based applications, Axelrod says. JustSystems’ XFY technology can handle multiple pieces of XML data simultaneously on the screen, according to the company.
Using Excellus’ existing service-oriented architecture interface, JustSystems software can retrieve information from a claims application and present that information to a customer-service rep through a browser. If the agent also needs data from a policy application, it would retrieve and display that as well, but the claims information would still be on the screen.
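The underlying pattern is simple: each back-end service returns an XML fragment, and the client composes the fragments into one document so both stay on screen. This sketch is not XFY's actual API; it illustrates the composition step with Python's standard XML library and hypothetical payloads:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payloads, standing in for responses from the claims
# and policy services exposed through the SOA layer.
claims_xml = "<claim><id>C-1001</id><status>paid</status></claim>"
policy_xml = "<policy><id>P-77</id><plan>HMO</plan></policy>"

def compose_view(*xml_fragments):
    """Combine independent XML fragments into a single 'view' document,
    so each source's data remains on screen alongside the others."""
    view = ET.Element("view")
    for frag in xml_fragments:
        view.append(ET.fromstring(frag))
    return ET.tostring(view, encoding="unicode")

combined = compose_view(claims_xml, policy_xml)
print(combined)
```

Because each fragment is appended rather than merged, adding a third source (say, a green-screen application wrapped in XML) is just another argument to the same call.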
Tracking and Accountability
Another user, New York-based Thomson Financial, is using Serena Software Inc.’s MashUp Composer to combine information from the company’s Salesforce.com application and information about its various product offerings so salespeople can produce customized sales proposals for specific customers in about three minutes.
Managers can then track the proposals, including the authorization and extension of product trials that salespeople offer customers, through the sales process.
“From an integration perspective, it’s not overly complicated, but it added a ton of value,” says John Hastings-Kimball, the former vice president of workflow solutions who recently left Thomson to work for Serena.
“Before, there was very little accountability to senior management, and a salesperson could easily extend product trials to customers,” he says. “In some cases, we had customers that were on a trial for more than a year. Now, if a trial gets extended beyond a certain point in time, it sets off a trigger and the salesperson ends up in the VP’s office to explain.”
The upshot of the BI mashup is that Thomson cut the time from a product trial to a customer conversion from an average of 75 days to between 36 and 40 days.
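The accountability trigger Hastings-Kimball describes is essentially a rule over trial records: flag any trial whose total length has passed a threshold. The article doesn't give the threshold or the data model, so both are assumptions in this sketch:

```python
from datetime import date

TRIAL_LIMIT_DAYS = 30  # hypothetical threshold; the article doesn't give one

def flag_overextended_trials(trials, today):
    """Return trials whose elapsed length exceeds the limit, for
    escalation to management."""
    return [
        t for t in trials
        if (min(t["trial_end"], today) - t["trial_start"]).days > TRIAL_LIMIT_DAYS
    ]

# Hypothetical trial records pulled from the CRM side of the mashup.
trials = [
    {"rep": "Alice", "customer": "Acme", "trial_start": date(2008, 1, 1),
     "trial_end": date(2008, 3, 15)},   # extended well past the limit
    {"rep": "Bob", "customer": "Globex", "trial_start": date(2008, 2, 20),
     "trial_end": date(2008, 3, 10)},   # still within the limit
]
flagged = flag_overextended_trials(trials, today=date(2008, 3, 1))
print([t["customer"] for t in flagged])  # → ['Acme']
```

Running such a rule on every sync between the CRM data and the product catalog is what turns a passive report into the kind of trigger that lands a salesperson in the VP's office.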
While the business benefits of user-friendly BI and BI mashups can be great, they aren’t without challenges.
As is the case with traditional BI and data integration as a whole, data quality is paramount, experts say.
“The most important aspect is data quality,” says IDC’s Vesset. “That includes data governance, master data management and all of the related infrastructure that needs to be in place to make sure you have the right data.
“A lot of this has nothing to do with technology. It’s about agreeing on common data definitions and agreeing on exactly what each performance indicator means so you can manage on analytics rather than having lots of one-off products and [BI] projects.”
In larger enterprises, IT itself can be a big roadblock, says Hastings-Kimball. “A lot of the [new and emerging] BI mashup vendors don’t carry a big name. What I ran into at Thomson is that if it didn’t carry the name Siebel, Salesforce.com or SAP, IT didn’t want to hear about it,” he says. “IT wants these big sprawling enterprise applications. If it’s not that, it scares the IT folks.”