Short answer: when it's an analytic application. How can you tell the difference? Well, that's not always so easy. Most often the answer is dictated by tool limitations, but it can also be driven by the level of detail that is useful at an executive level. Executive information must be quickly scanned, and the point should be readily evident. This group needs to move quickly and optimize the time spent on any given issue. Too much detail and their eyes will glaze over. Not because they don't have the intellectual horsepower (as much as we like to make fun of our executives, how do you think they got there in the first place?), but when you bog them down too deeply in a single subject, they have a hard time concentrating for worrying about all the stuff they have left to do today. So keep it high level and make sure that your visualization of the data makes the point readily apparent.
At the executive level, lots of root-cause analysis is not necessary. That's what they have a staff for. The danger of adding too many dimensions to an executive dashboard is that the deeper (more granular) the data gets, the more familiar you have to be, on a day-to-day basis, with the inner workings of the operations. Otherwise, the data might look scary, or just wrong, if you are not intimately familiar with the nuances of the deep information.
The bottom line is this: when you start your executive dashboard project, everyone will think they want everything. As the pro leading the development effort, it is your job to guide them so that everyone gets the piece of the overall solution that provides the most value for their role within the organization.
Monday, August 17, 2009
Thursday, September 11, 2008
More Than Automation
If you've ever been stuck with creating an ROI for a business intelligence project, you know that the number one justification for a BI project is resource savings as a result of automating routine reports. When it comes to getting your project paid for, don't get me wrong, I understand you've gotta do what you've gotta do. But if you limit your project goals to report automation, you'll be cheating your organization out of some important potential gains. Pushing your project team to think beyond automation to a new level of performance can realize much greater gains than "redeploying" a few data analysts.
For example, if you have a $4 billion a year business, I don't mean to be Captain Obvious here, but that's over $10 million a day. If your organization is like many, it probably takes a few days after the end of the month to compile the reports used for decision-making. At the rate of $10 million a day (I just like saying that), even small, incremental improvements in decision-making can have a huge positive impact. If your project is costing you, say, $1 million, instead of just focusing on automation, think about the kind of changes that could pay for your project.
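To make that math concrete, here is a back-of-the-envelope sketch. The 0.1% improvement rate is purely an assumption for illustration, not a figure from the post:

```python
# Back-of-the-envelope ROI math for faster decision-making.
annual_revenue = 4_000_000_000          # $4 billion a year
daily_revenue = annual_revenue / 365    # a bit under $11 million a day

# Assumption: better, faster decisions improve results by a modest 0.1%.
improvement_rate = 0.001
annual_gain = annual_revenue * improvement_rate   # $4 million a year

project_cost = 1_000_000
payback_years = project_cost / annual_gain        # pays back in about 3 months
```

Even at a tenth of that assumed improvement, the project still pays for itself in a few years, which is the point of looking past head-count savings.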
Ask yourself, or your customer: on what Key Performance Indicators am I judged and/or incented? Next, what metrics feed into those KPIs? Finally, what kinds of decisions do we make as a result of those metrics? Now, let's start to break those metrics down a bit. Over what time period are they of statistical value? In other words, would you make a decision based on a week's worth of data, or do you really need to accumulate a quarter's worth before you can make a change? This is a critical question to ask of each metric to start to recognize opportunities for improvements on speed-to-market decisions. This simple question can be the difference between real competitive advantage and mere automation.
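One way to put numbers behind the "week versus quarter" question is a standard margin-of-error calculation on the metric. This is a hedged sketch; the 2% conversion rate and weekly volumes are made-up assumptions:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% confidence half-width for a proportion p measured over n observations."""
    return z * math.sqrt(p * (1 - p) / n)

# Assumed metric: ~2% conversion rate on ~5,000 observations per week
weekly = margin_of_error(0.02, 5_000)      # roughly +/- 0.4 points
quarterly = margin_of_error(0.02, 65_000)  # roughly +/- 0.1 points
```

If a 0.4-point swing would not change the decision, the weekly number is already good enough to act on, and waiting for the quarter just gives the competition a head start.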
Saturday, August 30, 2008
Speedometers v. Speeding Tickets
Just a quick pop-in to make a note about dashboards. There is a tendency to put a bunch of metrics on one page, maybe spice it up with a couple of charts (green, yellow, and red, of course!), and call it a dashboard. I think it's worthwhile to make a distinction between the concepts of a report and a dashboard.
Say you're driving down the road. You look ahead and spot a State Trooper hiding out in the bushes. You quickly glance down at your speedometer and note that you are exceeding the posted speed limit. You let off on the accelerator a bit, gliding past the Trooper unnoticed. That is the value of your dashboard. You knew how to evaluate the conditions, which indicators were relevant under the current circumstances, and you knew what actions to take to deliver the desired results. You were able to take action because you had the information quickly enough to avoid an undesirable consequence.
A report is a speeding ticket. Your behind is already in court. By the time you get the report (which, by definition, is at the end of a reporting period) the consequence is a foregone conclusion. Please don't call it a dashboard simply because you have before you, on one page, results from a number of different systems.
There is one characteristic of a good corporate dashboard that is an improvement on a real-world dashboard--such as one found in a car or in the cockpit of an airplane--and that is the ability to drill down on a high-level result. I would submit that, without that capability, what you really have is a report based on a shorter time period.
Speaking of time period, to determine whether a metric is appropriate for a dashboard environment, you have to determine at what point the data is relevant. For example, if you have sent out a mailing and the results wildly fluctuate on a daily basis and aren't statistically valid until you have the whole mailing back anyway, it's not useful to have your finger on the pulse of those results. Plus, maybe you can't react to that in the short term, anyway. On the other hand, if you sell Olympic-themed t-shirts and your orders vary with the tally of gold medals won, dialing in your just-in-time manufacturing operation could have a big impact on waste and profitability. So when you're designing your performance analytics schema, focus on decision-making processes where quicker information results in better decision-making and the business is prepared to react more quickly as a result.
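A simple way to operationalize "is this mailing result statistically valid yet" is to check whether the confidence interval on the running response rate has narrowed below the margin that would change your decision. A sketch, with invented numbers and an assumed 0.5-point threshold:

```python
import math

def stable_enough(responses, sends, threshold=0.005, z=1.96):
    """True once the 95% CI half-width on the response rate is below threshold."""
    rate = responses / sends
    half_width = z * math.sqrt(rate * (1 - rate) / sends)
    return half_width < threshold

stable_enough(12, 400)       # False: early returns, still too noisy to act on
stable_enough(900, 30_000)   # True: most of the mailing is back
```

A metric that passes this kind of check early in the period is a dashboard candidate; one that only settles at period-end belongs on a report.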
Friday, April 18, 2008
Getting Past the Silos
The recent theme in books and articles on the subject of business intelligence is that in order to get value from these initiatives, you have to do something with the information: make better decisions, take action, measure your results...lather, rinse, repeat. I'm up on that crowded soapbox and I'll tell you why.
In order for business intelligence to be more than just a smarter way of doing what you're already doing, you have to be able to reach across departmental boundaries (which are imaginary, by the way, as in this humorous Coke commercial where one department tries to sue another) and put the data together in new ways that will give you access to the new insights that can make you more competitive. Everything you're learning about your own department? Your competitors already know all that stuff. You're not getting a leg up that way. But if you can ignore those artificial boundaries and think about things from a customer perspective--like an end-to-end process that cuts across multiple departments, for example--you can find ways to move your business to a new level of competition.
What's difficult about doing this is not necessarily a data issue. Typically each department/division has its own goals that it needs to meet and its own incentive programs, and each department will focus on its goals, which, by the very nature of organizations, will be different than those of another department. There is typically little or no incentive for the managers to allocate resources to activities that will not have a direct impact on their departmental goals.
In previous posts I have stated that data warehousing and business intelligence functions should be centralized with some resources assigned to various internal clients. There's an important reason why the organization needs to be set up this way. I can hear the protests right now: "But, but, but! Our needs are unique!" "Our IT doesn't get it!" "We need to move faster than that!" You know how I know you're saying this? Because that's exactly where I was a few years ago when I started working on our data warehousing project. But once we started working on projects that leveraged the data we had gathered, the light bulb came on as we began to realize the true value of what we had created. Creating a more efficient and elegant way to manage our division's processes with information was our primary aim in the beginning. As we began to accumulate data from various areas, our division executive started coming up with ways to combine data from different systems to get new information. For example, she requested analysis comparing customer satisfaction data with employee satisfaction data to see if there was any correlation. This may not sound groundbreaking, but historically these were two different processes with totally separate data stores and combining the data was something that had never been considered. It doesn't take a rocket scientist to figure out that the two were directly correlated and this information was used to make important management decisions.
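The customer-versus-employee-satisfaction comparison boils down to a simple correlation. Here is a minimal sketch; the scores are invented for illustration, not the actual figures from the project:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical average scores (1-5 scale) for five business units
employee_sat = [3.1, 3.8, 4.2, 2.9, 4.5]
customer_sat = [3.4, 3.9, 4.4, 3.0, 4.6]
r = pearson(employee_sat, customer_sat)  # strongly positive for these made-up numbers
```

The analysis is trivial once the two data stores are joined; the hard part, as the post says, is that nobody had ever thought to join them.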
Another great idea was connecting customer satisfaction data to health data. What we wanted to know was whether we could predict that someone was going to be unhappy simply based on the types of claims they were filing. The answer was amazing and not at all what we thought. Customers who were dissatisfied tended to fall under the category of either having filed very little or a whole, whole lot. Most of our satisfied customers were those who had filed enough claims to have some experience with our customer service department, but not enough to be buried in the avalanche of paperwork that often comes with a catastrophic illness or injury. What this told us was that we needed to approach two very different populations with different processes in order to bring about significant improvement in satisfaction.
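The U-shaped pattern described above, with dissatisfaction at both the low and high ends of claim volume, is easy to surface by bucketing members on claim count and averaging satisfaction per bucket. A sketch over invented data; the bucket boundaries are assumptions:

```python
def claim_bucket(claims):
    """Assumed boundaries: 0-2 claims low, 3-15 medium, 16+ high."""
    if claims <= 2:
        return "low"
    if claims <= 15:
        return "medium"
    return "high"

# Hypothetical (claims_filed, satisfaction_score) pairs
members = [(0, 2.8), (1, 3.0), (8, 4.3), (10, 4.1), (40, 2.5), (55, 2.2)]

scores = {}
for claims, score in members:
    scores.setdefault(claim_bucket(claims), []).append(score)
avg_sat = {b: sum(v) / len(v) for b, v in scores.items()}
# Medium-volume members rate highest; low- and high-volume members lag,
# pointing at two different populations needing two different fixes.
```

Seeing the segment averages side by side is what turns "satisfaction is mixed" into "we have two distinct problems to solve."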
The common thread in both of these scenarios was that these were data sets that had never been connected before and in both cases it took the support of an upper-level executive to make the analysis and subsequent actions take place. So if you're lucky enough to have someone in your upper ranks who "gets it" and asks provocative questions and then does something about the answers, you are on the road from simply faster and more accurate analysis to "competing" on analytics.
Wednesday, April 9, 2008
Web Analytics a Whole New Ball Game
Direct marketing has long been leading the charge in information-based decision-making and this was never more true than it is today as companies large and small find innovative ways to leverage the Internet. In the wake of the news of Yahoo! acquiring IndexTools web analytics in an effort to compete with Google Analytics, I thought I would share another web marketing story I read yesterday. DMReview's article "The New Online Reality at Fox Sports" on their BI channel gives an interesting view into how this multimedia giant is using analytics to leverage its web content on its sites that function as an income stream.
The article discusses Fox's "secret formula" that it uses to measure and tweak its NFL, NBA and MLB products. These metrics are considered critical to running a successful web marketing engine, but the article stresses the value the editors bring to the table to make content decisions initially, the quality of the reporting, and the deep understanding of their markets necessary to know what will be interesting to their viewers. In a highly competitive space like web sports media, analytics support, rather than replace, high-quality decision-making that keeps Fox Sports in the game.
Measuring time spent, where readers come from, where they go, revenue generated, and a host of other web metrics is a daunting task, both in complexity and volume of data. In a sidebar, the article points out that standard BI software is not optimized to manage the massive data points that are tracked across many web pages and that this type of data is not typically fed into traditional data warehouses. Special web analytics packages fit this growing niche. Interestingly, the difficulty with using this data is not cited as a lack of data or technology to access it, but a lack of analysts who understand the data. It seems that web analytics is still uncharted territory to a great extent, awaiting a few cowboys to ride in and rustle the data. (Did I go too far with that analogy?) Similarly, companies are looking for marketing VPs and managers specifically with electronic media expertise.
If you have a website, you need to measure what is going on there, bottom line. Even tiny companies are figuring out how to leverage a web presence to gain market share, generate leads, and create new income streams. To manage these efforts effectively, you need goals, you need metrics (leading indicators and success measurements), and you need to know where your traffic is coming from to focus your marketing efforts...all the same things you need for a traditional marketing program. But as this article points out, different technology is needed for this rapidly changing channel. These days, there's no excuse for a company of any size not to be on the web, and even less excuse not to measure its efforts there. It does cost to be on the web, but even small companies can have some level of analytics through something like Google Analytics, which is, shockingly, free. It seems even analytics are being democratized.
For more information about the evolving field of web analytics, check out The Web Analytics Association.
Monday, April 7, 2008
Data Warehouse and BI Support Organization: Functional or by Client Area?
If your organization is like the one I grew up in, no doubt there is some undulation in organization structure. Maybe things started off in the old "strategic business unit" structure where every business unit had just about every function except finance and HR. Then over time, as your company became larger, went public, felt a budget pinch, or whatever, they decided to go to a more functional organization to eliminate redundancies in areas such as marketing and possibly even some IT functions.
I promised to stay business-focused with this blog and I will. But as business people we are often engaged in the debate over where the data warehouse effort will live. Perhaps historically in your company, anything that smelled like IT is functionally organized in IT and prioritized through a corporate prioritization process. My bet is that these resources are scheduled for the next 3 years and not able to respond to your business needs with the speed you would like. Many times what results from this situation is data efforts, large and small, sprouting up throughout the company, from little Access databases to full-scale data warehouses owned and supported by business areas. Necessity is the mother of invention, but eventually these efforts will capture the attention of IT management and the debate (war) will be on.
I will place a stake in the ground and say the best scenario is something of a compromise. It is best for a data warehousing group to be centralized within IT, but to have a "customer-centric" organization. There is much to be supported: hardware, nightly ETL loads, metadata, end users, security, training, etc. It is impossible to ensure "one version of the truth" with data marts distributed throughout the organization unless they are connected to an overall architecture and it is very difficult to support these functions within a business department or even a business division. While business people may have the vision for how the data will be leveraged, their IT counterparts bring the special knowledge of what is involved in supporting such a major initiative and should be at the table to offer this value.
That said, I think that somewhere between a fully centralized, functional approach and a siloed, isolated data mart is an enterprise-wide effort that is organized to quickly address the immediate needs of business clients. In order for a data warehouse to be supported by the business, it must remain more than relevant; its relevance must be on the order of oxygen for the business users. And in order to remain that relevant, the data warehouse must be able to address business needs swiftly and nimbly.
Exactly what that organization will look like will vary from company to company, but generally speaking, the subject matter experts--managers and analysts in each business area--need some control over priorities for the business intelligence resources. They also need to be very involved as a group in the overall priorities for how the data will be developed at the warehouse level, so they understand how their priorities will fit into the overall plan.
Like I said, how deeply into the process this client-focused organization would go varies by company, but in most cases even down to the ETL development and support level there will be some datasets that are particular to a business client and should be organized to support that particular client, balancing swift attention to business priorities with ensuring that the data does not become siloed. The result is a sort of cross-functional, client-focused team that lives in the IT area, is managed by IT managers, but acts like part of the business customer's team.
Additionally, as much of the report development as can be handed off to business analysts should be. These are usually cheaper resources than the BI developers found in IT and can be directly driven by their manager's priorities. The important thing is to ensure that reports are properly documented.
The advantages for the manager and business analyst are clear. The advantage of an organization like this for IT management is that they get to balance the immediate needs of their business customers (whose support they need) with building the enterprise data warehouse, which is a constantly moving target with new systems coming in, old systems being retired, and new overall strategic initiatives coming into play every day. Finding a balance that takes all these considerations into account requires political savvy and a deep organizational knowledge of what will work organically within your company.
Saturday, April 5, 2008
BI Tools v. Excel
When I think of BI tools, I'm thinking of something specifically built to sit on top of a data warehouse and pull the data in a structured way, like Business Objects(R) or Cognos(R). There are among us "BI Snobs" who think that Microsoft(R) Excel(R) is bad and who would like to stamp out spreadsheet use. These are probably the same people who think Microsoft(R) Access(R) databases need to be stamped out in the corporate environment, but I would tend to agree with them on that point, which is a post for another day.
When I worked on a data warehouse team, part of my job was selling the value of the data warehouse and BI tool we were charged with implementing. But I admit that even I would get to the end of the flexibility with my BI software and resort to converting my report to Excel to complete my analysis. I was a closet Excel lover, because like many other business people, I was raised on Excel and tended to think about analysis in this very conventional, two-dimensional fashion. I'm sure I'm not the first person to write about the pros and cons of Excel v. BI tools, but please indulge me for a minute, since that's what I'm thinking about this morning.
See, I don't believe that there are two types of people in this world: BI (Business Objects, Cognos, or whatever) people and spreadsheet people. I believe it is possible to use both technologies as long as you are smart and aware. The big issue with spreadsheets is that they are very prone to error, accidental or otherwise. Spreadsheets can be used to "spin" the truth and make the results seem different from what they are, to one's benefit. When using your BI tool, however, you are pulling from your warehouse calculations that have (presumably) been previously vetted by a team of business and technical resources and agreed upon by everyone as the "one version of the truth." But BI tools will never be as flexible as spreadsheets, and spreadsheets will never be as powerful as BI tools at handling massive amounts of data and drilling down.
The other problem with spreadsheet use is that once you convert your data out of your BI tool, you lose the ability to drill down to the detail, which eliminates much of the value of your data warehouse. The best thing I can suggest for this is maintaining documentation of the BI report from which your spreadsheet originated, so you can track back to the underlying data when necessary.
Also, as I mentioned before, Excel encourages thinking about analysis in a conventional, two-dimensional approach, whereas a dimensionally-modeled data warehouse has your data organized in a much more efficient, multi-dimensional way. So you may be shortchanging yourself and adding steps by forcing the data into a two-dimensional format. If that doesn't make sense, just ask me and I'll try to explain it more clearly.
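Here is a toy sketch of that difference, with invented data and names. The dimensional shape keeps one row per fact, so slicing by any dimension is trivial; the spreadsheet crosstab fixes two dimensions as rows and columns, and every additional dimension means more manual restructuring:

```python
# Invented example: the same sales data held dimensionally (one row per
# fact, dimensions explicit) vs. flattened into a 2-D crosstab the way
# a spreadsheet would hold it.

# Dimensional shape: (region, quarter) are dimensions, sales is the measure.
facts = [
    ("East", "Q1", 100), ("East", "Q2", 120),
    ("West", "Q1", 90),  ("West", "Q2", 150),
]

# Slicing by any dimension is a single pass over the facts:
east_total = sum(s for region, quarter, s in facts if region == "East")

# The 2-D spreadsheet view fixes region as rows and quarter as columns:
crosstab = {}
for region, quarter, sales in facts:
    crosstab.setdefault(region, {})[quarter] = sales

# Adding a third dimension (say, product line) now means extra sheets or
# extra column blocks -- the manual steps the warehouse spares you.
print(east_total)              # 220
print(crosstab["West"]["Q2"])  # 150
```

The point is only that the flattened view locks you into one pair of dimensions per layout, while the dimensional view answers any slice with the same simple query.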
I may be stating the obvious, but I don't believe BI tools will ever be successful at stamping out spreadsheets, nor should they be. As for the accidental errors that are so easy to commit in spreadsheets, the best defense is to build in plenty of checking calculations (usually calculations that should equal zero) and to be thorough about going back over your formulas to make sure they are accurate. Run BI reports to compare against your spreadsheets and make sure they pass the "sniff" test. Once you feel good about the integrity of your spreadsheet, lock the cells that should not be manually entered. I also visually indicate which cells contain manual entries, which makes very short work of figuring out where errors may be once I have built my model. Use both technologies, but be smart and thorough about how you use them, and there will still be a place for spreadsheets in the world of BI.
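The "checking calculation" idea can be sketched in a few lines. This is a minimal illustration with invented numbers: a total pulled from the BI report is compared against the detail rows keyed into the spreadsheet, and the check value should come out to zero (within a small rounding tolerance):

```python
# Invented figures: a total from a vetted BI report vs. the detail rows
# re-keyed in a spreadsheet. The check cell should equal zero.

bi_report_total = 1482.50                     # total from the BI report
spreadsheet_rows = [500.00, 482.50, 500.00]   # detail rows in the spreadsheet

# The checking calculation: zero means the spreadsheet ties out to the
# warehouse. A small tolerance guards against floating-point rounding.
check = bi_report_total - sum(spreadsheet_rows)

if abs(check) < 0.005:
    print("check = 0: spreadsheet ties out to the BI report")
else:
    print(f"sniff test failed: spreadsheet is off by {check:.2f}")
```

In an actual workbook this would be a formula in a dedicated check cell rather than a script, but the principle is the same: compute the same number two independent ways and alarm on any difference.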