I was recently involved in a fairly large data mining and business intelligence project (one I probably should not disclose). I could say it was an eye-opener, but I am old enough to be cynical and disillusioned, so it was not a surprise.
First, we had some smart people on the team (shamelessly including myself :-), where “smart” almost by definition means experts in programming, software development, and enough mathematics and statistics, doing the software implementation, data extraction, and statistics. Then there were slightly less smart people, but experts in the domain being studied, who were supposed to make sense of the results and write the report. These people were pulled off the team because they were urgently needed for other projects.
Second, the company bought a very expensive tool for data mining and statistical analysis, and subcontracted another company to extend it with the necessary functionality. The tool did not work as expected, and the subcontracted extension was two months late (they finished it at the time the final report should have been delivered!), buggy, and incompatible with the new version of the tool.
Third, it was quite clear that the report should be bent towards what the customer wanted to hear (that is not to say it would contain fabricated data; just that the interpretations should be more favourable).
So those smart people spent their time 1) working around bugs in the software we were supposed to use, 2) writing ad hoc statistical analysis software to be able to do at least something, 3) analysing data in a domain they were not experts in, and 4) writing the report.
After all this, the report was stellar, the customer extremely satisfied, the results solid, the reasoning compelling.
Had I not been involved, and had I not known how much of the potential had been wasted and on how small a fraction of the data the analysis had been performed, I would have considered the final report a nice example of a clever, honest, top-level business intelligence job.
So this problem is NOT one I'm tackling directly (I'm more asking how they can get smart people like you to build that kludge much more cheaply), but the model does indirectly incentivize better BI tools by creating competition directly in forecasting ability, and not just signaling ability.
(using throwaway account to post this)