Just got off the phone from a Concours teleconference with Tom Davenport, Bob Morison and about 50 other participants. The topic was competing on analytics, which was also the title of a Harvard Business Review article Davenport wrote in January last year, and which forms the basis for a book of the same title coming out soon. Concours is launching a membership program called the Business Analytics Concours (glossy brochure here). The upshot is a revival of and refocus on business analytics, following a number of examples of companies (in particular Harrah's casinos) that have had success due to their ability to relentlessly analyze and optimize their performance, value offerings and market opportunities.
Tom is an interesting character and a prolific writer on knowledge management, process optimization and knowledge worker productivity. He outlined how companies that compete on analytics tend to share certain attributes, such as senior management advocacy, an enterprise approach to analysis (rather than letting a thousand analysts bloom), going first deep and then broad, and paying attention to the development of a strong analytical capability.
My role was to comment and to discuss the technology side: While everyone recognizes that competing on analytics is a question of culture, understanding of the business environment and analytical skill and drive, there is a technology side to it as well. What kind of technologies can enable analysis and optimization, what emerging technologies should IT be monitoring and experimenting with in this space, and how would an enterprise architecture for a company with an analytical bent look different from most companies’ architectures today?
Here are my notes:
- the obvious technologies needed are repositories for data, such as data warehouses and datamarts, as well as business intelligence software for analyzing it
- on a more abstract level, we need technologies that allow for rapid collection (most data already exists in digital form, but it needs to be made analyzable), structuring (hopefully avoiding human intervention in the data cleanup phase, which is costly) and analysis of a wide range of data (which very often turns into experimentation)
- in particular, we need technologies that let people develop models from operational data, and redo structuring and categorization in a dynamic and shared way (did anyone say wiki?)
- a short-term path to better analytics may be search technology, which gives access to unstructured data, allows joining of many sources, and does not require rearchitecting and a massive job of initial categorization and structuring
- a sizeable investment in data preparation will kill the analytical impulse at birth
- long-term, there are interesting possibilities in the kind of data exchange protocols envisioned by Van Jacobson, i.e. a form of networking that makes data location irrelevant and hides the pathways
- lastly: We have to realize that this is a cultural, strategic and managerial issue, and that almost any technology can be used in an analytical way. If you are not inclined to analyze your environment, no amount of technology is going to make you do it. In fact, more technology can distance you from the real world, tempting you to hand people pre-saved spreadsheets and fixed models rather than the ability to analyze
- an ideal would be to have experimental facilities, where things could be sim’ed out, complete with a button labeled "Make it happen".
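To make the search point above concrete: a toy sketch of why search-style access lowers the barrier to analysis. The inverted index below lets you query across several unstructured sources without designing a shared schema or doing upfront categorization; the source names and records are invented for illustration, and any real deployment would of course use a proper search engine rather than this.

```python
from collections import defaultdict

def build_index(sources):
    """Map each lowercased token to the (source, record) pairs containing it."""
    index = defaultdict(set)
    for source, records in sources.items():
        for i, text in enumerate(records):
            for token in text.lower().split():
                index[token].add((source, i))
    return index

def search(index, query):
    """Return (source, record) pairs matching every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]
    return results

# Two hypothetical sources, no shared schema, no cleanup pass.
sources = {
    "crm_notes": ["customer churned after price increase",
                  "loyal customer since 2003"],
    "support_log": ["complaint about price increase", "login issue resolved"],
}
index = build_index(sources)
print(sorted(search(index, "price increase")))
# → [('crm_notes', 0), ('support_log', 0)]
```

The query joins hits from both sources purely on content, which is the attraction: you can start asking questions of the data on day one and defer the expensive structuring work until an analysis proves worth operationalizing.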
This looks like an interesting project, because it goes right to the heart of what companies must get better at in a world where information spreads rapidly, imitation is easy, and you compete on your evolving optimization and innovation capability rather than individual technologies or services.