Category Archives: CIO corner

MIT CISR Research Briefing on Enterprise Search

Last year I had the pleasure of spending time as a visiting scholar at the MIT Center for Information Systems Research, and here is one of the results, now available from CISR’s website (free, but you have to register – absolutely worth it, lots of great material):

Research Briefing: Making Enterprise Search Work: From Simple Search Box to Big Data Navigation; Andersen; Nov 15, 2012

Most executives believe that their companies would perform better if “we only knew what we knew.” One path to such an objective is enhanced enterprise search. In this month’s briefing, “Making Enterprise Search Work: From Simple Search Box to Big Data Navigation,” Visiting Scholar Espen Andersen highlights findings from his research on enterprise search. He notes that enterprise search plays a different role from general web or site-specific searches and it comes with its own unique set of challenges – most notably privacy. But companies trying to capitalize on their knowledge will invariably find search an essential tool. Espen offers practical advice on how to develop a more powerful search capability in your firm.

Truth, time, context, and computation

A reference to Jeanne Ross’ exhortation to companies to find one agreed – or rather, declared – source of truth got me thinking this morning. Jeanne’s point is that in order to get organizations to start discussing solutions rather than bickering over descriptions, it is better to declare one version of the truth to be the real one. If there are inaccuracies in the source of the data, people can then do something about making it more precise, an exercise that in most cases is much more fruitful than trying to suggest alternative numbers.

I very much agree with Jeanne in the main of this statement (probably a smart move, given that I am her guest at MIT CISR this year), as well as the need for it in many organizations. But it got me thinking – what is the truth, and how has what we consider to be the truth been influenced by advances in computation? With Big Data increasingly available, we can now analyze our way to most things. How does this change our concept of what is truth? Moreover, at what level should a CIO declare the one source of truth?

Truth as a function of time and context

I remember a conversation sometime in the nineties with colleagues Richard Pawson and Paul Turton at CSC – the discussion was on how object orientation changed the nature of systems, from being a computationally limited representation (a function, if you will) to being a simulation of the organization. We saw three stages in this evolution:

First, truth as a stored value. The example we thought of was inventory level – what is the inventory level for a certain product? In a world with limited computing resources, the simplest way to have this number would be to periodically calculate it, and then store it so people can have access to it. When you go to IKEA’s website to search for a nice and cheap office chair (such as the pictured Verner), for instance, they will give you an estimated number in the store closest to you. I don’t know how IKEA calculates that number, but I doubt that they dip into the local POS system of each store to check it precisely each time you query. (If they do, more power to them.) If this number is calculated on an intermittent basis, it will of course be rather imprecise – but it is computationally easy to get to. Similarly, if you ask Google about the distance to the moon, it will come back with documents that have that number in them, generally agreeing on an average of 384,403 km (238,857 miles). However, that is an approximation, since the moon can be as near as 363,104 km (225,622 miles) and as far as 405,696 km (252,088 miles) depending on where it is in its elliptical orbit.
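The stored-value stage is easy to sketch in code. Below is a minimal, hypothetical Python cache in the IKEA spirit: an expensive count (say, a query against the store’s POS system) is recomputed only when the stored number is older than some time-to-live, so queries are cheap but possibly stale. All names and the TTL are illustrative, not anything IKEA actually does.

```python
import time

class CachedInventory:
    """Truth as a stored value: recompute an expensive count on a
    schedule and serve the stored number in between (a sketch)."""

    def __init__(self, count_fn, ttl_seconds=3600):
        self.count_fn = count_fn  # the expensive "real" count, e.g. a POS query
        self.ttl = ttl_seconds
        self._value = None
        self._stamp = 0.0

    def level(self, now=None):
        now = time.time() if now is None else now
        if self._value is None or now - self._stamp > self.ttl:
            self._value = self.count_fn()  # periodic recalculation
            self._stamp = now
        return self._value  # cheap to serve, but possibly stale
```

Between recalculations, the stored number simply does not notice that chairs are walking out the door – which is exactly the imprecision described above.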

I suspect much of the discussion in most corporations over which numbers are right is about these kinds of numbers – calculated after the fact, subject to interpretation because we just don’t know what the precise situation is, and very often we do not know how we got to that number.

However, computation comes to the rescue – with more powerful computers, sensors and faster networks, we can actually move to the second stage: Truth as a calculated number.

For the distance to the moon example, the simple answer is Wolfram Alpha, the mathematical search engine, which will give you the calculated distance to the moon at the time of the query. For the IKEA example, this would mean calculating the number of Verner chairs in the store each time a customer asks on the web. This can be done with varying levels of precision. The simplest way would be to get it from the POS system, which records when a chair is purchased and can subtract it from the inventory. A more precise method, given the length of IKEA’s checkout lines, would be to put a sensor on the chair and track when it is taken off the shelf and placed in the customer’s cart. Precision is largely a question of how much you are willing to spend. For a physical store, tracking cart volumes is expensive; for an online store it is, in theory, cheap, since a customer moving an item from inventory to cart does so digitally.
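The calculated-number stage, by contrast, derives the level from the underlying events at query time. A hypothetical sketch: fold a log of delivery and sale events into a current count, optionally “as of” a point in time. The event shape (sku/kind/qty/ts) is invented for illustration.

```python
# Truth as a calculated number: derive the inventory level from the
# event log at query time instead of storing it. The event fields
# (sku, kind, qty, ts) are invented for this sketch.
def inventory_level(events, sku, as_of=None):
    """Fold delivery and sale events for one SKU into a current count,
    optionally counting only events up to a point in time."""
    level = 0
    for e in events:
        if e["sku"] != sku:
            continue
        if as_of is not None and e["ts"] > as_of:
            continue
        level += e["qty"] if e["kind"] == "delivery" else -e["qty"]
    return level
```

The number is always as fresh as the event log – the precision question moves from “how often do we recalculate?” to “how quickly do events reach the log?”.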

This kind of number is much closer to the truth, and much more operationally useful – and the job of the CIO is to declare how this number should be found, tracked and displayed. It may seem somewhat simple to say this, but this is where there should be no question of the source of the truth – every company should have one and only one, and much of the work of CIOs and their organizations in the last 10-15 years has been in moving companies along until they are capable of calculating the one true number.

Then, we move to the next (and so far last) stage: Truth as a calculated number in context. Context gets more difficult as the need for precision goes up (which, I suppose, blatantly ignoring the quantum-mechanical context, is a sort of business version of the Heisenberg uncertainty principle).

For the distance to the moon example there is little room for context. You could argue that it would be different based on where on Earth you are, or on what you are going to do with the information (launching a satellite or calibrating your telescope, for instance), but for most uses there is little need for contextual customization.

For the IKEA example, the situation is rather different for different parts of the organization, and for different types of customers. If I am a customer looking up the number from my smartphone while close to the store, the POS number might be OK, since I would get to the product in a short time and the consequences of imprecision would be small. If the nearest IKEA store is several hours’ drive away, I might want a different number, one that incorporates not just the current situation but also the likelihood that the number will be zero before I get there. Or I might want a reservation function, either setting the product aside or at least allowing me to report that I aim to buy one within the next x hours and thus would like the number shown as available to be reduced until I can make it to the store. In an online store, the problem is the diametric opposite – there, customers can have carts sitting for days, and it becomes an operational necessity to have some policy declaring at what point the products in the cart will have to be made available to other customers.
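The reservation idea can be made concrete with a small, hypothetical policy: available stock is on-hand stock minus the reservations that have not yet expired, so abandoned carts release their items back to other customers automatically. The two-hour TTL is an arbitrary illustration, not a recommendation.

```python
from datetime import datetime, timedelta

RESERVATION_TTL = timedelta(hours=2)  # hypothetical policy knob

def available(on_hand, reservations, now):
    """On-hand stock minus reservations younger than the TTL.
    reservations is a list of (placed_at, qty) pairs; expired
    carts silently release their items to other customers."""
    held = sum(qty for placed_at, qty in reservations
               if now - placed_at < RESERVATION_TTL)
    return on_hand - held
```

The interesting part is not the arithmetic but the declaration: someone has to own the TTL, because that one number encodes the trade-off between the customer in the parking lot and the customer three hours away.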

Similarly, the very concept of inventory level itself means different things to different parts of the organization. For a store manager, it is a cost concept, something to be optimized in a balancing act between capital costs and stock-outs. For a supply chain manager, it is also a flow concept, something to be optimized between stores. For someone managing the physical space of the warehouse, it is a physical concept – goods that have been sold to a customer but not yet picked up are very much something you need to manage. And for a salesperson, inventory level is an availability concept, often subject to negotiations and transfers within the organization.

So, what is a CIO to do?

I think the declaration of a source of truth is a question of hitting the right level, navigating between the simplicity of simple numbers and the complexity of inferred context. In most cases, I suspect, the optimum lies in providing the ability to find the truth, giving customers (i.e., the customers of the IT organization) their numbers at the source – which should be the one, declared one – but also giving them the tools to interpret them in light of their own context.

The key here is not to try to move from the first phase to the third, skipping the second. Unfortunately, in my view, many IT organizations have done just that, responding to requests for customized reports, systems and views from archival rather than current, operational data. As each number becomes institutionalized through use within its context, transitioning to a declared truth can become an exercise in power rather than rationality. Better to promise context after speed and precision have been provided – and better still, provide the context in a format the end consumer can relate to within their own context.

For IKEA, that might mean giving me the number of chairs available plus a prediction (based on history and, say, the number of cars in IKEA’s parking lot) of how many chairs are likely to be sold, with variance, within the next x hours. For the rest of the organization, well – it depends. But once you provide real-time access to well-defined operational data, you can safely leave the question of what it depends on to the person wanting to use it.
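As an illustration of what such a prediction might look like in its most naive form: take historical hourly sales, and report the expected demand and its spread over the next x hours. This assumes, purely for the sketch, that hours are independent, so means add and variances add; a real forecast would account for time of day, season, and that parking lot.

```python
import statistics

def expected_sales(history, hours):
    """Expected demand and its standard deviation over the next
    `hours`, from a list of historical hourly sales counts.
    Naive: assumes hours are i.i.d., so variances simply add."""
    per_hour_mean = statistics.mean(history)
    per_hour_var = statistics.pvariance(history)
    return per_hour_mean * hours, (per_hour_var * hours) ** 0.5
```

A consumer of the number can then decide for themselves whether “8 on the shelf, 12 ± 1.4 likely to sell before you arrive” means the trip is worth it.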

Cases of Norwegian IT

Normally when I teach technology strategy (GRA6821), a term paper is part of the course evaluation. The students typically write about some technology, a technology company, or somesuch, normally in groups of three or fewer.

This year, things will be a little different. I am part of a research project called A Knowledge-based Norway, where the idea is to investigate various industries in Norway in terms of their knowledge generation – and, by extension, their technology evolution. As part of this project, we will write case studies on various companies, and that is precisely what the students will do. However, rather than having the students choose the companies themselves, we will provide a list of companies, allowing the students, in pairs, to choose a company to write about. We will, of course, entertain suggestions as to which companies to have on this list. Here is a start:

Large IT service companies:

  • Accenture (evolution, role of the Norwegian organization internationally)
  • Atea (evolution, mergers, change in role over time)
  • EDB Business Partner/Ergo Group (these companies are about to merge; topics are evolution, mergers, change in role)
  • IBM Norway (evolution, role of the Norwegian organization internationally)
  • Cap Gemini (large consulting company)
  • ?

Innovative technology companies/research groups

  • FAST/Microsoft Enterprise Search division (evolution, merger, technology impact)
  • Simula Research Laboratory (strong research group sprung out of the University of Oslo)
  • Trolltech (advanced technical programming company acquired by Nokia)
  • Opera (multi-platform browser company, still independent with a growing Asian market)
  • Tandberg (videoconferencing technology company, acquired by Cisco)

Academic/research institutions

  • Institute for Informatics, University of Oslo (grossly expanded technology program, new building)
  • NTNU (Norwegian University of Science and Technology, Trondheim) (birthplace of many companies)
  • Sintef (research arm of NTNU)
  • Norsk regnesentral
  • University college in Grimstad (cluster anchor for an interesting little technology area)
  • ?

Software companies focused on the Norwegian or Nordic market

  • Powel (software company focusing on applications for the energy industry)
  • Mamut (personal/SMB company accounting and tax preparation software)
  • Visma (amalgamated vertical ERP company, successful integration story)
  • SuperOffice (sales support software)
  • ?

Large and important IT projects and IT users

  • Telenor (architecture integration project, globalization of services)
  • DNB Nor (largest Norwegian bank, competes on technology platform and services)
  • Norwegian Tax Authority (pioneer in using digital technology to make tax services easier for the individual citizen)
  • (innovative generalized public interaction platform)
  • ?

Interesting startups/rapidly growing companies/interesting stuff

  • Integrasco (blog sentiment analysis, built on top of Amazon’s cloud platform)
  • Meltwater (global media search company, keeping a low profile)
  • EVO Fitness (health club without visible employees – based on remote monitoring and SMS transactions)
  • QuestBack (Internet-based survey company, now expanding outside Norway)
  • ?

This list will grow as I get new ideas – suggestions are welcome! (And yes, perhaps there is an idea to have something about spectacular computer failures as well…)

The skinny on the economic effects of IT

Wired for Innovation: How Information Technology is Reshaping the Economy by Erik Brynjolfsson

My rating: 4 of 5 stars. Erik Brynjolfsson took a look at the IT productivity paradox in the early 90s and decided to sort it out – and he did, by and large, by collecting prodigious amounts of data and tirelessly analyzing them to tease out what everybody suspected but could not show empirically: that information technology contributes enormously to increases in productivity, innovation and welfare.

This short and to-the-point book gives an excellent overview of and guide to the research on the economic effects of information technology. Each chapter has pointers to further reading and good examples, and concludes with avenues for further research. I will use this as an assignment for my technology strategy students – rather than giving them a few articles, they might as well read the whole book.

(Also available through Google Booksearch. Full notes below the fold.)


GRA6821 Tenth lecture: Project disasters and IT service delivery

This lecture will deal with issues that are (at least superficially) boring but eminently practical: how to avoid systems disasters, and how to deliver IT services within a company. The system disaster we will talk about is CONFIRM, an ambitious project to try to replicate the success of the SABRE reservation system in the hotel and rental car industries. In my experience, it is a real career helper to a) be able to recognize when a project is beginning to acquire a whiff of disasterhood, and b) understand how IT services are provided inside large companies, whether you want to work there or sell your services to either the company or the IT department. The latter you can learn in two hours in a classroom or in two years in a company.

Read and be prepared to discuss:

  • "The Collapse of CONFIRM: What went wrong?", p. 534 in Laudon & Laudon: Management Information Systems, Fourth edition, Prentice-Hall, 1996
    (You might want to go back and revisit Max Hopper’s article on "Rattling SABRE", note the role of CONFIRM in it)
  • Oz, E. (1994). When Professional Standards are Lax: The CONFIRM Failure and its Lessons. Communications of the ACM, 37(10), 29-36.
  • Various other notes, see Blackboard.
  • Langewiesche, W. (1998). "The Lessons of ValuJet 592." Atlantic Monthly (March).
  • The Concours Group (2004): Service-centric IT (in Blackboard). A consulting report on how to organize an IT department.

Recommended literature:

  • Weill, P. and R. Woodham (2002). Don’t Just Lead, Govern: Implementing Effective IT Governance. Cambridge, Massachusetts, MIT Center for Information Systems Research. Available (number 326) from CISR’s paper web page
  • Weill, P. and S. Aral (2004). IT Savvy Pays Off. Cambridge, Massachusetts, MIT Center for Information Systems Research. Available (number 353) from CISR’s paper web page
  • Weill, P. and M. Broadbent (1998). Leveraging the New Infrastructure. Boston, MA, Harvard Business School Press. Good book on IT management.

Study question:

  • Looking at the CONFIRM disaster – what were the technical reasons for the failure, the organizational (management) reasons, and the strategic (business) reasons?
  • How is running an IT shop (inside an organization), an outsourcing company, and an IT consulting company different – and similar – to each other?

End user computing as vision and reality

My esteemed colleague and similarly jaded visionary Vaughan Merlyn has written a rousing call for a new vision for the IT organization. While I agree with everything he says – in principle – I think we still have a long way to go before the nitty-gritty part of IT has moved from the server room to the cloud, leaving the users to creatively combine and automate their world in cooperation with their colleagues, customers and suppliers. And while I agree that the IT organization is better served by helping users help themselves than by doing their work for them, I am not sure all the users out there are ready to fish for themselves yet, no matter how easy to use the search engines, social communities and systems implementation tools become.

The enabling vision is not a new thing. I remember a video (or, rather, a film) from IBM from the mid-80s about End User Computing – a notion that the role of IT was to provide the tools for end users, who could then build their own systems rather than wait for IT to build them. (This, incidentally, was also the motivation behind COBOL in the sixties: the language was supposedly so intuitive that end users would be able to describe the processes they wanted automated directly into the computer, obviating the need for anyone in a white coat.) The movie showed an end user (for some reason a very short man in a suit) sitting in front of a 3270 terminal running VM/CMS. Next to him was a friendly person from the EUC group explaining how to use the friendly terminal, which towered over the slightly intimidated-looking end user like the ventilation shaft of an ocean liner.

It didn’t look very convincing to me. One reason for this was that at that time I was teaching (reasonably smart) business students how to do statistical analysis on an IBM 4381 and knew that many of them could not even operate the terminal, which had a tendency to jump between the various layers of the operating system and also had a mysterious button called SysRq, which still lingers, appendix-like, on the standard PC keyboard. Very few of those students were able to do much programming – but they were pretty good at filling in the blanks in programs someone already had written for them.

Of course, we now have gesture interfaces, endless storage, personal battery-powered devices and constant communication. But as the technology gets better, we cede more and more responsibility for how things work to the computer, meaning that we can use it until it breaks down (which it does) at which point we have no idea how things work. This is not the technology’s fault – it often contains everything you need to know to understand it rather than just use it. Take the wonderful new “computational search engine” Wolfram Alpha, for example: It can give you all kinds of answers to numerical questions, and will also (I haven’t seen it, but if the capabilities of Mathematica are anything to go by, it is great) allow you to explore, in a drill-down procedure, how it reached its answers.

This is wonderful – truly – but how many are going to use that feature? By extension: All of us have a spreadsheet program, but how many of an organization’s users can write a new spreadsheet rather than just use an already existing one?

For as long as I have worked with computers, each new leap in functionality and performance has been heralded as the necessary step to turn users from passive to active, from consumers of information to creators of knowledge. While each new technology generation, admittedly, has achieved some of this, it has always been less than was promised and much less than what was hoped for.

And so I think it is this time, too. Many people read Wikipedia, few write for it (though enough do). More importantly, many of Wikipedia’s users are unaware of how the knowledge therein is instantiated. Online forums have many more lurkers than contributors. And human ingenuity is unevenly distributed and will continue to be so.

So I think the IT department will continue to do what it is doing, in principle. It will be further from the metal and closer to the user, but as long as the world remains combinatorially complex and constantly changing, there will always be room for people who can see patterns, describe them, automate them and turn them into usable and connectable components. They will be fewer, think less of technology and more in terms of systems, and have less of a mismatch in terms of clothing and posture between themselves and their customers than before (much of it because the customers have embraced nerd chic, if not nerd knowledge).

The key for a continued IT career lies in taking charge of change rather than being affected by it. I think the future is great – and that we are still a long way from true end user computing. IT as a technology will be less interesting and more important in its invisible ubiquity. And Neal Stephenson’s analogy of a world of Elois and Morlocks, of the many that consume and the few that understand will still hold true.

I just hope I still will be a Morlock. With an Eloi pay and dress sense.

Hard times – more creativity

Roy Youngman, at a recent nGenera teleconference, told this story about how lean times forced a company to think creatively. As he very rightly says in the beginning:

Some people think that “do more with less” means make people work harder to compensate for the people that are let go. Other people think that “do more with less” means “work smarter, not harder”. If you think about it, both of these perceptions are rooted in a fundamental assumption that your existing operation is basically inefficient – that you have people wasting time or you have people working on the wrong things or you have people following bad processes. Depending upon your state of organizational maturity, all this may be true in which case you can “do more with less” by asking fewer people to work both harder and smarter. But good stewards of owner equity should always be trying to eliminate operational inefficiencies at all times, both good and bad. So what do you do if your people are already working hard, smart, and on the right things with the challenge of “do more with less”?

There is only one answer: Innovate!

I’ll expand on my own ideas for how to do this in another blog post – but in the meantime, read Roy’s story of how to deliver a data warehousing solution without spending much money.

FAST Forward 2009: Notes from the third day

Bjørn Olstad: Microsoft’s vision for enterprise search

Search as a transparent and ubiquitous layer providing information and context seamlessly – from a search box (tell me what you want in 1.4 words and I will answer), to a conversational interface (giving pointers to more information and suggestions for continued searches), to a natural interface.

Demo of Microsoft Surface: camera interface, can recognize things. Multiuser (as opposed to Apple). Showed an application built on search with touch – whenever you touch an information object, a query goes to an ESP implementation and brings up all the information available on that object.

Very impressive demo of Excel Gemini: how do you fit enterprise data into Excel? (Picture of a VW Bug with a jet engine.) Pulls 100 million rows into Excel, sorts them (instantly), slices and dices. Built on top of ESP, does extreme compression, takes advantage of high memory, allows publishing of live spreadsheets to Sharepoint. Extremely impressive, worth the whole conference.

Bjørn continued on search as a platform, demoing Globrix, where you can ask questions about apartments and houses and get a rich search experience in which you can change attributes and the data changes dynamically. Globrix does not hold content itself, but crawls available content on the web and shows it (much like airline-ticket search sites do).

Another demo: search for entertainment based on location, friends and content, moving from there to a focused movie site. This is federated search that understands some of the semantics (it understands that “David Bowie” refers to a person and therefore only searches certain databases). It also incorporates community (letting users edit the results and feed them back).
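The routing idea – classify the query, then only hit the backends that handle that entity type – can be sketched like this. The lookup table and backend callables are hypothetical stand-ins for real semantic classifiers and search services.

```python
# Federated search sketch: guess the entity type of the query and
# route it only to the backends registered for that type, then merge
# the results. KNOWN_PEOPLE and the backends are invented examples.
KNOWN_PEOPLE = {"david bowie", "charlene li"}

def classify(query):
    """A stand-in for real entity recognition."""
    return "person" if query.lower() in KNOWN_PEOPLE else "generic"

def federated_search(query, backends):
    """backends maps an entity type to a list of search callables,
    each taking a query string and returning a list of results."""
    results = []
    for search in backends.get(classify(query), []):
        results.extend(search(query))
    return results
```

The value of the semantics is in what you do not search: a person query never touches the product catalog, which is both faster and less noisy.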

FAST AdMomentum – advertising network – has had tremendous growth.

Content analytics: How can you lay a foundation for a good search experience by focusing on data quality? Demo: Content Integration Studio, sucking out semantics from unstructured text and writing it back both to the search engine and to databases (such as an HR database).

Panel session on enterprise search

Hitachi consulting (Ellen): Very big focus on the economy now, almost all conversations are about that topic. eDiscovery is important: Looking at many sources with a view towards risk discovery and risk mitigation.

EMC consulting (Mark Stone): Natural interfaces will be important, freeing up the mind to focus on the information rather than the interface. Shows a video of a small girl using the Surface table and how she very quickly starts to focus on the pictures she is manipulating rather than the interface – she completely forgets that she is working with a computer.

Sue Feldman, IDC: We have to get beyond the document paradigm. I want to see interfaces that will immerse me in the sea of information and explore it, without having to think about what application it is in.

Sue Feldman: Core issue with search: Data quality and making it a rich experience for the user. Anthropological, linguistic and cultural issues, getting people to understand both what they are seeing and what they are looking for.  We are just beginning on this journey. From keyword matching and relevance ranking to pulling the user in, having a dialogue with the information. What we are seeing is hybrid systems that combine collaboration, search, analysis etc.

AMR Research: There is a religious war going on, between collaborative systems, portals, content management systems, and search. They all claim to be the answer to the problem of connecting users with their data. There is also consolidation in the market, partially driven by the economy, but there is also a consolidation of functionality and an explosion in new ideas, many small companies coming up with new ideas.  No one technology is going to solve all of these problems. Lots of opportunity because Microsoft is gobbling up all these technologies, trying to provide one product that covers most (Sharepoint).

Q: Examples of interaction management?

Hitachi consulting: Best examples currently found in collaboration and community software.

EMC: There is a tool out there that searches not only blogs, but specifically the comment sections of blogs, looking for mentions of products. Do sentiment analysis, find out what the customers are saying about you.
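A toy version of that comment-mining idea, for flavor only: find comments mentioning a product and score them against a tiny word list. Real sentiment analysis uses trained models rather than word counting, and every name below is invented for the sketch.

```python
# Naive sentiment sketch: count mentions of a product in comments and
# score each mention with small positive/negative word lists. Purely
# illustrative - real tools use trained models, not word counting.
POSITIVE = {"love", "great", "solid"}
NEGATIVE = {"broke", "awful", "returned"}

def product_sentiment(comments, product):
    """Return (number of mentions, net sentiment score)."""
    mentions = [c.lower() for c in comments if product.lower() in c.lower()]
    score = 0
    for c in mentions:
        words = set(c.replace(",", " ").replace(".", " ").split())
        score += len(words & POSITIVE) - len(words & NEGATIVE)
    return len(mentions), score
```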

Sue Feldman: Searching through corporate communications in lawsuit situations. Ad targeting. And what is the relationship between search and innovation?

Hitachi: Innovation comes from finding what you did not expect to find.

Q: This question always comes up: Search is a commodity – or is it? What is the current market doing for search adoption?

AMR: I am not sure who says that, there is so much room for innovation, so I can’t understand why anyone would say it is commoditized. Go out there and find the opportunities.

Sue F: Well, search is a tool, like a screwdriver. But I really need a screwdriver. The toolbox has expanded so much. I see the search market continuing to explode even though the economy is tanking. Possible that we will see a disruption with a new platform based on information management, access and collaboration.

EMC: We are seeing growth, the business will mature because companies have to focus on what the business really needs.

Sue Feldman & others: Search use awards

Customer awards:

  • Best productivity advancement: Verizon Business.
  • Best digital market application (I): McGraw-Hill Platts (doing industry-specific searches, 50% increase in trial subscriptions, 40% increase in revenue.)
  • Best digital market application (II): SPH Search (reader interaction and content integrated with newspaper sources, federated search.)
  • Social computing: Accenture (internal search on people profiles and content)
  • User engagement: Japan (700m pageviews, 18m unique users)
  • User engagement: AutoTrader (peak query level of 1500 qps)

Partner awards:

  • Digital market solution: Comperio (use of search for user interaction)
  • Social computing solution: NewsGator (enterprise social computing on top of Sharepoint)
  • User experience solutions: EMC Consulting
  • Partner of the year: Hitachi consulting.

FASTForward 2009 – impressions from the second day

The second day has less of the “big picture” and more of product announcements and more technical detail. Here are some notes as the day progresses:

Kirk Koenigsbauer, Microsoft: Our enterprise search vision & roadmap

Kirk is responsible for the business side of FAST after the acquisition. He is speaking on Microsoft’s commitment to search, the roadmap and future business directions, including pricing.

About 15% of the research done in MS Research is search-oriented. 10 years’ support on current FAST products, even on non-MS platforms.

Search server express now has more than 100,000 downloads. 1/3 of MS enterprise customers have deployed a MS search solution. Partner #s have doubled.

MS vision: Create experiences that combine the magic of software with the power of Internet services across a world of devices. Search is integral to vision.

Demo: Use of search in a business setting, showing documents in a viewer format, extracting keywords and concepts.

Announcing two new products:

  • FAST for Sharepoint, which is FAST ESP integrated into Sharepoint, available at a substantially lower price than standalone FAST ESP – typically 50% lower. Simpler pricing model: per-user charge for FAST ESP standalone, included in Sharepoint. You still need to buy a server at 25K a pop, but this is a substantially lower price. Will be available with the next rollout of Office (wave 14). Will also provide a licensing bridge for those who purchase Sharepoint now.
  • FAST Search for Internet business. New functionality for interaction management (promotions, campaigns etc.), Content Integration Studio (graphical interface for managing content restructuring and content integration), and simplified licensing: Language pack and connectors will be part of the standard package.

Valentin Richter, Raytion: User engagement

Low satisfaction with many search solutions, and 70% of search managers do not study search logs with an eye to improving the experience. Went through a list of common myths about search (such as “people know what they are looking for”). People want simplicity – they cannot handle query expressions and need more of a drill-down approach, navigating through related information. Installing a search platform immediately leads to a focus on information quality: you find duplicates, you find confidential documents everywhere, and so on – be ready for it in both a technical and an organizational sense.
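Studying search logs need not be elaborate. A minimal sketch of the kind of analysis Richter is calling for: from (query, hit count) pairs, rank the queries that most often come back empty – usually the first place to improve the experience. The log format is assumed for the sketch.

```python
from collections import Counter

def zero_result_queries(log, top=5):
    """From (query, hit_count) pairs, rank the queries that most
    often returned nothing - a cheap first pass over a search log.
    Queries are normalized so 'VPN setup' and 'vpn setup' merge."""
    misses = Counter(q.strip().lower() for q, hits in log if hits == 0)
    return misses.most_common(top)
```

A recurring zero-result query is a direct pointer to either missing content, a missing synonym, or a missing connector.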

Walton Smith, Booz Allen Hamilton: Case study of use of FAST and Sharepoint

BAH is based in Virginia, traditionally centralized, but expanding. 300 partners, all wanting to go in different directions. The de facto collaboration tool was Outlook, so they created a social computing platform. Among the results: they have given access to more esoteric material, which caused issues with indexing, and they were able to pull new people from other parts of the organization onto a project. Other applications: finding people with the right credentials and experience, pulling information from many sources. The search crawls the platform and iShare. About 1/3 of the firm is now using the platform, with lots of information on individuals.

Charlene Li: Transformation  based on social technologies

It is all about engaging users in dialogue: H&R Block has a page on Facebook where they discuss tax issues – not trying to pull people in, at least not explicitly. Comcast is on Twitter with their customer service people. Starbucks is testing ideas, such as automated purchasing based on a customer card. Beth Israel’s CEO blogs about what it is like to run a hospital. It is necessary to change search to include social software: Technorati searches blogs and allows social bookmarking. You can use Twitter mapping to see what people are discussing – showing that what is rated highly somewhere may not be what is most discussed. Amazon now lets you filter reviews by friends.

Conclusion: Social networks will be like air, and will transform companies from the outside in. Social media is impacting search at multiple levels, refining results based on personalization signals derived from users’ social circles.

Jørn Ellefsen, Comperio: In search of profits

Comperio has more than 100 customers and has created a front-end application, Comperio Front, that sits between the customer’s web pages and their search engine. Introduced Drew Brunell, who works with SEO for, among others, News International. Paid search is the growing part of the advertising market; everything else is either flat (display ads) or sinking (traditional ads). Doing a lot of experimentation linking into customer behavior – for instance, matching content with areas that see a lot of comments, “invisible newspapers”. Another notion is the “curated content model”: setting up pages that blend original content with material from the outside web. Topic pages are based on “zero-term search”, offering editorial content put together automatically around a topic. Stefan Sveen, CTO of Comperio, demonstrated topic pages from Times Online: users and journalists can create their own topic pages based on search results, and the pages mark entries coming in after the page was created.

Venkat Krishnamoorthy, Thomson Reuters: Delivering Contextual and Intelligent Information to Premium Customers

Reuters delivers context-sensitive information for pre-investment analysis to premium customers. They have done this for a long time, but want to change from being a data-delivery company to one that integrates into the user’s workflow. Challenges here included having too many applications that customers needed to stitch together, and that finding information was difficult, especially across different kinds of assets – more than 40 content databases. Solution: put a search and navigation layer underneath their desktop products (they have two, a web-based one and a premium, client-based one).

ERP analogies

Andy McAfee uses an analogy of an ERP as a factory for business processes. Here are my analogies:

  • We are born as originals and die as copies. ERP systems are the other way – they start as copies and die as originals. An ERP system, when it is installed, allows you to configure it by choosing parameters – what kind of budget process, how you define "customer", etc. etc. After having set a few thousand parameters, you can be absolutely certain that you are the only company in the world with that particular SAP or Oracle or whatever configuration. Of course, standardization was what ERP systems were all about when they were introduced in the mid-nineties: The idea that software should be simple again.
  • ERP systems are flexible the way cement is flexible. Less true now than it was – cement is ultimately flexible when you pour it, then it hardens into the shape of the hole it was poured into.
  • A more advanced version of this is the old joke that SAP (or insert your favorite ERP system) is like a new basic element. Basic elements go through three stages: Fixed, fluid and gas. SAPium (and its cousin Oraclium) start out as a fluid that runs down and fills the holes (basic business process) you want fixed. It then becomes a gas, expanding to fill the whole area (organization) until it has permeated everything, whereupon it becomes a solid that can never be changed again….

Oh well. Less true now than it was, maybe. Or maybe not.

Ozzie and the cloud

Steven Levy, a tech writer whose every article I read if I can get my hands on it, has a fascinating Wired article about Ray Ozzie and his long march to make Microsoft survive and prosper in the cloud. Service-based computing can be a disruptive innovation for Microsoft, since customers become less reliant on a single, fat client (dominated by MS) and instead can use a  browser as their main interface.

I have used Lotus Notes since well before the company was bought by IBM, and have always considered it a fantastic platform that is somewhat underused, chiefly because while its execution is great, the user interface is somewhat clumsy (getting better, but still) and it is hard to program for. As an infrastructure play for a large corporation, Notes is just great. As a platform for software innovation and innovative interaction, it leaves a lot to be desired. The question is – can Microsoft gain dominance in this market (Sharepoint seems to execute on that one), extend it to consumers (Vista is not a good omen here), and somehow find a business model that works? (By that I don’t mean one with the same profitability as it has now – that just isn’t possible – but one that is somewhat profitable long-term.)

If anyone is going to be able to pull that off, it will be Ozzie. The article paints, as I see it, a very complete picture and tells me a lot more about the relationship between Microsoft and Ozzie than I knew. But that is usual with Steven Levy articles, ever since he wrote "Hackers: Heroes of the Computer Revolution" back in 1984.

Highly recommended. (And since I like long and detailed articles: this one is at 6,900 words, or more than 40,000 characters including spaces. Just a hint to my Norwegian newspaper friends, who think anything more than 7,000 characters won’t be read by anyone.)

Liveblogging from Sophia Antipolis

These are my running notes from visiting Accenture’s Technology Labs in Sophia Antipolis, as part of a Master of Management program called "Strategic Business Development and Innovation" at the Norwegian School of Management.

Accenture’s Technology Labs is a relatively small organization: 200 researchers, out of 180,000 employees in Accenture. There are four tech labs – Silicon Valley, Chicago (the largest), Sophia Antipolis, and Bangalore. In principle each should be able to do everything, but in practice there is specialization. The four main activities of the tech labs are technology visioning, research, development of specific platforms, and innovation workshops (with clients, press, consultants etc.). The themes pursued are mobility and sensors; analytics and insight; human interaction & performance; systems integration (architecture, development methods); and infrastructure (virtualization, cloud computing).

Kelly Dempski: Power Shift: Accenture Technology vision

The visioning used to be far-reaching and visionary; now it has a much more immediate focus – they want to look at things you can implement today, making it much more "grounded in reality".

Eight critical trends:

  • 1: Cloud computing and SaaS: Hardware cloud (IBM, Google (now the third largest producer of servers in the world)), desktop cloud (Google, Zimbra, MS Office Live Workspace), SaaS cloud (Netsuite, CrownPeak), and services cloud (Google Checkout, Amazon web services, eBay, Yahoo)
    • examples: Flextronics has changed their HR applications over to a SaaS model. AMD emulates chips in software for testing purposes, and now contracts with Sun to do that in the cloud. The New York Times had 4TB of articles they wanted to convert to PDF: someone went on Amazon with their credit card, uploaded the 4TB and processed it in 24 hours; there was a bug, so it all had to be done again (48 hours in total) – total cost $250 on someone’s credit card.
    • issues:
      • data location (where is the data)
      • privacy and security
      • performance
  • 2: Systems – regular and lite
    • SOA as the integration paradigm (regular), mashups (lite)
    • traditional back-end apps vs. end-user apps
      • small number of apps maintained by CIOs vs. large number of user- and user-group-created applications (long tail)
    • examples:
      • REST is a light architectural approach for interoperability & data extraction
      • Mashups (JackBe (trading platform tools), Serena, Duet (SAP and Microsoft), IBM) are becoming more important in the enterprise arena
      • Widgets and gadgets are light-weight desktop UIs that continually update some data
  • 3: Enterprise intelligence at scale
    • combination of internet-scale computing, petabytes of data, and new algorithms
    • almost all the large systems vendors have partnered with or acquired some analytics oriented software company (such as Microsoft acquiring FAST)
    • rampant use of data: evolution through access, reporting, external & internal, unstructured etc.
  • Trends 1-2-3 together: The new CIO
    • hardware and software procured from the cloud
    • business units, end-users create their own lightweight apps
    • The new CIO:
      • "Data Fort Commander" – ensure security, privacy, integrity of corporate data and manage back-end apps
      • "Chief Intelligence Officer" – provide data analysis services & insights to business units
  • 4: Continuous access
    • mobile device "first class" IT object
    • No concept of enterprise desktop/laptop
    • location-based services
  • 5: Social computing
    • amplify and support the value of the community
    • three major directions: Platformization, inter-operability, identity management
  • 6: User-generated content
    • community-based CRM (users making videos about how to run certain kinds of software or build something from IKEA)
    • new forms of entertainment
    • revenue erosion of traditional media companies
    • this has marketing implications: You can measure the sentiment out there in the user community. You switch from advertising to engaging.
  • 7: Industrialization of software development
    • converging trends will increase integration: Predictive metrics, model-driven development, domain-specific languages, service-oriented architecture, agile-development & Forever Beta.
  • 8: Green computing
    • global warming, energy prices, consumer pressure, compliance and valuation
    • switch out energy-intensive processes for information-intensive processes: Electronic collaboration; Warehousing, supply chain & logistics optimization; Smart factories, plants, buildings & homes; and new businesses such as carbon auditing and trading
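To make the "systems lite" trend concrete: the REST approach mentioned under trend 2 boils down to resources addressed by URL, views shaped by query parameters, and plain JSON responses that a mashup can consume. A minimal sketch – the endpoint and field names below are hypothetical, chosen only for illustration:

```python
import json
from urllib.parse import urlencode

# Hypothetical REST endpoint -- the URL and parameter names are illustrative only.
BASE = "https://api.example.com/customers"

def build_request_url(customer_id, fields):
    """REST addresses each resource with a URL; query params shape the view."""
    return f"{BASE}/{customer_id}?" + urlencode({"fields": ",".join(fields)})

def parse_response(body):
    """Responses are plain JSON -- trivial for a lightweight mashup to consume."""
    return json.loads(body)

url = build_request_url(42, ["name", "city"])
print(url)  # https://api.example.com/customers/42?fields=name%2Ccity

data = parse_response('{"name": "Acme", "city": "Oslo"}')
print(data["city"])  # Oslo
```

The point of the "lite" style is exactly this: no SOAP envelopes, no WSDL, just URLs and JSON that an end user or business unit can wire together.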

Cyrille Bataller: Biometric Identity Management

Biometric identification is coming, driven by increasing demand and technological progress. Biometric identification is defined as "automated recognition of individuals based on their physiological and/or behavioral characteristics". Physiological characteristics can be face, iris, or fingerprint; behavioral ones can be signature, voice, or walk. It involves a tradeoff, as with all security systems, between the level of security and the convenience of the system. Fingerprint is the most used (38%), face is the most natural, iris the most accurate. Many others: finger/hand vein, gait, ear shape, electricity, heat signature, hand geometry and so on…

There is a balance between FMR (false match rate – accepting an impostor) and FNMR (false non-match rate – rejecting a genuine user); the operating point where the two are equal is called the equal error rate (EER). Iris has an EER of .002%, ten fingerprints .01%, a single fingerprint .4%, signature 3%, face recognition 6%, voice 8%. Many parameters in addition to this.
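The FMR/FNMR tradeoff can be illustrated with a toy computation: sweep a decision threshold over match scores and find the point where the two error rates meet. The scores below are made-up illustrations, not real biometric data:

```python
def error_rates(genuine, impostor, threshold):
    """FMR: impostor scores accepted; FNMR: genuine scores rejected."""
    fmr = sum(s >= threshold for s in impostor) / len(impostor)
    fnmr = sum(s < threshold for s in genuine) / len(genuine)
    return fmr, fnmr

def equal_error_rate(genuine, impostor, thresholds):
    """Scan thresholds; the EER sits where FMR and FNMR are closest."""
    def gap(t):
        fmr, fnmr = error_rates(genuine, impostor, t)
        return abs(fmr - fnmr)
    best = min(thresholds, key=gap)
    fmr, fnmr = error_rates(genuine, impostor, best)
    return best, (fmr + fnmr) / 2

genuine = [0.9, 0.8, 0.85, 0.7, 0.95]   # scores for matching pairs (toy data)
impostor = [0.2, 0.4, 0.1, 0.6, 0.3]    # scores for non-matching pairs
print(equal_error_rate(genuine, impostor, [i / 100 for i in range(100)]))
```

Raising the threshold trades false matches for false non-matches – which is exactly the security-versus-convenience tradeoff Bataller described.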

SecuriMetrics has something called HIIDE, a mobile unit that does a number of biometrics, used in Iraq. Voice is very interesting because it can be done over the phone – interesting for call centers, banks etc. Multimodal biometrics are important because they are harder to spoof.

Airports are a good example of what you can do with proper identification: you can move 99.9% of check-in away from the airport, and bag drop can also be almost fully automated. Portugal is the leader in the EU, with automated passport control using facial recognition (scan, use electronic passport etc.). Most people are not very concerned with privacy, given some assurance and convenience. We are likely to see lots of automated border clearance for the masses, but also registered-traveler schemes that go through even quicker and are interoperable across many airports. One common misunderstanding is that automated identity checking means moving away from 100% accuracy – but human passport/security control is an error-ridden process, and automated processes are mostly more accurate.

Antoine Caner: Next Generation Branch

This is a showcase exhibit of best-practice banking technology and processes. The showroom gets about 40 company visits (mostly banks) per year.

Most banks have a multi-channel strategy; they have retreated from a strategy of getting rid of branches and now want to redefine them. Rather than doing low-value transactions, the branches are seen as a network for business development.

Key principles behind the branch of the future:

  • generating and taking advantage of the traffic
  • flexibility throughout the day
  • adaptation to client’s value
  • sell & service oriented
  • modular space according to use
  • entertaining and attractive
  • focused on customer experience

Applications demonstrated include:

  • turning the branch windows into an interactive display (realty, for instance)
  • Bluetooth-enabled push information
  • swipe card at entrance to let branch know you are there, let your account manager know, apply Amazon-like features
  • digital displays for marketing
  • avatar-based teller services
  • biometric-based ATMs to allow for more advanced transactions, as well as more opportunistic sales applications
  • do both identification and authentication
  • digital pen user interface for capturing data from forms
  • RFID-based or NFC (Near Field Communication) in brochures, swipe and get info on screen
  • "interactive wall" for interaction with clients in information seeking mode
  • visual tracking of movement in the branch
  • modular office that can change shape during the day, reconfigurable furniture

What impressed me was not the individual applications per se – though they were impressive – but the way everything had been put together, with a back-office application that the branch manager can use to track how this whole customer interface (i.e., the whole bank branch) works.

Alexandre Naressi: Emerging Web Technologies

Alexandre leads the rich Internet applications community of interest within Accenture. He started off giving some background on Web 2.0, using Flickr as an example of a Web 2.0 application where a company uses user-generated content and tagging to get network effects on its side. Important here is not only the user interface but also having APIs that allow anyone to create applications and to have your content or services embedded into other platforms. Dimpls is another example. More than one billion people have Internet access, and half of them have broadband access, which allows for richer applications. Customers’ behavior is changing – it is now a "read-write" web. It has also gotten much cheaper to launch something: Excite cost $3m, JotSpot $200k, Digg $200.

Rich Internet Applications and social software represent the low-hanging fruit in this scenario. RIA provides the functionality of a fat client in a browser interface, with very rich and capable components for programmers to play around with.

Two families of technologies: JavaScript/Ajax (doesn’t require a plugin, advocated by Google), and three plugin-based platforms: Silverlight (Microsoft), Flash/Flex (Adobe), and JavaFX (Sun). All of them also have downloadable offline clients. One demonstrated application gives a much better search user interface – Accenture has developed something similar for its internal enterprise search.

Social Software: Accenture has its own internal version of Facebook. Youtube is also a possible corporate platform where people can contribute screencasts of all kinds of interesting demos and prototypes.

Kirsti Kierulf: Nordic Innovation Model for Accenture and Microsoft

Accenture and Microsoft are collaborating (they jointly own a company, Avanade) and have set up an innovation lab in Oslo called the Accenture Innovation Lab on Microsoft Enterprise Search. Three agendas: network services, enterprise search (iAD), and service innovation. They are running a number of innovation processes internally. This happens on a Nordic level, so collaboration is with academic institutions and companies all over the region.

Have made a number of tools to support innovation methodologies: InnovateIT, InnovoteIT, and InnomindIT (mind maps), as well as a method for making quick prototypes of systems and concepts for testing and experimentation: 6 weeks from idea to test.

Current innovation models are not working for long-term, risky projects. Closed models do not work – hence looser, more informal and open innovation models with shorter innovation cycles. Pull people in, share costs throughout the network, and try to avoid the funnel that closes down projects with no clear business case, as well as NIH thinking. Try to park ideas rather than kill them.

Important: Ask for advice, stay in the question, maintain relationships, don’t spend time on legalities and financials.

Plus ca change….

I clipped this from ACM Technews, an email service of the ACM:

Looking for Job Security? Try Cobol
IDG News Service (10/23/08) Sullivan, Tom
A Cobol programmer may be one of the most secure and steady jobs in IT. Analysts report that Cobol salaries are rising due to a healthy demand for Cobol skills, and there are few offshore Cobol programmers. The troubled economy also bodes well for Cobol programmers, says Interop Systems director of research Jeff Gould, as long as they are working for an organization that intends to keep its legacy Cobol applications. "Many mainframe customers with large mission-critical Cobol apps are locked into the mainframe platform," Gould says. "Often there is no equivalent packaged app, and it proves to be just too expensive to port the legacy Cobol to newer platforms like Intel or AMD servers." Deloitte’s William Conner says salaries for Cobol programmers are rising because many Cobol programmers are reaching retirement age and colleges are focusing on Java, XML, and other modern languages instead of Cobol. Dextrys CEO Brian Keane says Cobol programmers are less likely to have their jobs outsourced because the Chinese do not have mainframe experience and recent Chinese computer science graduates have focused on the latest architectures and systems and do not have experience with legacy languages and systems. Meanwhile, warnings that mainframes would disappear have proven to be untrue, particularly because mainframes are very reliable at handling high-volume transaction processing, and companies are increasingly benefiting from integrating legacy mainframe Cobol applications with the rest of their enterprise.

(Full article here.) With the exception of the part about offshoring, this article could have been written 10 years ago, and be just as true then. There are, of course, a number of programmers in India that know COBOL – converting legacy apps for the year 2000 was one of the jobs that got the Indian IT service industry started.

Come to think of it, I never really learned COBOL, myself. But I was a decent REXX programmer….

Tim O’Reilly nails it on cloud computing

In this long and very interesting post, Tim O’Reilly divides cloud computing into three types: Utility, platform and end-user applications, and underscores that network effects rather than cost advantages will be what drives economics in this area. (This in contrast to the Economist’s piece this week, which places relatively little emphasis on this, instead talking about the simplification of corporate data centers – though the Economist piece is focused on corporate IT.)

Network effects happen when having new users on a platform or service is a benefit to the other users. This benefit can come from platform integration – for instance, if we both use the same service, we can do things within that service that may not be possible between services, due to differences in implementation or a lack of translating standards.

Another benefit comes when the shared service can leverage individual users’ activities. Google’s Gmail, for instance, has a wonderful spam filter, which is very reliable because it tracks millions of users’ selections on what is spam and what isn’t.
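The Gmail example illustrates the general pattern: every user’s "mark as spam" click becomes shared training data, so the filter improves with scale. A toy Naive Bayes sketch of that idea – this is an illustration of the principle, not Google’s actual system:

```python
from collections import defaultdict
from math import log

class CrowdSpamFilter:
    """Toy Naive Bayes filter trained on many users' spam/ham reports."""

    def __init__(self):
        self.counts = {"spam": defaultdict(int), "ham": defaultdict(int)}
        self.totals = {"spam": 0, "ham": 0}

    def report(self, message, label):
        """Every user click ('this is spam') becomes shared training data."""
        for word in message.lower().split():
            self.counts[label][word] += 1
        self.totals[label] += 1

    def is_spam(self, message):
        scores = {}
        for label in ("spam", "ham"):
            score = log(self.totals[label] + 1)
            total_words = sum(self.counts[label].values())
            for word in message.lower().split():
                # Laplace smoothing so unseen words don't zero out a class
                score += log((self.counts[label][word] + 1) / (total_words + 1))
            scores[label] = score
        return scores["spam"] > scores["ham"]

f = CrowdSpamFilter()
f.report("cheap pills buy now", "spam")
f.report("meeting notes attached", "ham")
print(f.is_spam("buy cheap pills"))   # → True
```

With millions of users reporting instead of two toy messages, the word statistics become very sharp – which is why a shared cloud service can filter better than any single client ever could.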

Tim focuses on the network effects of developers, which is an important reason why Microsoft, not Apple, won the microcomputer war. When Steve Ballmer jumped around shouting "developers, developers, developers", he was demonstrating a sound understanding of what made his business take off – and was willing to make a fool of himself to prove it.

Tim also invokes Clay Christensen’s "law of conservation of attractive profits", arguing that as software becomes commoditized, opportunities for profits will spring up in adjacent markets. In other words, someone (Jeff Bezos? Larry and Sergei?) need to start jumping up and down, shouting "mashupers, mashupers, mashupers" or perhaps "interactors, interactors, interactors" and, more importantly, provide a business model for those that build value-added services on top of the widely shared platforms and easily available applications they provide.

One way to do that could be to make available some of the data generated by user activities, which today most of these companies keep closely to themselves.  That will require balancing on a sharp edge between providing useful data, taking care of user privacy, and not giving away your profitability too much. As my colleague Øystein Fjeldstad and I wrote in an article a few years ago – the companies playing in this field will have to make some hard decisions between growing the pie and securing the biggest piece for themselves.

If we cannot harness network effects, cloud computing becomes a cost play, and after a while about as interesting, in terms of technical evolution, as utilities are now. The USA is behind Europe and Asia in mobile phone systems partially because US cellphone companies were late in developing advanced interconnect and roaming agreements, instead trying to herd customers into their own networks. Let’s hope the cloud computing companies have learned something from this….

The Economist on cloud computing

The Economist has a nice feature on corporate IT, mostly about how it is evolving towards cloud computing. Like everything the Economist does, it is nicely worded, measuredly opinionated, and contains nothing new to those in the know. But the article is a very good introduction to the current state of at least the technical and market side of corporate IT provisioning, so I will mark it for future courses on just that topic. It includes an audio interview with Ludwig Siegele as well.

CACM becomes much more readable

CACM (Communications of the ACM) is one of my favorite journals – and it is currently in the throes of an editorial upheaval that I think is very positive. In addition to scholarly articles, it is moving in the direction of essays and more generally accessible articles, without lowering its quality criteria. Ever since BYTE disappeared (a victim of the need for targeted advertising) I have missed a general, quite technical yet accessible journal – CACM is now getting closer to what I am looking for.

Here are two articles I found very interesting:

  • "Will the Future of Software be Open Source?", a well-reasoned reflection by Martin Campbell-Kelly, giving a very terse yet comprehensive and useful description of the evolution of software markets. Answer: open source is a tempting conclusion if you extrapolate, but extrapolation has not been a very successful prediction technique so far…
  • "Searching the Deep Web", by Alex Wright, which explores two different approaches to searching beyond static web pages – the trawling approach, which relies on local storage, and the angling approach, which produces targeted results in real time.

Security, privacy and IP in the 2.0 Enterprise

(Bear with me here for a while – this is something I am mulling over in relation to an nGenera research project called REC, Reinventing End-user Computing. I am doing a teleconference on security, privacy and IP later today with Kimberly Hatch and other colleagues at nGenera and need to bloviate a bit to get in the mood.)

Continue reading

The opportunity-creating IT department

Vaughan Merlyn has a good post on how IT departments should go from problem solving to opportunity creation. This, to me, means stepping up to the third level of IT management, where the IT department facilitates innovation and change, the goal being business transformation. The two former levels are providing utility services (standard and basic, reliable and cost-effective, the goal being business efficiency) and being a business partner (flexible and responsive, facilitating growth in scale and scope, the goal being business effectiveness). (See this post and many of Vaughan’s for more detail.)

We used to say that the first and second level activities will not go away even if you reach the third level. But I am beginning to wonder.

The utility business part, for one thing, can now be almost totally farmed out to providers that provide, well, services at expendable cost. In an era of cheap and cloudy services, I am beginning to wonder why IT departments own their servers or provide desktop functionality. If I were kitting out a small company (50 people or less) today, I would do it with cheap and small computers (with really pricey 3G/WiFi/whatever connectivity options) running mostly server-based stuff, and iPhones or the like for the mobile crowd. Productivity software through the Google suite, for instance, with Gmail/Docs/etc. (Google Gears ensures that you can edit off-line). Salesforce or something like that for CRM, a wiki for collaborative stuff. Typepad or WordPress for company web page.

Accounting would be an issue due to localization – it is one of the last holdouts of geographic differentiation, since each country still has its own accounting rules and tax levels. I am not sure to what extent online accounting is available here in Norway, but I think it is (and if not, I would volunteer to be the test site – it should be fairly easy to webify).

As for personal stuff – just let the employees install whatever they want locally. They are grown-ups.

In that kind of environment (and yes, I am aware of the legacy code/apps/most users are stupid and definitely old/mainframes are still important/no company can start over arguments) the role of the IT department lies in tying things together from the user perspective – orchestrating innovation through the creation of smart mash-ups of all these online services. As for print, backup, and initiating new employees to this liberation from the past – farm that out to someone who does it for a living.

Wouldn’t it be lovely….

PS: Here is an article from The Economist about the Zoho office suite saying essentially the same thing: "Sooner or later, Zoho or another emerging-economy upstart will let a lot of air out of the corporate IT balloon."