IUISC 2004 address

Yesterday, I gave one of two plenary addresses at the IUISC (Irish Universities Information Services Colloquium) conference in Kilkenny, Ireland, on “Learning Technologies and Learning Institutions: Future, Challenges and Principles.” The text is below, in fairly raw format. Essentially, I talk about the changing competitive landscape for universities as I see it, the availability of new technologies, and what IT service providers in universities should do. The talk was fairly well received, and fun to give.

Learning technologies and learning institutions: Futures, challenges and principles.

Keynote address, IUISC conference, Kilkenny, Ireland
March 10, 2004
Dr. Espen Andersen
Associate Professor, Norwegian School of Management, Oslo, Norway
European Research Director, The Concours Group, USA

Conference chairman, Ladies and Gentlemen:

First of all, let me start by thanking you most sincerely for inviting me. I always enjoy a trip to Ireland. In fact, I have been a fan of this country and its people since I discovered that before the Irish miracle, according to a manager at IBM the last time I was here, the country was “so poor that the only thing we could afford was education.” That warms a professor’s heart, because in Norway, we are so rich that the only thing we won’t spend money on is education. Secondly, Ireland has understood that to attract foreigners, it is smart to spend less time carping about the greenness of the landscape and more time smoothing out the process for foreign direct investment, especially ironing out bureaucracy through the very impressive work of the Irish Development Agency. Thirdly, you seem to display an openness to foreigners not from an understandable and nicely calculated desire to improve the country’s economic situation, but because you have a culturally ingrained friendliness that makes a foreigner feel genuinely welcome, as opposed to other English-speaking island countries I could think of. Furthermore, the daffodils are blooming, I can restock my supply of Bushmills 10-year-old, and my family just acquired an Irish Soft-Coated Wheaten Terrier.

So, I like this place.

When giving a talk, you need to start by establishing legitimacy. You need to tell the audience why they should listen to you. That can be done by ingratiation – I just tried – but much more effective is to convince the listeners that they should pay attention to you because you are able to address some problem they have. The audience here, I surmise, consists of people whose job it is to provide information technology and access to information to the two most thankless constituencies on earth, namely faculty and students. You should know that I did just that for five years, starting out by running customer support and eventually the internal workings of an IT department at the Norwegian School of Management, at a period when we a) grew from 4,500 to 22,500 students and b) were one of the first (1987) business schools in Europe to require our students to have a personal computer. I was operationally responsible for that little caper, and managed to make life harder for the students, scare the faculty and significantly increase our operating costs at the same time. In retrospect the move was still a good one – it positioned the school as appropriately forward-looking, established IT as a natural tool for students and faculty slightly ahead of the competition, and got us started with an online education program that makes money. Furthermore, it got me into a doctoral program at the Harvard Business School and then into a career in IT management research and consulting. I am now back at the same school as a professor, still friends with the IT department, and, given my position, I can now do a lot more to influence the use of IT there than I could as an IT manager. So, I have helped, in a guerrilla fashion, to facilitate the transformation of the library into a learning resource center, with 2,000 square meters of computers, work spaces, wireless networks, databases, books and newspapers.
When the whole school is moving into new headquarters in Oslo next year, the Learning Center will have the two top floors of a six-floor building, and 8,000 square meters of space for the students and faculty to access information and work in the same environment.

Sorry if I got carried away there for a while, but that was intended to show that I have been where you currently are, and I can feel your pain, to quote a former chief executive. I also think I have found a few tricks to do something about it. Legitimacy established, I hope, what now?

This talk is titled “Learning Technologies and Learning Institutions – Future, challenges and principles.” I was tempted to borrow a line from my mentor Benn Konsynski and call it “Learning technologies and learning institutions: When vacuums collide,” but that would have been impolite. Fun, though. So let me instead talk to the title – the uneasy relationship between learning institutions and learning technologies: what is the future, what are the challenges and which principles should we observe when venturing forward? I will start with the University, then the technology, and then the technology-providing organization.

From vegetable to mineral memory
In an excellent speech[1] at the opening of the library in Alexandria, the world-famous semiotician and author Umberto Eco coined the phrase “Animal, vegetable and mineral memory.” Animal memory is, of course, the memory we have in our brains – I suppose both the individual kind as well as the collective memory you get when you add communication capability. Vegetable memory is Eco’s term for paper, and mineral memory is various forms of computer storage, be it optical or magnetic. He then goes on to point out that there is no reason to believe that the demise of paper is anywhere close at hand – books will survive not only because they are easy to read and can work anywhere (without batteries), but because they are inherently stable. That is, books cannot be changed once they are written – and we do want it that way, all talk about computer games as ergodic literature[2] aside. According to Eco, computers will replace encyclopedias and other material which are meant to be repositories of knowledge and for which search capabilities are essential. But nobody curls up with their laptop, and moreover, they never will, not because you can’t, but because in a book we want the discipline of having to listen to the author’s story, with all its unhappy endings.

Now, I am not so sure Eco is right on that one. I am currently reading the Count of Monte Cristo, freely downloaded, on my laptop. I think there is a possibility that Eco is falling prey to the fairly common mistake of overestimating what the technology will do in one year, and underestimating what it will do in ten. I do agree that one of the most important features of a book is stability – but I think we can get that from computer technology, if we want it. Still, there is a future for books, I think, though more as display than storage.
However, this is not a discussion of technology and books, but a discussion of technology and universities. Universities are creators, repositories and communicators of human knowledge, and their main technology is currently going through the change Eco talks about: having previously gone from animal memory to vegetable, we are now going from vegetable to mineral. What will this mean? To answer that question, let us take a look at what happened during the previous transition.
Before Gutenberg invented movable type and initiated the revolution that made books available to the great unwashed, universities were few and small, consisting of groups of people who communicated with each other and to a very limited degree with the surrounding environment. Along with and sometimes dominated by the rulers of the world, they could effectively prescribe what people should think and know. The advent of the printing press meant that a larger group of people could not only get access to knowledge outside official sources, but also that they could, independently, come up with interpretations of that knowledge that were different from those espoused by the authorities. And we got a number of innovations, such as the Reformation, the Natural Philosophers, Democracy and the Scientific Method.

Before access to knowledge became commonplace, knowledge was imparted by Wise Men. These wise men told you what to do, and derived that right from their capability to remember and, to a certain extent, evolve knowledge imparted to them from their elders. When access to prior knowledge became commonplace, the effect was to subject the knowledge providers of the world to the question “why should I do this” rather than the more familiar “what should I do”? Don’t misunderstand me, the “what” and the “how” are still important, but an increasing number of people now ask “yes, that may be right, but how did you arrive at that conclusion?”

The transition from vegetable to mineral memory will further accelerate this development – except that now, the knowledge consumers will not ask only what and why, but also why they should listen to you rather than simply access the collective, mineral memory. The researcher and teacher have already gone from being primarily a source of knowledge to being primarily a distributor of knowledge. I think the future teacher will be a just-in-time creator of knowledge, and will not so much teach what is known as refine the students’ ability to access and interpret the shared mineral memory themselves.

Medical doctors now face the rather disconcerting situation that when a patient arrives in their office, he or she is clutching a sheaf of color Internet printouts, has a hypothetical and, at least to themselves, confirmed diagnosis, and regards the doctor as the somewhat irritating gatekeeper to the well-deserved treatment. The patient suspects he knows more than the doctor. Frequently, he is right.

Anyone who teaches today is familiar with the sight of students with laptops and wireless Internet connections, surfing the net while the teacher speaks, sometimes interrupting to correct some figure recited. Some of my colleagues want to ban laptops from their classrooms, or to have a switch on the wall that jams the WiFi or GSM network – shuts out the world, in fact – thus ensuring that the students only have one source of diversion, namely themselves.

I solve this slightly differently, by assigning lots of work to do, interrogating the students on what they have learned and how they would use their learning in a constructed and realistic situation. I give them massive data sets from well-known sources to analyze, only to reveal to them, by walking through the analysis, that the data sets are falsified by the very trustworthy sources they have consulted over the web. Paraphrasing one of my colleagues, I make sure that my classroom, and, I hope, the school, is perceived not as a tanning salon so much as a health club, where I will provide the tools but the result is something that the student will have to take responsibility for themselves. In short, I treat my students as products, not customers, and derive my competitive advantage from satisfying my primary customers, which are the companies and organizations that will hire these students when they come out, and find them to be hardworking, knowledgeable, and critically minded because I have forced them to be.

When books became accessible, the number of teachers was not reduced – quite the contrary. However, the role of the teacher changed, from oracle and knowledge distributor to interpreter. With the computer and the network, the role of the university will no longer be that of oracle and distributor. That role will be reserved for the few truly top institutions of the world, such as MIT, who are making all their course material freely available in a move that combines an ethically impeccable mindset, a deep understanding of how knowledge is created, and a commercial sense that borders on genius. The gurus of the world will be fewer, more centralized, and significantly richer, thanks to efficient distribution. The works of Newton or Plato today are cheaply and easily accessible. A modern-day Newton or Plato can command higher fees and more frequent flyer miles than back then, not from the selling of their works but because more people will be willing to pay huge sums to see them in person, to gain the benefit of their wisdom not only from the source, but applied to their situation.

What, then, of the not-so-top universities, which include most of us? We won’t go away, but the job will change. To survive and prosper, I think we will have to, to a larger extent than previously, take responsibility for what the Germans call “Bildung” – we will, in short, have to really become the educators we for so long have claimed to be. And we will be measured not on how much knowledge we have imparted but on to what extent we have changed the student.

Tech future: Dynamic content interfaces and decentralized context production
Now, let me move on to the technology, which, after all, is what I am supposed to talk about. I find that a bit hard, because, quite frankly, I do not think any specific technology will make much of a difference when it comes to learning technologies in universities in general. Rather, it is the combined effect of overall technology evolution that will force changes on us. Information and communications technology continues to evolve at a breathtaking speed: Moore’s law will eventually slow down, but it will be for lack of customer demand for high-end processing rather than any currently known technological limitation. Storage technology has evolved even faster than processors, currently driven by the iPods and the ThumbDisks of the world. Communications technology continues to evolve, with high-speed fiber and new wireless “last mile” technologies such as Intel’s WiMAX (with 10 Mbps in a 5-mile non-line-of-sight radius) becoming available in a year or two. A few technologies, such as speech recognition, long-life batteries, and, for rather obscure reasons, Bluetooth and similar personal area wireless networks, have been slower to develop. However, as I have consistently held since 1995, processing, storage and communication are now so capable and so low in price that if you are thinking of doing anything strategic, you had better free up your mind by thinking that they are free, or you are in danger of adjusting your imagination to non-existing or short-term technological limitations.
The organizational effect of the technology, however, is slower in coming. A few years ago I worked with companies in the mail order industry, who were busy converting their paper catalogs to CD-ROMs (an evolution that was obviated by the Internet, incidentally). I observed that these companies went through a three-step process: In the first step, they stored their paper-based catalogs electronically, maintaining the sequencing, design, and content of the paper catalog on the disk. Fairly soon they understood that this did not exploit the inherent potential in the technology – for searching, say, or alternative navigation and display possibilities. So they split the publishing process in two, one for paper, one for CD. And finally, as the digital content gained prominence, they did what they probably should have done from the beginning, namely produced the content for the electronic medium and then generated the paper-based catalog out of the electronic content. That is, they produced to the most dynamic interface and generated the more static packaging from that.

We have always done this – created the new technology in the image of the old one. For instance, the reason we write TO:, FROM: and SUBJECT: on our emails is less because that is a good way to write email and more because email is an automatic version of the corporate memo, which developed to look the way it does because of the advent of hanging file folders, rather than pigeonholes[3] stuffed with elaborate and highly polished business letters.

Universities currently are in the first or the second stage of the vegetable-to-mineral memory transition: Most of them use technology to take their existing courses online; the more advanced ones have set up separate distance education organizations, which gradually start to use the capabilities of the technology, but still in the image of the old. A few, like MIT, are moving into a third phase, by making the mineral memory the core of their activity and identity.

Eventually, all universities will make the move, with everything produced electronically, with paper (or, if you will, static representations of the technology) generated as needed. They will gradually increase the dynamism of their digital teaching interface to a point where paper just can’t keep up. To get there, they will, as with all technology-induced organizational change, need external pull and internal facilitation. Most of all, they will need a picture of what this will look like, a template and a way to get there, and a way to get the faculty across, rather than the students and the administration. (I get slightly incredulous when I hear people talking about “student acceptance” of new technology – barring purely technical implementation problems, students will accept any technology you force them to accept. If you can’t force them, chances are the problem is not with the technology.) The faculty, on the other hand, will, if they are any good, be in a position to pick and choose, and getting them there will be like herding cats.

The external pull for a digital conversion is coming, partly from the increasing abstraction, and thus need for theoretical knowledge, in the world, partly because of a globalization of knowledge provision that is truly astounding. When markets go global, there is a tendency for the actors to get polarized: Rather than being a collection of actors in all sizes, they tend to evolve into a few very large actors and many small ones. Large and small survive; the medium-sized, often regional, dominator disappears. I will have to make some allowances for language, but that kind of protection does not exist in Ireland, a fact I think you should be thankful for, on average.

The internal facilitation, however, is somewhat lacking. At present, we do not have the tools to facilitate the transition to the mineral-memory educational institution. I am aware that this conference is sponsored, in part, by course management software companies, and that it is considered bad form to speak badly about sponsors. Let me therefore preface the following remarks by saying that I have not used all the products of the sponsors here, but I have used most of them, for many years, as well as others. Let me also state that there are entirely rational reasons for the course management system developers to go the way they have, or at least that is what they will tell you, along with the news that the next version will fix it.

The writer Paul Fussell[4] has coined the term BAD (that is, “bad” written in capital letters). “Bad”, in his view, is something that everyone would agree is bad, such as dog-doo on the carpet, English boiled vegetables, or George W. Bush. BAD, on the other hand is something that is not only bad, but where the very badness of the product or service is held up as its chief good feature, such as the Oprah Winfrey Book Club, Microsoft Outlook, self-service airline check-in, government educational quality programs, and, come to think of it, George W. Bush.

Current course management and other commercially sold learning management software is not only “bad”, but BAD. BAD, BAD, BAD, BAD, BAD. How bad? Let me count the ways:

  • They only provide support for the first step in the three-step process of transiting from vegetable to mineral memory; that is, all they can do is put paper-based courses online. They try to conceal this badness by stressing that they facilitate learning and acceptance by using a familiar interface, or, rather, a metaphor, with known concepts such as “classrooms”, “group areas”, “documents” and other hogwash. Hobgoblins of little minds, I say.
  • They dumb down the user interface for the course creator (and for the student, I should add) to a degree that is almost unfathomable, and justify this by talking about user-friendliness and how the system can be used by almost anyone with a minimum of training. That is well and good, but it is now 37 years since Simula 67 was invented, Smalltalk gave us the graphically oriented object browser back in the 1970s, and since I manage not one course but about ten, I would very much like to do that in an interface that allows me to behave like a technologically competent and moderately literate adult, rather than forcing me into dealing with an interface designed for an airline check-in machine catering to moronic, alcohol-marinated charter tourists.
  • They are built on a systems architecture that, while using a relational database, does not let the capabilities of the database show through to the user, thus treating every course as a separate entity, with the ensuing duplication of content.
  • They are not built to interface effectively and intuitively with other systems. This is really rather odd, given that the data for the one task that justifies a course management system – namely, identity management, that is, the linking of student to learning material – come not from the course management system itself but from the student administration systems.
  • They arrange everything into courses, not making any allowance for the course creator or creators to have draft areas, repositories of learning modules, conditional access to course material or alternative interfaces to, say, non-students.

This has led me to use the course management system at our school exclusively as a way to limit access to copyrighted digital material, that is, to let the students get at the stuff without having to look it up through the library themselves. If I had a way to automatically let the student administrative system interface to my own web site, I would ditch the course management system completely – there are much better tools available from other places.

What tools? Well, we are slowly developing ways to use the Internet that are not automated versions of the non-digital world. The most interesting current uses are blogs and wikis. Blogs, or weblogs, are personal web sites, usually in the form of reverse-chronological online diaries, built with very capable and easy-to-use software and connected by exchange technologies with names such as TrackBack and RSS, which allow thinkers of all stripes to put their thoughts online, to react to others, and to tell others that they have reacted. The beauty of this is that spammers do not have access, the creator has complete control, and participants can participate. After biting my nails for 12 months, I started my own blog last week – check it out at www.espen.com, and then get an RSS reader, which is a tool for reading many blogs fast, and ask yourself – what more do you really need?
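Part of what makes RSS so attractive is how little machinery it takes: a feed is just a small XML document, and a reader only needs to pull out channel and item elements. As a minimal sketch (the feed content and names below are invented for illustration, not taken from any real blog), the whole idea fits in a few lines of Python using only the standard library:

```python
import xml.etree.ElementTree as ET

# A tiny RSS 2.0 fragment of the kind a blog publishes; titles and
# links here are made up for the example.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Applied Abstractions</title>
    <item><title>First post</title><link>http://example.org/1</link></item>
    <item><title>On wikis</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (channel title, list of (item title, link)) from an RSS 2.0 feed."""
    channel = ET.fromstring(xml_text).find("channel")
    items = [(item.findtext("title"), item.findtext("link"))
             for item in channel.findall("item")]
    return channel.findtext("title"), items

title, items = read_feed(FEED)
print(title)        # the blog's name
print(items[0][0])  # the newest entry, as in a reverse-chronological diary
```

A real reader would fetch each feed over HTTP on a schedule and remember which items you have already seen, but the format itself asks for nothing more than this.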

Wikis are best demonstrated by what I think is the most exciting project in human knowledge accumulation so far, namely the Wikipedia. This is a communally written encyclopedia, at www.wikipedia.org, which currently has 240,000 articles and more visits (as well as, sometimes, better content) than Encyclopedia Britannica. At the heart of the wiki is the notion that articles can very simply be produced and interlinked, and that if you don’t like the text in an article, you can make it better by rewriting it. This sounds like a recipe for total disaster – and there is no shortage of goblins in the system, but they are dealt with in a communal way as well, through a strong work ethic, fanatical attention to the NPOV (or neutral point of view), and a technology that at all times maintains all previous versions of an article. This means that, while you can go in there and destroy reference articles by, for instance, telling everyone that, say, “Trinity College Sucks”, someone will revert your vandalism at a mouseclick. The quality of the Wikipedia is surprisingly high – check it out. Why companies spend fortunes on commercial knowledge management software is really beyond me – they want control, of course, but Wikipedia shows that the best control is, more often than not, no control.
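The mechanism that makes this communal control work is simply that an edit never destroys anything: every version is kept, so restoring an old one is a single operation. A minimal sketch of the idea (this is my own illustrative data structure, not how MediaWiki is actually implemented) might look like this:

```python
class WikiPage:
    """A minimal sketch of the wiki principle: every edit is kept
    forever, so vandalism can be undone with a single operation."""

    def __init__(self, text):
        self.history = [text]  # all versions are retained, oldest first

    @property
    def current(self):
        return self.history[-1]

    def edit(self, new_text):
        self.history.append(new_text)  # an edit adds, never overwrites

    def revert(self, steps=1):
        # Restoring an old version is itself recorded as a new edit,
        # so even a revert can be reverted.
        self.edit(self.history[-1 - steps])

page = WikiPage("Trinity College is a university in Dublin.")
page.edit("Trinity College Sucks")  # the vandal strikes...
page.revert()                       # ...and is undone at a mouseclick
print(page.current)
```

Because destroying content takes effort and restoring it takes none, the economics favor the people who care about the article, which is why "no control" works better than it sounds.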
If I should give any advice for our technological future, it is the following: Take a big swill of self-confidence, adopt the attitude that we are currently under-spending on technology and overspending on systems, and start to collaboratively write both the content, the context, and the underlying infrastructure that you all need. After all, the commercial systems you buy are crap and easily recreated, open source tools are available, and you are not utilizing the computer science and other students you have nearly enough, nor providing an environment that promotes innovation for the truly innovative teacher – what are you waiting for?

The organizational dimension: Providing IT in a university

Now, that's easy for me to say – I no longer have to provide IT for a university. How do you do that, anyway? Hardcore cynic CIOs – is there any other kind? – say that their title really means Career Is Over. When I did some research on Fortune 500 CIOs with Computer Sciences Corporation in the 1990s, I discovered that the average tenure for a CIO was 2.3 years, and 40% of them were fired – most of them for not being able to communicate with top management. (The figures have since gotten better, as far as I know.) There are also big differences between types of companies: In my daytime job of teaching business strategy, I tend to divide companies into value chains (those that produce products), value shops (those that solve problems on behalf of customers), and value networks (companies that mediate interactions, linking customers to each other). What universities are depends on who you ask: To an administrator and an undergraduate student, they are value chains, where students aren’t customers, but products. To a research faculty member and to most graduate and executive students, the university is a value shop, geared to solve knowledge problems in a professional environment. Providing IT in a university becomes a problem partly because of this dichotomy, and will remain so.

Being a CIO in a professional firm is very hard, chiefly because professionals tend to think that nothing outside their profession matters, so you have zero clout unless you have the education and background the company professes. In a hospital, you are nothing unless you are a doctor. In a law firm, non-lawyers are air. Worst of all is being responsible for IT in a consulting company or a university – everyone thinks they know the technology better than you do (frequently, they are right), and the traditional justifications of increased productivity or better information provided by the technology count for little outside the executive suite, and often not there either. What people care about is that the technology works – here, now, free and perfect – in the way they want it to, not realizing or even wanting to realize that they are unique in their behavior.

The librarians, on the other hand, are somewhat better off. First of all, the faculty and students depend on them to a much larger extent than on the IT organization, because they have nowhere else to go. Hmmmm. Scratch that – that used to be true, didn’t it? Perhaps the librarians should use the goodwill they have to increase the use of IT – that way, you can be seen as both a resource for the university and a competitive advantage?

Use of IT in most business schools has traditionally been based on administrative applications. Added to this have been certain academic uses of computers, typically centered around statistical, accounting and operations research applications, often implemented on centralized technology. During the 1990s, use of IT by faculty and students for more general tasks grew, and gradually more computer and communication resources came to be consumed by individual faculty and students – typically for e-mail, word processing and Web-based searching – than by the administrative systems. This use is highly idiosyncratic, and much IT service time has been wasted trying to make it more standardized.

The IT department is normally not dimensioned for 24/7 service and 100% uptime, and the demands from faculty and students are fragmented and not very visible. But technology is like drugs: With familiarity comes use, and with use, dependence. A faculty member trying to edit his or her web page from an airport six or seven time zones away to inform the students that the reading assignment for next week has changed, is taking advantage of technology that allows him or her to compress space and time. This is great. However, he or she is also taking a huge technology risk, for while it may be technically possible to work in this way, the support apparatus is not available at two in the morning in case something goes wrong.
In a professional service firm setting, such as in a consulting company, the faculty member would have an hourly alternative price, the cost in lost time could be calculated, and the support apparatus dimensioned based on the economic benefit derived for the organization.

However, both culture (the reigning picture of faculty is not one of frequent flyer lounges and mobile laptop Internet access) and economics (alternative costs typically accrue to the individual faculty member, not the school) ensure that the support apparatus is dimensioned for “proper” faculty. These are the good folk who physically show up in their offices every morning, leave in the afternoon, and use the desktop computer only for appropriate (as defined by the IT department) purposes.

The outcome is that the technically advanced members of the faculty, who often have outside earning power in the form of externally funded research or consulting engagements, buy their own, idiosyncratic technical infrastructure. They have neither the time nor the inclination to work to upgrade the overall technology levels of the rest of the organization, and have little patience with the non-technically oriented faculty members. These tend to get stuck in their offices with older, standard equipment and little incentive to upgrade their IT use skills or broaden the scope of their IT applications. Gradually, a widening “IT divide” opens up between the pioneers and the laggards: the first unwilling to accept outside management of the self-configured technology they have come to depend on, the second facing an inadequate technological infrastructure and thus unwilling to expose their work to it.

Managing this divide is key to advancing IT use in academic institutions today. It is surprisingly difficult, because unlike business and government organizations, top management cannot force their decisions through, the people with the knowledge don’t have the time, the people with time do not have knowledge, and the “logical” solution requires turning around notions of which resources in the organization are expensive (such as faculty and student time) and which are not (such as computers.)

One way to overcome the stalemate is to take a principles-based approach to technology management. This means that rather than trying to manage technology in terms of specifications of what technology to use where and when, an organization tries to articulate broad yet precise statements – principles – for what the goals of the use of the technology are and how they should be reached. The principles take the form of statements, which serve as guidelines for the users and implementers. Rather than seeking agreement on the tools used, one seeks agreement on the principles.
The quality of the principles is essential – it is very easy to come up with puff pieces, i.e. statements that sound wonderful but have few practical implications and provide little guidance to those they concern. Nor can the principles be too complicated – they must be easy to understand, remember and act on.

Strategy is choice. A classic test of whether you have made a real choice is to negate your principles and see whether they remain meaningful: if turning a statement on its head produces an absurdity, the original represented no real choice. For instance, a statement such as “Use of IT should be pedagogically appropriate” fails the test: its negation (“Use of IT should not be pedagogically appropriate”) is an absurdity, and so the original does not represent a choice between viable alternatives but merely states the obvious. Shannon and Weaver[5] stated this principle as “the information content of a message is the inverse of the probability of receiving it.”
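The Shannon–Weaver point can be made precise with the standard formula for self-information (the usual textbook formulation, not spelled out in the talk): the information content of a message $m$ with probability $P(m)$ is

```latex
I(m) = \log_2 \frac{1}{P(m)} = -\log_2 P(m) \quad \text{bits}
```

A statement that is certain to be agreed with ($P(m) = 1$) carries $I(m) = 0$ bits – which is exactly why a principle nobody could negate, such as “Use of IT should be pedagogically appropriate,” conveys no guidance.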

Five principles for use of IT in an academic setting
Here are the five principles I suggested for my business school. I didn’t get number 4 through, but the others have been more or less adopted, albeit less directly than stated here:

Principle 1: The purpose of using ICT is to make faculty more effective and the student experience better. Corollary: Administrative cost savings is a useful result but a bad objective.

Principle 2: ICT should be the preferred way to communicate and coordinate around teaching. Corollary: Paper should be hard to find and even harder to store.

Principle 3: ICT in a business school should be portable and wireless, just like the people working there. Corollary: If you want a stationary computer, don’t move.

Principle 4: All courses should be deliverable across space and time. Corollary: The cost is in the taping, the money in the reruns.

Principle 5: ICT service levels should be high, measurable and based on faculty and student experience. Corollary: You cannot manage what you do not measure, so start with yourself.

Conclusion

IT is no longer an extrinsic technology in universities, but an everyday presence at work and in the home. The technology may make our work easier, but it also increases the competitive pressure on the organization through raised expectations and new forms of competition. We must respond to this challenge by using the technology in innovative ways, both to do better what universities have always done and to do new things that weren’t possible without it.

In this discussion, I have argued that simple principles with clear practical implications are a way – in my view, the best way – to make academic institutions take advantage of IT. Simple principles and abundant technology will help make the best schools and the best teachers excel – but will also cause anomie among those who do not want to use the new technology, do not see the competitive threats from the outside, or do not see themselves as participants in an organization struggling to grow and come to terms with a changing environment.

Managing an academic organization is difficult because success is contingent on motivating the critical mass of the organization to accept not only the implicit visions, but the explicit and practical goals. The technology challenge is particularly difficult because IT cuts across academic disciplines and administrative functional organization – success is contingent on getting the critical mass of the organization to accept IT as an integral part of their work and as a dimension for innovation in teaching and research. As often as not, simplicity will win out – but simplicity does not sit well with academic institutions, where the premium is on making things problematic and deep before making them simple. Principles will help allay the fears, practical and visible everyday implications will increase the understanding, incentives will further the will to innovate.

Good teaching is good teaching, whatever the medium – and the challenge of taking advantage of IT in a business school lies in making everyone – students, faculty and administration – able and willing to do so. It is much too important a challenge to be left to any single entity within the organization, but also much too important to be dealt with by organic processes.
There are many things I could have said here. I could have talked about the need for relationship management in the IT–organization interface. I could have talked about the need for IT management to stop being the IT service’s representative in top management and instead become the university’s driving and challenging representative to the IT service. And I could particularly have talked about my hobby horse: that the role of IT services and libraries is to provide a platform for faculty and student innovation rather than to automate administrative processes. But you get my drift.
So, let me sum up: The role of the university will change. There is a lot of cool technology out there. Get some principles, make them real, and success will follow. Don’t ask for permission.

Just do it.

Thank you very much.

References
[1] Eco, U. (2003). Vegetable and mineral memory: The future of books. Al-Ahram Weekly Online. Cairo. On the Internet at http://weekly.ahram.org.eg/2003/665/bo3.htm.
[2] Aarseth, E. J. (1997). Cybertext: Perspectives on Ergodic Literature. Baltimore, MD, The Johns Hopkins Press.
[3] Yates, J. (1989). Control through Communication: The Rise of System in American Management. Baltimore, Johns Hopkins University Press.
[4] Fussell, P. (1991). BAD, or, The Dumbing of America. New York, Summit Books.
[5] Shannon, C. E. and W. Weaver (1949). The Mathematical Theory of Communication. Urbana, IL, University of Illinois Press.
