Monday, June 28, 2010

Accreditation and Assessment: The Corregidor Position

Using the dominance of degree granting power to protect a University's future is like defending Corregidor. At best, it's a holding action. At worst, it's a deadly distraction.

Corregidor, as viewers of The History Channel will know, was the Gibraltar of the Orient, protecting Manila Bay, the best natural harbour west of San Francisco, and with it, the Philippines and America's empire on the Pacific rim. A magnificent fortress, it was thought able to withstand siege for six months, plenty of time for the fleet from Pearl Harbour to relieve it at leisure. Within a few minutes at Pearl Harbour, with that fleet in ruins, Corregidor's fate was sealed. It indeed fell six months later, after a long and bloody battle which tied down substantial Japanese forces. Few of the defenders lived to see VJ Day.

The degree granting power of Universities is indeed a fine fortress. Degrees hold vast power over the imagination of the middle classes, a magic scroll, held like a wand by a robed graduate, to open the door of success and respectability. They provide employers a handy shortcut to assess the diligence and knowledge of potential hires. They are difficult to earn, thus filtering out students without the brains, resources and good fortune. A degree is a valuable thing.

It's a good business to be in, making graduates. If you are an academic and don't think education is a business, the next Pegasus back to Fairyland leaves at four. It might not be a for-profit business, but students, or taxpayers on their behalf, spend a heap of good money to get graduates. It's a transaction. That makes it a business. If you don't think it should be a business, do it for free.

There are three big barriers to entry for a potential competitor who might try to get into the graduate-making line.

First, the brand. The University of So and So has been around for a long time, and will have built up a certain reputation among employers and potential students. It takes a long time to build that up. If I decided to open a university in the morning (The University of Rob), however good it was, it would still take me years, and a vast expenditure on marketing and public relations propaganda, to build that brand. Potential students are often steered by parents and career guidance teachers who'll be a clear quarter century out of touch. Employers are more fickle, but still lean on degrees from places they've heard of to filter the candidates into the slush pile. The brand (or reputation, if you don't like commercial language) is the most valuable asset a University has.

Second, the infrastructure. The conventional degree setup is expensive to run. Lecture halls must be maintained, academics fed and watered, quads mowed. Students spend four years knocking about the campus jumping through one hoop or another, and that costs money. So long as people expect a degree to look like four years of physical time on campus, it's going to be expensive to do that. If I wanted to open a university in the morning, I'd have to buy some expensive real estate, do a lot of building work and hire a great many academics. I can't just put up lots of content on YouTube like the Khan Academy and expect people to take me seriously, no matter how good my material is.

Finally, there is the state sponsored monopoly. The state reserves the power to decide who can and cannot hand out degrees, or even who can call themselves a University. The University of Rob would soon find its letterbox full of troubling legal correspondence.

This is the weakest link. One keen populist or neo-liberal politician, one piece of legislation, and it's gone. If I ran the circus (after the Revolution, you know) I would sweep this away with the stroke of a pen, just to see what would happen.

What would happen, exactly?

First up, a whole bunch of private providers would ride over the hills. Most of them would be cowboys. The natural reaction would be to put some kind of bureaucracy in place to regulate exactly what size chunk of cognitive transformation a Bachelor's or Master's degree is (how would we measure cognitive transformation, exactly?). Adding a quality assurance system is tempting, but why not just let people get their learning in modules (or smaller blocks) and have graduates and employers rate how useful each unit was? Things like TripAdvisor work fine for a weekend in a hotel, which is about as long and expensive as a single module might be. Cowboys with turkey courses would soon find themselves on the receiving end of poor reviews and hard pressed to win further business. Existing institutions may find some surprises, good and bad, in the cold light of day. Students could go as they please, learning what they need, wherever it is taught best, when they need it. Employers might learn they don't care so much about the broad education; they want people who have done this or that unit. Newspaper commentators may feel it undermines society. Society is, of course, not prevented from subsidising the process, just as it does today.

Universities shouldn't rely on their degree granting power to survive, any more than the defenders of Corregidor could rely on the Pacific Fleet. A state sponsored monopoly is no secure long term foundation for any enterprise, especially in a time when other changes have the potential to strongly erode the incumbents' position. Alas, the Tertiary Sector will have no Pearl Harbour. The Edupunks shall not fly over the hill one day yelling Banzai, to wake Universities from their complacency. The collapse will come slowly, as the first two planks of their value wear away, decade after decade, until they rest more and more on the monopoly of degree granting power. That too, will in time become irrelevant.

Corregidor was eventually retaken, of course, in a war that ended using tactics and technologies unimagined when it was constructed. It was not rebuilt.

Photo: Ruins at Corregidor, by Jepster via Flickr.

Friday, June 25, 2010

Don't do Strategic planning on tactical timescales

Most Universities conduct strategic plans with a five year time horizon. This is too short. Most degrees take three or four years (allowing for repeats, long medical degrees, interminable PhDs etc.), so five years is only a little longer than one 'product cycle'. A car takes, perhaps, a few weeks at most to build, including components. Could you imagine Ford or Toyota having a three month strategic planning horizon? No. It would be ridiculous. You might argue that the analogy is false, that what Universities produce is much less tangible, and more embedded in society, than a mere car. Correct. So our planning needs to be even longer term.

As I have argued in previous posts, the time steps in education are long. A University reputation takes decades to build or destroy - witness the long march of Ireland's newer universities, DCU and UL, to credibility, and the mixed outcomes of the UK's former polytechnics. A good research department takes years to mature, building the capacity and credibility to attract lead researchers and fat grants. Technological shifts are, in practice, marginal over five years. Public policy fashions take that long to go from the catwalks of the OECD to the statute books of the Dáil.

Strategic planning is about vision: where you want to be in the long term. That means hard thinking about the long term, and where you want to be, not just making a to-do list for staying put, wrapped in the pseudo-corporate language of the day.

Friday, June 11, 2010

The Tragedy of the Commons and The Last Consumer

As I warned you in the last post, I'm still a little off piste and at the edge of scope thinking about the economic context universities will operate in as the century wears on. Bear with me, I'll stop soon!

My eldest daughter, previously mentioned, wishes to be a Mermaid Musketeer when she grows up. "Wouldn't it be nicer to be a Vet?" I think, but I don't say it. I remember how many of today's jobs were (and remain) inconceivable to my father's generation. Maybe Mermaid Musketeers will be in high demand in the 2020s. What do I know?

The conventional narrative of technological development has been that with each successive leap forward, some gadget or other removes another piece of drudgery from the Toils of Mankind. The newly unemployed riot a little, and then find more fulfilling careers as Advertising Executives, Psychoanalysts and Personal Trainers. Since the plough and irrigation gave us the first agricultural surpluses and allowed priestly and bureaucratic castes to emerge, it's been one of the key narratives of history. Thus, we assert, it will always be so, just as the autumn turkey is confident of a good winter's food and a fine spring to come. It ain't necessarily so.

Come with me, if you will, to the supermarket. Tesco, Sainsburys, Walmart, wherever. They sit in a key place in our world, bringing stuff we need from the four corners of the world into one convenient place, beyond the dreams of any dead King. All strive, rightly, to do so as cheaply and efficiently as possible, cutting costs where they can so they can remain profitable, and competitive on price with the other supermarket down the road. Nothing wrong with that.

A year or two ago, the automated tills were a novelty. People were reluctant to use them, but they have become accepted. They seem slower than the human till, but for a small basket on a busy day, great. I'm sure it means the supermarket can cut the number of staff at peak times, with one staffer monitoring six or eight autotills. Of course, now that RFIDs are dropping in price, pretty soon we'll just have our trolleys autoscanned on the way out; we'll swipe our payment card to exit and be off in moments. Much faster, and it'll be a no brainer compared to waiting in a queue. They can cut most of the till staff. It looks like a horrible job, good riddance.

Meanwhile, back in the storeroom, we'll start seeing more and more machines helping out. It's a lot cheaper to run storerooms with robots, and companies are starting to put in place systems that are faster and cheaper to run. Stacking lemons is a bit more complex. It's taken a long time for robots to be able to do that kind of work, but if a robot can fold towels, how far away can a commercial shelf stacker be? A long, long time ago, when I was doing my PhD, I paid part of my way stacking shelves for Coca Cola. Great workout. No brainpower required.

So as the century wears on, smart supermarket operators will put in those systems. Driven by sales data from the till systems, warehouse robots will load and unload the trucks (no more tricky health and safety issues in the warehouse - no humans allowed) and specialist packer robots will keep the shelves stocked, working mainly at night to minimise human interaction. You could, conceivably, have a complete supermarket shop without dealing with or seeing one human. A nice Augmented Reality system with voice recognition can show you where the cheese is, no shuffling about looking for staff.

There will probably still be a couple of staff though. So many laws assume a shop will have a shopkeeper, it will be hard to avoid having a bored looking manager or greeter around. Technicians may come and go to fix the odd thing, but in time a good R2 unit could replace them. The trucks will still legally require drivers, but as time goes on they will be more closely monitored by expert systems and central controls so they have little or no autonomy. Industry will lobby for UAV trucks to be allowed between, say, three and six am. The accident figures will make their case compelling, eventually.

Of course, in the meantime, most of your food supply will just arrive, a shopping list mediated between the expert systems in your supermarket, your fridge and pantry, and the health assist system your health insurer mandates (no more ice cream!), with a final approving nod from your bank that the delivery fits within the budget you approved. The milk just appears in the fridge, unpacked by your housebot. No more late night runs to the cornershop for milk. Indeed, no more cornershop, as the few that survived the death of the newspaper close up.

In this future, who actually works in the supermarket? We have a few drivers and perhaps a half dozen staff per megastore, enough to cover 24/7 opening, annual leave and so on with always one person in store. We might imagine a teeming head office, but as AIs and expert systems improve, we need fewer and fewer people there. Tasks are hived off to expert systems or outsourced to some up and coming service provider where brains are cheap. Productivity per worker, as measured, becomes immense. There just aren't that many workers anymore.

The supermarket story sounds trivial as presented, but you can, with a little imagination, infer a similar story in many industries. A large proportion of our jobs are semi skilled, and do not really demand much brainpower. All the unskilled and semiskilled people who, even in the first world, make the service sector hum are going to be in trouble.

Now, your local supermarket is still making money. It's still paying people, just many fewer, more highly skilled people, and of course larger dividends to the owners. Who, exactly, is shopping in this supermarket, and with what? All those unemployed people? Henry Ford is alleged to have paid his workers over the odds, as he felt anyone working for him should be able to afford the cars they were making. What's happening here is a parody of that. With each reduction in workforce, there are fewer and fewer consumers who can actually afford to buy very much. It's like the Tragedy of the Commons. In that classic economic fable, it pays each farmer to graze the commons as heavily as possible, even though, in the long run, it will destroy the grazing and ruin them all. Increasing automation to raise productivity and cut costs is a sensible, responsible decision for any business. Each time it happens, it shrinks the pool of gainfully employed consumers, until there are none left. So who's left with money to shop? Only a handful of highly paid core staff, and the shareholders, mainly pension plans for people who'll never be able to afford to retire.
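The feedback loop above can be sketched as a toy simulation (the numbers are purely illustrative, not a forecast): each round of automation is individually rational, yet collectively it drains the pool of wage-earning consumers.

```python
def simulate(rounds=50, workers=1000, wage=1.0, automation_rate=0.10):
    """Toy model: each round, firms replace a fraction of workers with
    machines. Aggregate consumer demand is crudely proxied by total wages."""
    demand = []
    for _ in range(rounds):
        demand.append(workers * wage)                   # what the workforce can spend
        workers = int(workers * (1 - automation_rate))  # another round of cuts
    return demand

demand = simulate()
# Each cut is sensible for the firm; the sum of the cuts empties the tills.
print(f"demand at start: {demand[0]:.0f}, after 50 rounds: {demand[-1]:.0f}")
```

Nothing in the sketch captures displaced workers finding new professions, which is exactly the escape hatch the next paragraph questions.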

Historically, of course, the displaced labour has migrated to newer and more interesting professions, but as the machines get smarter and smarter, the pool of professions that only humans can do gets smaller and smaller. I've already blogged about Emily Howell, the virtual composer, and other examples of Artificial Intelligences tackling problems long thought to be human-only. It's also worth noting that the set of problems faced by a business are not all best solved by a brain designed for staying alive on the savannah. Intelligences not as smart as us, but different, might do just fine. Think of chess as an example. Or sorting post, or telephone switchboard operators. The machines may even do better, since they lack some of the human brain's many, many cognitive bugs. They don't have to be as smart as us, they just have to be smart enough. And besides, who says we're that smart?

Science Fiction writers readily paint pictures of utopian post scarcity societies, where humans live in abundance. Roddenberry's Federation is the classic example, or more recently Iain M. Banks' Culture novels. The unanswered question is how we get there from here. The technological path is clear, tractable, and generally plausible. There is, however, no guarantee that our economic model will be able to adapt to it. Changing economic models is a somewhat risky operation.

Human history has, of late, been an extraordinarily positive narrative. While the History Channel drones on about the great wars of the 20th century, we as humans live in unprecedented numbers and affluence. Famine, poverty and war, once the global norm, are seen as failures, problems to be contained and solved, not accepted. Much of this prosperity comes from technological change. But there is no guarantee that this will continue. It's conceivable that our economic model, structured around rationing and scarcity, might bring us to some kind of dead end. Increasingly homogeneous government models, where each country operates in much the same way following agreed international norms, limit the capacity for different countries to respond in different ways and for new approaches to evolve.

I'm not advocating a halt to technological development. That would be impossible, and unwise. We still need to move fast forward to bring the levels of comfort we have largely reached in the first world to all, and solve some of the problems we've created along the way. But we need to be agile and pragmatic about how our societies are organised, and start keeping a good close eye on numbers like the Gini coefficient, so that things don't get ugly. We need to be open to other ways of doing business, and mindful of how we can keep our economic models flexible and adaptable. I'm not preaching anarchism or socialism. I suspect the exact 'ism we will need hasn't quite been invented yet.

As for what it means for Universities, it's hard to tell. In the long run (and I'm thinking a century out here, at least), I think there will be a big shift away from the professional/vocational training we see a lot of now, where the focus is often on getting a job at the other end. In a world where there is no job at the other end, or at least nothing you or I would think of as a job (is blogging a real job?), what people do in Universities might look a lot more like recreational activity to us today.

That seems like a big leap, but look at our world through the eyes of an early graduate of Bologna or Oxford. Our Universities might look pretty easy to them. No memorisation, no hand copying books. And the jobs out the other end? I don't know how many hours a scribe to Emperor Barbarossa worked, but I suspect they worked harder and longer than a 21st century middle management white collar type.

We're a little further along the road than we might think.

The Three Economies of Plenty

First, some apologies, on two counts. I'm going to swing to the edge of scope for a post or two, as I'm thinking a little about the economic context in which Universities will operate as the century wears on. I want to capture some thoughts which sit at the edge of scope, both in terms of topic, as they don't address Universities specifically, and in terms of time, looking ahead towards the century's end. Apologies are also due because this topic is at the edge of my expertise; I'm no economist, a point which will no doubt become painfully evident presently.

The 21st century will be a century of three economies, material, informational and experiential, or, for short, stuff, ideas and fun. Right now the three economies are entangled, confused and confounded. That will change over time.

The material economy is the most familiar. You buy stuff, you make stuff, you sell stuff. For a long time, it didn't go anywhere much, as the supply and variety of stuff was limited - mainly potatoes in Ireland, it seems. Basic economic constructs like supply and demand curves come from this economy. It started to get interesting a few hundred years ago when industrialisation greatly increased the volume and range of goods available. Supply up, cost down, demand up, world economy go go go!

This kind of economy will approach, but not hit, the bumpers over the next century. Environmentalists tell us finite resources and raw material supply put physical limits on the world's capacity to make stuff, and that we must all make do with less. Perhaps. More likely, in my view, is that we will hit the limits of what we can consume. There are only so many cars, phones and shoes we can actually own. Even in my lifetime, attitudes to material goods have shifted. A house heavy with possessions is an anchor, not an asset. When everything is available, 24/7, there is no need to accumulate your own personal warehouse - you can buy what you need, when you need it. We may continue to buy more expensive objects as status symbols (the Mercedes instead of the Skoda) but the amount of physical goods involved, and the relative functionality of those goods, won't change much. To put it another way, there is only so much cake we can eat. It might be very good cake, hand baked in Switzerland by the latest celebrity chef and flown in by SST, but it's still cake. In some cases, the real status is to have less; drowning in possessions is unfashionable. Who wants a GPS, an MP3 player, a portable HD handycam and a phone nowadays when you can have them all in one slim device?

This economy does have a fair bit left to run - the sons and daughters of Chad have a long walk in the dust, generations, until they reach the point where that third hovercar is just an encumbrance, but their grandchildren will get there.

The second economy is the information economy - books, music, movies and media. For a long time, people thought this was just an annex of the physical economy. From the first Bible at 30 florins to the last DVD Series Box Set at €9.99 in the bargain bin, people thought they were selling physical objects, when they were really selling the information encoded on them. By creating a finite number of copies, you could create an artificially limited supply and slip into the working patterns of the material economy without trouble.

This economy, as you may have noticed, is in trouble. Napster smashed the illusion of scarcity. Now we all understand that the marginal cost of a piece of information is zero. In the age of the eBook, no bestseller can sell out. Because humans have a herd instinct, and like to have something to talk to each other about, there are still hit singles, blockbusters and bestsellers which are valued enough that they can conceivably charge for access - some TV stations make a tidy sum charging people to view soaps online - a day early. Other services charge for convenience - it's easier to pay 99c for a song on iTunes than to hunt for a dubious download. Undercutting the whole process is open content, open to all, distributed at no cost. You may want to be paid for your column in the newspaper, but ten others behind you will blog the topic purely for glory. Your book may be insightful and comprehensive, but I'll get the gist of the topic on Wikipedia first.

This economy will sort itself out into a working model over the next decade or two, and then hit the buffers of demand. Just like physical goods, there is a limit to what humans can consume. We can only read, watch, and listen to so much in a day. Time is finite. It doesn't matter how compelling your new album is, I'm all compelled out. I don't have time to watch TV, but I keep a list, I call it the Dribble List, of stuff I'd like to watch sometime. When I get to a stage in life when all I can do is dribble, and hit the pause button so I can make a rude suggestion to the RoboNurses, I'll catch up. Maybe.

The third economy is the experience economy. It's the holiday, the restaurant meal, the night at the theatre. It's not like the information economy: for every person having an experience there is a real, often very high, marginal cost. Supply is somewhat elastic - new restaurants sprout remarkably quickly when the economy improves. Except sometimes it isn't - only a handful of people can climb Everest each year, there are only so many tickets for the Met, and so many unspoilt beaches. Unmet demand is enormous; as we have more and more free time, we increasingly want to do something more compelling with it than watch Big Brother, if we have the money. People in the second economy are smartly moving into the third, if they weren't there already. I'm going to a Suzanne Vega concert tomorrow. I spent more on two tickets than I would to buy her entire back catalogue, and she'll get a bigger cut out of it. Musicians will make more from concert tours, authors from public speaking engagements, and TV stars from stage shows and tours.

The interesting thing about these economies is that they all run on a system designed to manage scarcity in the first, physical economy. If there is only so much stuff to go around, then it makes sense to invent money as a measure of need, and give the stuff to the person who will give you the most. The ideas of supply and demand, fundamental to economic thought and theory, come from this economy of stuff. The rules make no sense in an information economy, where the marginal cost drops to zero. Similarly, in a world without scarcity, these rules make less sense. We have to create artificial scarcity, in overpriced, designer limited edition batches, to keep prices up. Despite the best efforts of marketing gurus, everything can be had, in quality far better than our parents had, in the bargain warehouse, at the China price.

Healthcare is an interesting example of an experience economy which breaks our economic models. We have trouble, globally, in finding models for funding healthcare that work because our economic models simply don't work when supply is finite and at high marginal cost, but customers have no choice but to get it. You can buy rice instead of wheat, but dialysis is dialysis.

Education, particularly tertiary education, would see itself firmly in the experience economy. Tertiary education changes your brain, your heart, and often your liver. It's a real experience. Let's not forget, though, that many Universities still have one large boot in the information economy. Libraries, lectures, course programmes and journals were part of the package of information you bought access to with your fees. Universities who spend a lot of effort on that need to think again. You can't make a buck on something that becomes a free or bulk commodity, unless you have superstar lecturers, the Simon Schamas and Niall Fergusons who will by virtue of their status attract keen students and make you more marketable.

The smart move is to give the information away for free and focus on the experience. Anything else is swimming against the tide. 'Destroy your business' wrote Jack Welch, former celebrity ego CEO of General Electric. He was wrong about a lot of things (who isn't?), but right about that. What he meant was: think what a disruptive competitor could do that would put you out of business. Do it. Do it to them before they do it to you. MIT understood this when they launched their open courseware initiative. Universities who put their very best high value content up on YouTube and iTunes U for free understand this. It doesn't matter if it isn't sustainable in the long term. In the long term, as Keynes put it, we're all dead. Another 15 years and the sector will be so unrecognisably scrambled that everything will be different anyway. The 21st century is like being trapped in a burning building. You might not know where to go, but you'd better drop and crawl as fast as you can, 'cos staying put isn't going to keep you alive for much longer.

Monday, June 7, 2010

Black Swans and The Fifth Megatrend.

I've just finished reading Nassim Taleb's book 'The Black Swan'. It's essential reading if you are thinking about forecasting, and a lucid and entertaining read as well.

The core idea is that many processes thought to be governed by Gaussian distributions are really governed by Power Law distributions. For example, over a short term, much financial data looks like it clusters around an average value, in the same way as height or weight does. In fact it follows a Power Law, where larger events are increasingly rare, but overpoweringly large. Over a short or selective term, a set of data like, say, the size and frequency of bank collapses, drilling accidents or airline disruptions looks like a simple normal distribution, with the probability of severe events tiny and easily quantified. A day later, events make fools of us all.
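Taleb's contrast can be seen in a few lines of code. This is a sketch with arbitrary parameters (a Gaussian with mean 100, and a Pareto distribution with shape 1.1 standing in for a generic power law): in the Gaussian sample no single observation matters much, while in the heavy-tailed sample one observation can dominate the whole total.

```python
import random

random.seed(42)
n = 100_000

# Height-like data: clustered around a mean, tails die off fast.
gaussian = [random.gauss(100, 15) for _ in range(n)]
# Heavy-tailed data: most values small, rare values enormous.
pareto = [random.paretovariate(1.1) for _ in range(n)]

gaussian_share = max(gaussian) / sum(gaussian)
pareto_share = max(pareto) / sum(pareto)
print(f"Gaussian: largest observation is {gaussian_share:.4%} of the total")
print(f"Pareto:   largest observation is {pareto_share:.4%} of the total")
```

In the Gaussian world the biggest bank collapse barely moves the aggregate; in the Power Law world, one collapse is most of the story.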

This idea bears on recent posts where I set out four key trends which I thought would drive the story of tertiary education through the 21st century: Demographics, Economics, Telepresence and Artificial Intelligence. In each of these trends I somewhat boldly extend a current trend in a broad swish across the century ahead and consider the effects. As Taleb would argue, I am like the Turkey extrapolating his ever-increasing meals and weight gain to dream of the mighty bird he will be, come spring - a simple inductive fallacy.

It's worth taking the time to consider the rationale behind each of those trends, and how much inertia they have. Are they indeed driven by fairly dull, Gaussian processes, where, for example, economic growth generally flits around two or three percent a year, with the occasional wobble up or down a few percent, or are great surges over and back possible - Black Swans, in Taleb's language?

Population growth seems to be at fairly low risk of a major shift. It's a consequence of many factors: how many children people want to have, how many they can have, and how long they live. The biological factors are well bounded, at the bottom (you can't have negative children) and at the top (it's really unlikely you'll have 14 children, or live to 200). The variables are driven by things with a lot of inertia - how many children you would like is keyed to economic factors and social expectations which change slowly. Infant mortality is keyed to economic wealth, and how long you live is usually driven by economics too, through the sum of your lifestyle and health from womb to tomb. So population forecasts have a lot of mass behind them. Given that the uncertainties coming from basic factors like birth rates already amount to over a billion people either way, it would take a particularly vigorous pandemic, or an enthusiastically conducted nuclear war, to make much of a difference.

Economic growth is the most interesting one. Predicting a 10-fold increase in wealth feels bold in the current climate, but that's just 2.5% a year. Anything under 1% is declared a disaster by the media. Many developing economies scarcely felt the 'Global Financial Crisis' of 2009 and are still growing at rates well in excess of 2.5%. A lot of this growth is probably technologically driven, but given that a very sizeable minority of the world's population have yet to benefit from the inventions of the 20th century, even if we invent nothing of note from now on, there is plenty of scope. There are limits to economic growth, particularly in the physical 'economy of stuff'. There is only so much physical stuff we can own or consume, and so much physical stuff we can make. The sustainable economics of information, with a marginal cost of zero, is still a subject of debate. It is bold, but not implausible, to suppose that whatever the overall level of growth in the 21st century, the regional inequalities that arose out of the industrial and colonial age will largely even out.
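The compounding arithmetic behind this is easy to check: 2.5% a year sustained for a century actually multiplies wealth closer to twelve-fold, so a 10-fold prediction needs only about 2.3% a year.

```python
# Sanity-checking the compounding claim: modest annual growth,
# sustained over a century, produces a roughly ten-fold increase.
growth_rate = 0.025
years = 100
multiple = (1 + growth_rate) ** years
print(f"{growth_rate:.1%} a year for {years} years gives a {multiple:.1f}x increase")  # ~11.8x
```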

Good quality Telepresence seems like a no brainer - an engineering problem, with an obvious economic payoff for a solution. I can't see how it could fail to come to pass, all else being equal, although advances in the affordability and speed of physical travel might delay its adoption. Any kind of adverse system shock, like a high mortality pandemic or war, would hit transport networks harder and promote telepresence.

Artificial Intelligence is probably the rockiest potential megatrend. Others might not even consider it - it will arrive as a Black Swan to many. Moore's law is a classic inductive case. The long established historical trend doesn't prove it will continue for one more day, any more than the Turkey's feeding schedule does. That said, some of the world's biggest R&D budgets are feeding this turkey and making ever more powerful chips. It's also worth noting that the law refers to the cost of a unit of processing power, not pure component density. Even if you hit a physical wall on component density, as is frequently prophesied, there is still plenty of mileage in making chips cheaper, and in better utilisation making the power cheaper at the desktop. Think of the difference between a chip in a desktop PC, spending most of its life running screensavers and Freecell, and the same chip in a server farm, running at optimum 24/7/365. Granted, it might be running Farmville, but you get the idea. If anything, the risk for Moore's law is that something like Quantum Computing will come out of leftfield and rapidly accelerate the curve.
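The cost-per-computation framing can be made concrete with a back-of-envelope function (the two-year doubling period is the usual rule-of-thumb assumption, not a law of nature, which is exactly the turkey's problem):

```python
def compute_per_dollar(years, doubling_period=2.0):
    """Relative computing power a fixed budget buys after `years`,
    if cost per unit of processing halves every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# If the trend holds, a dollar buys 32x the computing after a decade,
# and over 1,000x after two.
for y in (0, 10, 20, 30):
    print(f"after {y:2d} years: {compute_per_dollar(y):>10,.0f}x")
```

The same function also shows why induction is dangerous: nothing in it tells you whether the doubling period itself will hold next year.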

Raw processing power doesn't necessarily mean human-like artificial intelligence, but it will chip away at it, until the differences seem pedantic. When the machines can answer the phone, read your x-ray, drive your car and write your essay for you, who cares whether it's 'real' or 'fake' AI?

Overall, Technology seems to be the most risky quarter, with the potential to pull out Black Swans like rabbits out of a hat. It seems likely that they would be 'positive' Black Swans - developments in unexpected places - rather than negative Black Swans, when expected things fail to happen. That said, a negative Black Swan, like perhaps an elegantly tailored bioweapon, cannot be dismissed.

Of course, Taleb would no doubt argue that focusing on these four trends is missing the point. The key trend, the main event that will shape tertiary education in the next century, isn't any of those - it's something we haven't - can't - imagine yet, some surprise out of left field that will change the narrative, for better or for worse: the Fifth Megatrend, the Black Swan.