Landing : Athabascau University

Universities can’t solve our skills gap problem, because they caused it | TechCrunch

http://techcrunch.com/2016/05/08/universities-cant-solve-our-skills-gap-problem-because-they-caused-it/

Why this article is wrong

This article is based on a flawed initial premise: that universities are there to provide skills for the marketplace. From that perspective, as the writer, Jonathan Munk, suggests, there are gaps both between what universities generally support and what employers generally need, and between students' and employers' perceptions of the skills graduates actually possess. If we assume that the purpose of universities is to churn out market-ready workers with employer-friendly skills, they are indeed singularly failing and will likely continue to do so. As Munk rightly notes:

"... universities have no incentive to change; the reward system for professors incentivizes research over students’ career success, and the hundreds of years of institutional tradition will likely inhibit any chance of change. By expecting higher education to take on closing the skills gap, we’re asking an old, comfortable dog to do new tricks. It will not happen."

Actually, quite a lot of us, and even quite a few governments (the USA notwithstanding), are pretty keen on the teaching side of things, but Munk's analysis is substantially correct and, in principle, I'm quite comfortable with that. There are far better, cheaper and faster ways to get most marketable job skills than to follow a university program, and providing such skills is not why we exist.

This is not to say that we should not do such things. For pedagogical and pragmatic reasons, I am keen to make it possible for students to gain useful workplace skills from my courses, but it has little to do with the job market. It's mainly because it makes the job of teaching easier, leads to more motivated students, and keeps me on my toes, having to stay in touch with the industry in my particular subject area. Without that, I would not have the enthusiasm needed to build or sustain a learning community, I would be seen as uninterested in the subject, and what I'd teach would be perceived as less relevant, and would thus be less motivating. That's also why, in principle, combining teaching and research is a great idea, especially in strongly non-vocational subjects that don't actually have a marketplace. But, if it made more sense to teach computing with a 50-year-old language and a machine that should be in a museum, I would do so at the drop of a hat.

It matters far more to me that students develop the intellectual tools to be effective lifelong learners, develop values and patterns of thinking that are commensurate with both a healthy society and personal happiness, become part of a network of learners in the area, engage with the community/network of practice, and see bigger pictures beyond the current shiny things that attract attention like moths to a flame. This focus on being, rather than on specific skills, is good for the student, I hope, but it is mainly good for everyone. Our customer is neither the student nor the employer: it's our society.
If we do our jobs right then we both stabilize and destabilize societies, feeding them with people who are equipped to think, to create, to participate, reflectively, critically, and ethically: to make a difference. We also help to feed societies with ideas, theories, models and even the occasional artefact that make life better and richer for all, though, to be honest, I'm not sure we do so in the most cost-effective ways. However, we do provide an open space with the freedom to explore things that have no obvious economic value, without the constraints or agendas of the commercial world, nor those of dangerously partisan or ill-informed philanthropists (Zuckerberg, Gates - I'm thinking of you). We are a social good. At least, that's the plan - most of us don't quite live up to our own high expectations. But we do try. The article acknowledges this role:

"Colleges and universities in the U.S. were established to provide rich experiences and knowledge to their students to help them contribute to society and improve their social standing."

Politely ignoring the US-centricity of this claim and its mild inaccuracy, I'd go a bit further: in the olden days, it was also about weeding out the lower achievers and/or, in many countries (the US again being a notable offender), those too poor to get in. Universities were (and most, AU being a noble and rare exception, still are) a filter that makes the job of recruiters easier, separating the wheat from the chaff before recruiters even get to them, and then again when we give out the credits: that's the employment advantage. It's very seldom (directly) because of our teaching. We're just big, expensive sieves, from that perspective. However, the article goes on to say:

"But in the 1930s, with millions out of work, the perceived role of the university shifted away from cultural perspective to developing specific trades. Over time, going to college began to represent improved career prospects. That perception persists today. A survey from 2015 found the top three reasons people chose to go to college were:

  • improved employment opportunities
  • make more money
  • get a good job"

I'm glad that Munk correctly uses the term 'perception', because this is not a good reason to go to a university. The good job is a side-effect, not the purpose, and it is becoming less important with each passing year. This is partly due to market saturation and degree inflation, and partly due to better alternatives becoming more widespread, especially thanks to the Internet. One of the ugliest narratives of modern times is that the student should pay for their education because they will earn more money as a result. Utter nonsense. They will earn more money because they would have earned more money anyway, even if universities had never existed. The whole point of that filtering is that it tends to favour those who are smarter and thus more likely to earn more. In fact, were it not for the use of university qualifications as a pre-filter that would exclude them from a (large but dwindling) number of jobs, they would have earned far more money by going straight into the workforce.

I should observe in passing that open universities like AU are not entirely immune from this role. Though they do little filtering for ability on entry, AU and other open universities nonetheless act as filters, inasmuch as those who are self-motivated enough to handle the rigours of a distance-taught university program while otherwise engaged, usually while working, are far better candidates for most jobs than those who simply went to a university because that was the natural next step. A very high proportion of our students who make it to the end do so with flying colours, because those who survive are incredibly good survivors. I've seen the quality of work that comes out of this place and been able to compare it with that from the best of traditional universities: our students win hands down, almost every time.

The only time I have seen anything like as good was in Delhi, where 30 students were selected for a program each year from over 3,000 fully qualified applicants (i.e. those with top grades from their schools). This despite, or perhaps because of, the fact that computing students had to sit an entrance exam that, bizarrely and along with other irrelevances, required them to know about Brownian motion in gases. I have yet to come across a single computing role where such knowledge was needed. Interestingly, they were not required to know about poetry, art, or music, though I have certainly come across computing roles where an appreciation of such things would have been of far greater value.

Why this article is right

If it were just about job-ready skills like, in computing, the latest frameworks, languages and systems, the lack of job-readiness would not bother me in the slightest. However, as the article goes on to say, it is not just the 'technical' (in the loosest sense) skills that are the problem. The article mentions, as key employer concerns, critical thinking, creativity, and oral and written communication skills. These are things that we should very much be supporting and helping students to develop, however we perceive our other roles. In fact, though the communication stuff is mainly a technical skillset, creativity and problem-solving are pretty much what it is all about, so, if students lack these things, we are failing even by our own esoteric criteria.

I do see a tension here, and a systematic error in our teaching. A goodly part of it is down to a misplaced belief that we are teaching stuff, rather than teaching a way of being. A lot of courses focus on a set of teacher-specified outcomes, and on accreditation of those set outcomes, and treat the student as (at best) input for processing or (at worst) a customer for a certificate. When the process is turned into a mechanism for outputting people with certificates, with fixed outcomes and criteria, the process itself loses all value. 'We become what we behold' as McLuhan put it: if that's how we see it, that's how it will be. This is a vicious circle.

Any mechanism that churns students out faster or more efficiently will do. In fact, a lot of discussion and design in our universities is around doing exactly that. For example, the latest trend in personalization (a field, incidentally, that has been around for decades) is largely based on that premise: there is stuff to learn, and personalization will help you to learn it faster, better and cheaper than before. As a useful by-product, it might keep you on target (our target, not yours). But one thing it will mostly not do is support the development of critical thinking, nor will it support the diversity, freedom and interconnection needed for creative thinking. Furthermore, it is mostly anything but social, so it also reduces capacity to develop those valuable social communication skills. This is not true of all attempts at personalization, but it is true of a lot of them, especially those with most traction.

The massive prevalence of cheating is directly attributable to the same incorrect perception: if cheating is the shortest path to the goal (especially if accompanied by a usually-unwarranted confidence in avoiding detection) then of course quite a few people will take it. The trouble is, it's the wrong goal. Education is a game that is won through playing it well, not through scoring.

The 'stuff' has only ever been raw material, a medium and context for the really important ways of being, doing and thinking that universities are mostly about. When the stuff becomes the purpose, the purpose is lost. So, universities are trying and, inevitably, failing to be what employers want, and in the process failing to do what they are actually designed to do in the first place. It strikes me that everyone would be happier if we just tried to get back to doing what we do best. Teaching should be personal, not personalized. Skills should be a path to growth, not to employment. Remembered facts should be the material, not the product. Community should be a reason for teaching, not a means by which it occurs. Universities should be places we learn to be, not places we be to learn. They should be purveyors of value, not of credentials.