Weighing the Importance of a Computer Science Degree | Inc.com

https://www.inc.com/young-entrepreneur-council/do-you-need-a-traditional-computer-science-degree-to-be-an-effective-software-engineer.html

An interesting opinion piece from someone at a consultancy that hires a lot of developers.

tl;dr a computing degree still tends to be very useful if you want to make a living as a developer, despite the fact that there are lots of alternative, faster, cheaper ways both to gain useful skills and to gain credit for them, and despite those alternatives tending to be far more relevant to employers, instantly usable, and focused on skills that matter now. The point is that CS graduates entering the workplace are usually woefully underprepared: they tend to need to learn a whole bunch of new stuff about businesses that universities fail to provide, and a whole bunch of new technologies that few universities bother to teach (the standard curricula are much to blame, not just individual universities). Nonetheless, the author reckons that they tend to offer greater value in the end, mainly because they are consistently better problem solvers and consistently more capable of picking up new technologies quickly.

It's nice to have a bit of validation from one (albeit an important one) of the stakeholders in what we do, but I'm not so sure we can take the credit for it. Most universities (Athabasca is a very unusual exception) act as filters even before students begin, only taking people who are pretty capable in the first place. So, duh, such people are more likely to be better at learning and solving problems, regardless of how we treat them. It's the same fallacy behind student loans: yes, you are (on average) likely to make more money as a graduate than someone who is not, but a good part of the reason for that is that getting in as an undergraduate in the first place puts you in a group that is, on average, notably smarter than those who did not get in. Even Athabasca - a resolutely open university with hardly any pre-filtering (well, we do filter a bit on our graduate programs) - acts as a filter regardless of anything we do, inasmuch as anyone who completes one of our degrees has to be a pretty special, self-starting, self-guided learner to succeed. And, of course, we all do post-filtering by assessing and credentialling our students, weeding out a few of those who got through the pre-filter. The question is, do we actually add any value at all apart from that? I'd like to think we do, but there is at least some good evidence - compelling if contested - that, on the whole, universities don't.

I wonder, assuming those same filters were in place at either end, whether our students and their future employers might be better served if we provided, or at least accommodated and embraced, a bit of the stuff that people get from those cheaper, unconventional learning pathways. Can't help feeling it might not be such a bad idea.


Comments

  • Vivekanandan Kumar November 24, 2017 - 5:32am

    Universities do two things: they help people learn about knowledge, and they help create new knowledge. They can drive students to the leading edge and expose opportunities for them to be creative - the conventional learning pathway. The more successful a university is in this, the more prominent it becomes, which leads to more students believing in this traditional way of learning and knowledge creation and flocking to such conventional universities.

    Unconventional learning pathways are fine as far as helping people learn about knowledge goes, though the help is non-traditional, mostly self-help. But how successful is this pathway at creating knowledge? I doubt that the creation of new knowledge naturally follows from unconventional pathways. It could be done, no doubt, but they are not geared for it yet.

    The very word "University", an institution about learning at the highest level, demands that new knowledge be created through follow-up research. If universities don't see themselves as beacons of rigorous knowledge creation and confine themselves to 'just learning', then they should rename themselves as some sort of higher (not the highest) learning institutes and give up the identity of being a University.

  • Jon Dron November 24, 2017 - 1:02pm

    @Vive - yes, I agree, that does accord with my idea of a university too, at least when combined with the practical application of that knowledge to help the community. 

    Technically, the word 'university' derives from "universitas magistrorum et scholarium", meaning 'community/society of teachers and scholars', which says little about the level or expectations of what that community actually does apart from learning and teaching. But I agree that, at least since von Humboldt (notwithstanding the awkward and counter-productive Canadian distinction between comprehensive and other universities), an acknowledged role of that community is the generation of new knowledge.

    But I don't think there is any contradiction at all between performing that role and accepting, supporting, or actually providing unconventional pathways to get there. Quite the opposite, in fact: we should positively encourage it, if for no other reason than that it drives innovation and creativity. I'm pretty sure that most of us professional academics use self-teaching (or non-formal methods like attending conferences) most of the time when we need to learn stuff that moves our research forward. I don't see why we shouldn't encourage students to do the same.

    For instance, if they want to do a project that demands Ruby on Rails (which we don't teach), I reckon it's a perfectly legitimate path for a student to take a bootcamp. Similarly, if our courses bore or intimidate them, or don't use the tools they wish to use, I think it should be absolutely acceptable for them to achieve the same outcomes in different ways, without penalty. It would be great if we could participate and help out ourselves. It would be good if we were not so deeply bound to providing a standardized teaching curriculum that, by its very nature (professional societies are about setting norms, not reaching heights), cannot be on the leading edge of the field, and that admits little variation and creativity.  It would be brilliant if our students were more engaged in solving real problems rather than implementing (and for the most part copying and pasting from the web) yet another solution in Java to the Towers of Hanoi. 

    I'm not suggesting that we should scrap everything we do now and make it a complete free-for-all: it doesn't have to signal a drop in standards at all. But, if we were to provide a bit of structure, a bit of support, maybe a few basic foundation topics (optionally replaceable with equivalents), some rigorous criteria of the right kind, and some means of assessing achievement, we could make a more open, embracing, problem-oriented and competence-based approach work far better than what we do now, with far greater student satisfaction and engagement, and far more relevant, useful skills for all concerned.