Landing : Athabascau University

Instructional design, measurement, and a criticism of one-model-fits-all

David Jones writes an interesting article (Measuring the design process - implications for learning design, ...) about measurement perspectives in instructional design. Here are some interesting excerpts:

"Design is okay with relying on experience and intuition for the basis for a decision. While the engineering culture wants everything measured, tested and backed up by data (para. 2)."

"The design of university teaching and learning has some strong connections with visual design. It involves subjective and contextual decisions, it’s messy, unpredictable and hard to measure" (Learning design, e-learning and university teaching, para. 1).

"The attempt to achieve quality through consistency.
This is such a fundamentally flawed idea, but it is still around. Sometimes it is proposed by people who should know better. The idea that a single course design, word template or educational theory is suitable for all courses at an institution, let alone all learners, sounds good, but doesn’t work" (Symptoms of this problem, para. 3).

"The IT and management folk don’t have any convictions or understanding about teaching or, perhaps, about leading academics. Consequently, they fall back onto the age-old (and disproved) management/rational techniques. As they give the appearance of rationality" (Results, para. 2).

I enjoyed his insights into the debate about qualitative and quantitative measurements for instructional design and the perspectives of IT and management in education. The application of one design (in this case he is referring to an LMS) to all courses restricts flexibility and the ability to adapt courses to accommodate differences between courses.

It seems strange to think that all courses could be delivered online using the same tool and be adaptable enough to produce a similarly high quality of learning. Courses and programs have different learning outcomes, instructors have various teaching abilities and methods, and students have many different learning styles. How, then, can standardized measurements and tools accommodate these differences? How do educational organizations deal with them? Is this a criticism of how the lecture style still dominates the thinking in designing courses?

The one issue I have with Jones' ideas is the dismissal of the need to acquire student and teacher input into the design of innovative programs. He writes about one symptom of this illness: "Designing innovation by going out to ask people what they want.  For example, let’s go and ask students or staff how they want to use Web 2.0 tools in their learning and teaching" (Symptoms of this problem, para. 6). This seems at odds with the idea that feedback is important in making decisions about course design and evaluating learning effectiveness.

I understand the conservative notion that the masses can sometimes mislead decision-makers into chasing the latest fads, or into satisfying the need to be profitable by simply giving people what they want. But how do you decide when to lead and when to follow? I don't believe that people can be this fickle. Is there a compromise?

The course (MDDE617) I am taking this term on program evaluation includes a model for participant-oriented evaluations that stresses the importance of getting input from all stakeholders when evaluating a program (Fitzpatrick, Sanders, & Worthen, 2004). Evaluations that use qualitative and quantitative methods of data gathering are important in designing and evaluating educational programs and courses. I realize that Jones is not dismissing the importance of these methods, but I do wish he had provided a further explanation or an alternative. I expect that a closer inspection of his blog will reveal an answer to this question, but I wish he had spelled it out in this posting.


References:

Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.