At this year's Educause meeting (ELI, Feb. 2012), my boss latched on to the idea that if we tell students an 80% is OK, we are saying 20% of the material doesn't matter. Since then, I've kept running up against what seems to be an idea based on John Carroll's instructional theory, mastery learning, but with some weird twists.
Since I'm in higher education, this concept would mean that all students should leave my class, a freshman-level information literacy class, with 100% mastery of what I am teaching. The class I teach is a research class where students learn not only how to access information but also how to evaluate it: how it's produced, how we use it, how we receive it, the finances involved, etc. In other words, how to think critically.
I don't think college courses should be about mastery as much as learning how to think and find answers and to sometimes live with uncertainty.
One of the more off-kilter examples I've heard used to justify this method of teaching at the college level is "Would you want your blood drawn by a nurse who only mastered 80% of her class? What if the 20% were drawing blood?"
But isn't that what certifications and exit exams are for? Wouldn't a program like nursing have assessment of that sort of procedure or task built into the not-insignificant number of tests those programs give throughout the curriculum?
It makes me fear for higher ed (even more than I already do) - that it will soon be simply vocational training.
I've read about mastery theory, and I actually think it makes sense in a K-12 setting; I even recognize some of the instruction I received in elementary and middle school as following this theory. However, I'm not certain it is being applied correctly in the context I'm referring to.
I'm curious what educational professionals here at DU think about this.
One last thing I should mention: it appears this concept is being applied to instructional design for online learning.