It’s been nearly four centuries since the first formal classrooms appeared in what would eventually become the United States. The earliest public school, the Boston Latin School, was founded in 1635; it was the first to relieve families of having to teach their children the “three R’s” of reading, writing, and arithmetic at home.
Despite massive changes in society and technology since colonial times, one thing hasn’t changed much: the way we teach, test, and pass our students along to the next level—or into their adult working lives.
Most students today still take the same lessons from the same teachers in the same format, and they must pass the same tests to graduate. Higher education allows for some variation in courses of study, of course, but within each classroom or curriculum, the content, delivery, and assessment are fixed. Over the course of their 12-year education (plus two, four, or eight more years in university), students ingest, memorize, and practice the material presented, then take tests to earn a certificate proving they “learned” it.
Today, however, there’s a challenge to the centuries-old “binge-and-purge” approach, so called because students stuff their brains full of facts, regurgitate them on tests, and then, more often than not, promptly forget them. Competency-based education is starting to take hold.
What is competency-based education, and why is it so different?
Traditional time- or credit-based education is like a transaction, almost as though the student were purchasing the right to be educated within a specific time frame. It tends to promote memorization rather than the application of knowledge. Students hope to gain something valuable from the transaction, of course, but the assessment of what they’ve learned usually comes down to clearing an often arbitrary threshold, such as a 70-percent passing grade. The resulting certification often doesn’t translate into knowledge they can actually apply.