Death to the College Credit

In my experience in higher ed, I’ve learned this rule: Governance is impervious to being fixed. Somehow it works, and no degree of AAU guidance will make it uniform, or even rational. And a university’s governance works only at that university; where one school takes 12 minutes to approve a new concept, another may take 12 years.

In general, though, good concepts that genuinely benefit the institution (academically, fiscally, politically, aesthetically) will probably get through. My point here is that most change is possible, but nothing is simple. Given that, let’s tackle a particular challenge. Not technology. Not costs. Not senior administrators whose tactic is incremental change. Not the notion that “it’s not fair because X college has deeper pockets.”

Let’s take on college credits. In a growing body of thought, they are the enemy.

Credits are artificial packages of pretend competencies. They are charged at way too high a rate, complicate transfers, encourage academic silos and prove nothing of worth to employers or society at large. They are more a means of billing and price inflation than a measure of learning, and they encapsulate all that is wrong and inefficient in higher education. College credits are evil.

In a traditional general ed program, credit boundaries discourage collaboration between departments and the efficiencies such collaboration could create for students. Look at Lumina, LEAP, the Institute for the Future and almost any organization that has analyzed the critical skills society will need in 2020, and you’ll see remarkably consistent alignment around these key elements:

  • Inquiry/Analysis
  • Communication
  • Creativity
  • Critical Thinking/Problem Solving
  • Information/ICT literacy
  • Quantitative Literacy

Now, if I were a student looking to master these key skills, I might ask: Why, having demonstrated that I am a brilliant critical thinker in five classes, do I need to keep jumping through that same hoop, again and again, in a sixth, then a seventh class—which are often evaluated in different ways, first by a historian, then by a mathematics instructor, then by a lit professor? And so on, ad graduation?

Separating syllabi from often poorly written, “siloed” learning outcomes, and aligning concrete materials with focused assessments, provides the context and connections for improved learning. Independent graders and/or standardized tests can confirm that students really have mastered these core elements. Separating curriculum from term schedules would mean that students could practice and then demonstrate competencies clearly and repeatedly, at their own pace, building on prior learning and working in areas that provide intrinsic motivation for them.

It’s great that in a typical General Studies course students are exposed to the literary giants and learn (some) key dates in the history of Rome (or Europe? America? whatever the curriculum dictates), but aren’t the actual competencies the primary goal?

Compare and contrast: As a provider of education, which is a better product? I contend that it’s one in which we separate out essential skills and provide practice and feedback around those key competencies listed above. And when all are done—a degree that really reflects a valuable skill set. No more having to graduate the student who hung around for six years and wearied enough faculty members to be shuffled off with a number of credits—maybe not even enough to amount to a qualification—and tens of thousands of dollars in the hole.

A focus on credits means high charges, a slow pace and inefficiencies.
A focus on competencies allows for greater efficiency, leverage of prior learning, real support, meaningful achievement and clear motivation (i.e., these skills will help me in my career and in my life).

Employers don’t look at credits; they look at the degree. They don’t want the assumed competencies that three or four college credits supposedly represent; they want real competencies. Employers will still see a degree from institute X, but the expectation will be that the degree represents a skill set rather than a fairly arbitrary measure of endurance—a gold watch bestowed for time served.

In this way, the learning is constant and the timing is variable—not the other way around.
Time in school, costs and inefficiencies are all reduced; motivation, retention and actual competencies all trend upward. Registrars have less to worry about, faculty are less angst-ridden over having to pass students who haven’t really “got it,” and students come away with degrees that reflect competency-hours of active learning rather than credit-hours of passively being taught.

For more on this, read Cracking the Credit Hour by Amy Laitinen, and for a model that really attempts to nail this, look at Southern New Hampshire University’s College for America.

About Kevin Bell

Kevin Bell is the executive director for Online Curriculum Development and Deployment at Northeastern University's College of Professional Studies. He leads Northeastern's efforts toward the goal of high-quality, professional online programs based on industry and academic standards. He works closely with senior leadership to ensure that the Northeastern University Online strategy aligns with the College's five-year vision and with the University's academic and regional campus goals.
