“Innovation” is for poor people

July 6, 2015

I’d been meaning to write about the so-called “essay grading” software programs in the context of Evgeny Morozov’s concept of “solutionism”: the idea that we use technology to solve “problems” that aren’t really problems at all, merely things that happen to be within technology’s reach.

After all, the “problem with essays” isn’t that they require someone to understand them: that’s the whole point. Ideally, the student is writing a take on the subject that is at least somewhat their own. To “grade” an essay without understanding it isn’t an improvement at all. What will happen, inevitably, is that students start writing (and eventually thinking) down to whatever the software can process.

After all, it’s not like the software is capable of evaluating what it’s reading in any meaningful sense. It’s not even like it’s particularly good at what it does do. From the New York Times article:

“Often (critics of computer grading) come from very prestigious institutions where, in fact, they do a much better job of providing feedback than a machine ever could,” Dr. Shermis  (a supporter of computerized grading) said. “There seems to be a lack of appreciation of what is actually going on in the real world.”

This is a classic example of lowering our standards to meet the technology we have. In much the same way that Sherry Turkle has documented elder-care facilities giving seniors robotic pets because they’re cheaper than actual human contact, we are offering students robotic essay graders because the cost of actual human attention is judged too great.

No “problem” in any real sense is being solved; we’re just lowering our standards to fit the technology. The end result is more activity (papers graded) of a kind we didn’t really need (papers graded badly).

But then my occasional colleague Albert Samaha comes out with this doozy of a blog post, noting that while companies are falling over themselves to build automated educational systems capable of handling incredibly complex details, prisons across the United States are still keeping track of prisoners BY HAND! The result is prisoners lost in solitary confinement for years, vanished into the system, not allowed to see a lawyer because nobody in authority remembers they exist.

Albert writes:

“It’s technological stratification. Educational institutions, with backing from the tech world, are on the verge of having software complex enough to grade essays and “provide general feedback, like telling a student whether an answer was on topic or not.”

Meanwhile, one of our most deeply rooted public institutions – an incarceration system that oversees tens of millions of people – relies on little more than a chain of human competency: “on court clerks to record judges’ orders correctly, prison and jail administrators to properly read those instructions, and facility staff to accurately add and subtract good-time credits.”

It’s a great point, and one we need to think hard about. But, Albert, I think you’re missing the larger point of this technological stratification: it will never be the children of the rich who have their essays graded by machines, just as it’s not rich retirees who will be given robot pets instead of human caretakers.

The children of the rich will attend the kind of schools that pride themselves on small classes and direct attention from teachers, just like they do now.  They’ll hire personal tutors instead of using free lectures on YouTube, just like they do now.

Innovation is for poor people. There’s a reason the rich still prefer the old-fashioned human touch.

You’re so right that the automated essay graders represent technological stratification, but they sit on the other side of the divide. It will be the children of prisoners who have their essays graded by computer while their parents are miscounted by hand.