We’ve seen an erosion of the concept of good judgement over the
past forty years. The partisan arguments over US Supreme Court appointments,
the increasing insistence that all moral values are relative, the claim that
anything other than the most mathematically provable declarations is arbitrary,
the notion that “subjective” is a synonym for “random” and “rationally groundless”—all
of these speak to the growing sense that every statement is either completely objective
or utterly arbitrary.
To be fair, the concept of good judgement originally came
under fire for good reason: good judgement has historically been coded as
white, cisgender, heterosexual, male, and old. But it’s
also important to acknowledge that just as historically marginalized groups and
individuals were finding a voice in public discourse, skepticism toward the
notion of good judgement, and toward expertise in general, began to grow. It
seems that many would rather live in a world with no intellectual
authority than allow historically marginalized groups to lay claim to
this authority.
The problem with all of this is that when a person confronts
a judgement they don’t like, they can completely write it off. This leads to a social
(or should I say antisocial) phenomenon one could rightfully call the
privatization of truth.
What’s been lost in all of these conversations is the
principle that one can, through education, improve one’s subjective judgement.
A graphic designer might not have an objective sense of which designs will be
better received by certain audiences, but to say that their aesthetic judgement
is therefore arbitrary, groundless, and no better than anyone else’s is to
throw out the concept of good judgement altogether.
This crisis of faith in good judgement is part of the crisis
that’s impacting the Humanities. Part of this crisis is the notion that good
judgement, no matter how well-argued, can never compel agreement. One could offer a strongly argued reading of misogyny
in the works of William Faulkner, but the fact remains that any student, if
they wish, can fold their arms and argue, “It’s not there. You’re just reading
too far into things.” The professor can offer mounting evidence, but all the
student needs to do is continue shaking their head. For some instructors, this
type of response can badly rattle their confidence in their own reasoning. But
good judgement doesn’t rely on the acceptance of others to show its worth. The
values and hallmarks of good judgement are many. Persuasiveness might be one of
them, but compelled agreement isn’t. Persuasiveness is a quality of the
argument itself; agreement depends entirely on the caprice of the listener. If
the recalcitrant position of “I’m not persuaded” were enough to
undermine the concept of good judgement, a majority of our institutions would
collapse (including the law itself, which rests on judges’
subjective, informed judgement of the law as it’s written).
So what are the hallmarks of good judgement? Thankfully,
they are skills that the Humanities continues to teach very well, the first of
which is verbal acuity—the ability to make a point clearly. Another is discursive
command, the ability to be intentional about what types of language (medical,
literary, religious, and so on) one is drawing upon when making an
argument, and what types of language one’s interlocutor is using. Another still
is embodied knowledge, the ability to listen to one’s physical reaction to
certain statements, assessing this reaction to sense whether there is “something
wrong” with what is being said, then using verbal acuity and discursive command
to try to formulate this objection in words. Another still is empathy, the
ability to inhabit (however imperfectly) the perspective of another person, or
at least to acknowledge that that person’s lived experience is radically
unknowable to oneself (as is the case when a white male speaker like myself
tries to speak on behalf of individuals whose lived experience is inaccessible
to me; in that situation, the principle of empathy defers to listening).
All of these skills, and many others, are taught by the Humanities.
But here’s where I think the Humanities faces its biggest conundrum. The
Humanities, generally speaking, is not content to uncouple the skills it
teaches from the values it wishes to instill. For example, the ability to critically
reflect on how language can shape reality is a core skill learned in an English
program. But what are we to make of a Republican politician who stands on the
floor of the US Senate arguing that climate change reports are simply representations
of reality and not the thing itself? To many English professors, this argument would constitute an irresponsible misinterpretation of what critique is
meant to do. But on the other hand, what exactly prevents this senator from using
critique in this way? What happens when critical doubt, when applied to
subjects as diverse as climate change and sexual assault, becomes the greatest
weapon regressive conservatism has at its disposal?
At this point, the Humanities faces a choice: to focus on teaching
discrete skills and then encouraging people to use them in responsible
ways, or to continue arguing that there is something inherently progressive about
the skills it teaches. This is where some Humanities instructors might argue
that they are teaching habits of thought rather than something as
superficially utilitarian as "skills." In any other discipline, critical thinking
is simply another name for problem-solving or problem identification. In the
Humanities, it seems to carry with it a progressive (or at least anti-authoritarian) mission, due in part to the inheritance of "critical" from 20th-century critical theory.
This isn’t to say that the Humanities should abandon its values;
rather, it might need to give up the notion that there is something inherently progressive about the skills it teaches.
Further, the Humanities needs to stop arguing that there is
some sort of moral improvement or “becoming more human” that is inherent to the
skills it teaches. The critical reflective skills taught in the Humanities can
just as easily be used for self-deception as for self-knowledge; they
can just as easily be used to rationalize unjust practices as to
critique them. Indeed, it’s the double-edged nature of these skills that makes
them so powerful and so dangerous at the same time. The problem lies in
thinking that a certain progressive mindset is inherent to the skills taught by
the Humanities. If we are to be honest, those skills can produce a regressive
devil’s advocate just as easily as a progressive critical thinker.
What remains in all of this is the importance of good
judgement and the skills that constitute it. When Eve Sedgwick speaks about
the homosocial continuum, the quality of her judgement and the salience of her
points do not depend on compelled agreement. If someone folds their arms and
says, “Bullshit,” it doesn’t matter. The quality of Sedgwick’s argument depends
on the skills she built over her career, and her ability to use those skills to
create a strong argument.
What needs to be reasserted (and it’s a shame that this
needs to be argued) is that one’s judgement can improve through education,
and that the majority of our social world is predicated entirely on the quality
of people’s subjective judgements, something the Humanities helps to improve.
Talk about good judgement in a boardroom today, and heads will nod. Talk about
good judgement in a Humanities classroom, and suddenly people start using words
like “arbitrary” or “groundless.” The Humanities doesn’t need to apologize for
the fact that some people’s judgement (with allowances made for context) can be
better than that of others. But even more importantly, it needs to emphasize
that a person’s judgement, through education, can become better than it previously
was.