Higher education is shockingly right-wing
- Article
- Mar 1, 2023
- #Education #Politics
by Alex Sammon
Perhaps the stupidest idea that everyone takes for granted is that higher education in the United States is left-wing.
If "left" and "right" have any meaning at all, "right" describes a worldview under which civilized society depends upon legitimate hierarchy, and a key object of politics is properly defining and protecting that hierarchy.
"Left", on the other hand, is animated by antipathy to hierarchy, by an egalitarianism of dignity. While left-wing movements recognize that effective institutions must place people in different roles — sometimes hierarchical, sometimes associated with unequal rewards — these are contingent, often problematic, overlays upon a foundational assertion that every human being has equal dignity and equal claim to the fundamental goods of human life.
Whatever else colleges and universities do in the United States, they define and police our most consequential social hierarchy, the dividing line between a prosperous if precarious professional class and a larger, often immiserated, working class. The credentials universities provide are no guarantee of escape from paycheck-to-paycheck living, but statistically they are a near prerequisite.