Before I get into it, here’s a relevant passage from Heidegger’s The Origin of the Work of Art in which he describes one of Van Gogh’s studies of peasant shoes (one finds oneself in a dangerously forgiving mood when reading philosophy written this beautifully):
A pair of peasant shoes and nothing more. And yet… From the dark opening of the worn insides of the shoes the toilsome tread of the worker stares forth. In the stiffly rugged heaviness of the shoes there is the accumulated tenacity of her slow trudge through the far-spreading and ever-uniform furrows of the field swept by a raw wind. On the leather lie the dampness and richness of the soil. Under the soles stretches the loneliness of the field-path as evening falls. In the shoes vibrates the silent call of the earth, its quiet gift of the ripening grain and its unexplained self-refusal in the fallow desolation of the wintry field. The equipment is pervaded by uncomplaining worry as to the certainty of bread, the wordless joy of having once more withstood want, the trembling before the impending childbed and shivering at the surrounding menace of death. This equipment belongs to the earth, and it is protected in the world of the peasant woman. From out of this protected belonging the equipment itself rises to its resting-within-itself.
This passage was on my mind as I left the movie theater last week, having finally given Star Wars: The Last Jedi a go. It’s hard to overestimate the impact Star Wars has had on the worldview of what is now three generations of Americans, including my own. Unlike many of my compatriots, however, it’s never been a religion for me. Or at any rate, I’m an apostate. I, for one, thought the prequel movies were jolly good fun. I had no real expectations of those movies, nor of The Force Awakens. But since I was such a fan of Rian Johnson’s Brick and Looper, I’ll admit I might have set the bar a little too high for The Last Jedi. In the end, it was no better and no worse than the last four movies. Nothing gained, nothing lost. But it was apparently a real letdown for the faithful, because expectations were that this one would dig deeply into Jedi mythology and lay down some canonical law for any future fan fiction. For me, the film’s pronounced lack of substance was actually kind of thought-provoking. I couldn’t help but think about what fantasy tells us about the truth function of art about which Heidegger is speaking in his essay.
Elizabeth Bruenig has written an excellent piece in the Washington Post entitled “Why is millennial humor so weird?” While Bruenig is not the first person to diagnose the millennial condition through humor, her piece is the most clearheaded and insightful I’ve seen on the topic.
Bruenig focuses in particular on the aesthetics of absurdity in millennial cultural production, which, in contrast to absurdist aesthetics of the past, is not accented with outright pessimism:
Surrealism and its anarchic cousin dadaism are nothing new; neither is absurdism or weirdness in art. ‘The absurd,’ Albert Camus wrote in 1942, ‘is born of this confrontation between the human need [for happiness and reason] and the unreasonable silence of the world.’ Absurdity is the compulsion to go looking for meaning that simply isn’t there. Today’s surrealism draws aspects of all of these threads together with humor, creating an aesthetic world where (in common internet parlance) ‘lol, nothing matters,’ but things may turn out all right anyway.
I would add that millennial absurdism can further be defined against the cynicism and irony of postmodern cultural products (those belonging to baby boomers and gen x’ers). In postmodern culture, the central trope was self-referentiality–the practice of acknowledging production from within the production (think of the “S.O.B.s” episode of Arrested Development when the show found out it was going to be canceled). Here, we can go back to McLuhan’s distinction between hot and cool media. The postmodern aesthetic of self-referentiality was a bit like hot media in that its consumption was profoundly passive. It was so passive that its producers (writers, onscreen talent, etc.) positioned themselves as members of the audience, watching the production right alongside us. In other words, even the producers removed themselves from the production. There was no need to go looking for meaning in context because, as the audience, we were the context. There was no meaning to be found outside of ourselves. Millennial absurdism, by contrast, takes the attitude that context is always yet to come; the audience must actively create the context by distorting the product.
There are two myths about literacy that refuse to die. The first is that writing is simply recorded speech, and the second is that since the emergence of so-called “Internet 2.0,” we are moving back to an oral culture.
I’m sorry to say that linguists are among the main propagators of that first myth. Linguists are always quick to point out that writing came along at the eleventh hour in the overall story of human language, and that any impact writing has on speech is minimal. Both of those things are true, but neither of them warrants the further assumption that writing is just a derivative of speech. If that were true, writing—particularly alphabetical writing—would be much easier to do than it is. (As I always tell my students, writing never gets easier, but you do get better at it.) More to the point, as David Olson argued, writing is a model of speech, which therefore involves interpretation rather than coding and decoding.
Dr. Ian Malcolm (Jurassic Park), one of the great characters in pop-fiction/film, couldn’t have said it better:
If I may… Um, I’ll tell you the problem with the scientific power that you’re using here, it didn’t require any discipline to attain it. You read what others had done and you took the next step. You didn’t earn the knowledge for yourselves, so you don’t take any responsibility for it. You stood on the shoulders of geniuses to accomplish something as fast as you could, and before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunchbox, and now you’re selling it. You wanna sell it…
Malcolm was talking about making dinosaurs, but it could just as easily be said about Mylan’s jacking up the price of the EpiPen when it found out a generic was going to be allowed. This action is going to cause a lot of immediate pain. The callousness of companies like Mylan is morally reprehensible. But this particular company’s profiteering is just part of the rot. The underlying rot is the capitalization of intellectual property, the buying and selling of patents on the open market.
Why should we expect Mylan or Turing or anybody else in big pharma to have any moral investment in the drugs they sell? They didn’t, as Malcolm said, earn the knowledge for themselves, so of course they take no responsibility for it.
Suggested reading: Philip Mirowski’s Science-Mart.
I recently came across Zizek’s (with apologies for the inappropriate orthography) post on The Philosophical Salon, in which he defends himself from tweeters and comment sectioners. I’m not terribly interested in the specifics of his rebuttal, except to say that it’s a fascinating state of affairs when the likes of Justin Bieber and Taylor Swift seem far better equipped to defend themselves than Slavoj Zizek. “The medium is the massage” and all that.
As always, though, Zizek has a way of turning the perfectly intuitive into something worth arguing. I’m referring to the following passage from Part Two of his defense:
The stance that sustains these tweet rejoinders is a mixture of self-righteous Political Correctness and brutal sarcasm: the moment anything that sounds problematic is perceived, a reply is automatically triggered—usually a PC commonplace. Although critics like to emphasize how they reject normativity (“the imposed heterosexual norm,” etc.), their stance itself is one of ruthless normativity, denouncing every minimal deviation from the PC dogma as “transphobia,” or “Fascism,” or whatever. Such a tweet culture, combining official tolerance and openness with extreme intolerance towards actually different views, simply renders critical thinking impossible.
Again, I have nothing to say about the specifics of this, as I have neither the expertise nor the ethos. But in general terms, he points to something I’ve been struggling with for a while: How does one separate an authentic political movement from just another iteration of populism? Although populism goes hand-in-hand with reactionary thinking and so mostly afflicts those who identify with the Right, that’s not always the case. Badiou has devoted much of his career to figuring out how to draw the distinction between a properly transformative Event and a reactionary episode. However, Zizek points to something much simpler. Perhaps populism is just a multitude with an orthodoxy (which of course is a contradiction).
Or maybe I’m just looking to preserve my own ego, for instance, for having been a wheaty Bernie Sanders supporter without being a chaffy BernieBro.
I’ve started working out at the gym again. This is primarily because I can’t stand the thought of other people getting fit because of a fun internet phone application. How dare they?!? Fitness is supposed to be about envy and shame, not whimsy. Everyone knows that.
I’m sure the renewed exercise is healthy for me, but it does feel a little unwholesome.
Usually, whenever a publicity-starved celebrity incites an internet indignation orgy with an off-color comment, I give it the old Lucille Bluth eye roll (see image above). And I’m tempted to do the same with Martha Stewart’s latest one about millennials’ ignorance and lack of initiative. It’s a blip in the news cycle. There are much more important things going on. Martha Stewart’s cultural megaphone doesn’t have much amp anyway. But I’m happy to steal her kairos to say something I’ve been wanting to say for a while.
My contemporaries and I live in a shadowy space between generations. We’re too young to be GenX and too old to be Millennial. The terms GenY and Nintendo Generation have been applied to us, but we never got assigned a definite character like the boomers, Xers, and millennials got. In the classroom, I’m usually young enough to get my students’ pop culture references but too old for them to get my references (I’m guessing no more than 25% of them know what a Lucille Bluth eye roll is, for instance). It’s a strange asymmetry, but I believe that at this brief moment I’m in a sweet spot where it’s possible to have both an insider’s empathy and an outsider’s perspective. Or at least I have enough millennial narcissism in me to think so.