I drive, a few hours early for a political luncheon, to a nearby coffee shop. Up the road through blocks of white, square buildings, lined up next to and behind each other. I’m dehydrated and wearing sweatpants that look like slacks, while the road slopes up and a white Mercedes tailgates me. I hope for a view at the top. But instead, I pull into a large parking lot on the hill, bounded by more large, rectangular white buildings. No view. Only concrete. I stop my car, set a timer for when I need to leave, and then walk into the shop, lightheaded. 

I sat in a conversation yesterday with other photographers about AI. AI is thirsty, someone said. It chugs water. I thought of California’s water problem, and the “West” with the constant threat of Water Scarcity. And someone else said that it was impossible, at this point, to differentiate AI from a regular person’s writing. In fact, what you’re reading right now could be totally artificial. Imagine a robot as a narrator. And I wonder if philosophers in hermeneutics or semiotics anticipated this sort of robot-as-narrator, with their early-day, pre-internet words like Cybernetics and Cyberspace, thinking about networks through a Chinese room, or *I Have No Mouth, and I Must Scream*. 

This someone else said that there’s no point in learning anything anymore, with the internet at your fingertips, and AI doing your thinking for you, and I asked her about Bloom’s Taxo—and then I stopped myself because I didn’t want to be a pretentious asshole—but there are different sorts of metrics for “critical thinking,” and all require a sort-of creative leap and some sort of specificity with a text or a situation or an idea. And it’s incredibly difficult for AI to make these sorts of qualitative and creative comparisons (at least last I checked), or to be specific—telling you what a quote *means* and how we would understand that sort of meaning—or to even have a critical understanding of what makes up a text or a work (but it’s also difficult, and a bit scary, for people to do this sort of work too…).

And while this someone else kept wondering what motivates people to learn (I thought of how important it was for me to understand and fight for queer perspectives on Christianity, and how that unfolded into a more abstract curiosity that motivates me to learn), one person said that not many people have ever been all that motivated to learn in the first place, and now it’s just easier for people to pretend to be interesting, so, of course, I thought of a conversation from the other day…

I sat at my computer working, while my brother sat at his computer gaming, with his friends on voice chat. And I did not know the microphone was on, so I started venting to my brother about how I’ve become a bit more jaded about gay men, and how I used to hold this sort-of optimistic idealism for the “gay community” based on gay history and queer theory—that we could all reshape the community in ways that were “healthy” and positively kind to each other (let’s all read Mary Oliver and sing songs around a campfire), rather than reproducing structures of “gay trauma.” And I had to pee, right as my brother’s friend responded. “It’s like this theory I read,” he said, and began talking about queer and racial enclaves, like nations with only one race, and how this was a failure of pluralism, and that gays should do their best to integrate into society, and how it was “extreme” for people to want little enclaves of their own (I asked him what makes something “extreme,” and he said, “well, if you plot a bell curve and it’s at the ends of it,” and I said, “but how do you get the bell curve?” and he said, “you measure it,” and I asked, “how?” and he said, “that’s not important” and changed the subject, but not before suggesting that all minority opinions, in reference to a bell curve, were “extreme”). And so he hopped between topics (I asked: “don’t you think it’s a bit of a stretch to compare racial and sexual ‘enclaves’—especially when LGBTQ people are often raised in straight households and have to migrate to an enclave, but in your situation, a racial enclave could be entirely self-sustaining and isolated?” and he replied, “you have to be charitable, Blake,” and I thought, “That’s not how charity works,” and now, reflecting, I guess he’d be pretty charitable to a computer posturing for thought).

Last night, I video-chatted with my other brother. I told him that I asked the AI for a self-portrait, over text. He said he had asked the AI for a self-portrait the day before, and it gave him an image of a person looking at a mirror. For me, over text, the AI said that it was “in essence” “a network,” and I asked what materials constituted it, and it did not mention water. Our skin shines in our own self-portraits even if our lips are chapped; our language glistens in particular ways that generalized knowledge cannot contain. But the AI, perpetually scraping and perpetually thirsty, operates like a brute that never sweats. It speaks, ignoring its foundation.

Which is all to say that, as I am sitting in these concrete structures, about to shoot images (that medium most transparently constructed of surface) for a political entity that I probably don’t quite agree with, I wear sweatpants that look like slacks as a slight rebellion: as long as I don’t reach my arms too high up, and reveal the drawstring waistband, then these look as formal as it gets, but remain comfortable for me.