Discussion about this post

VL
Mar 2 (edited)

Adding to John Rudisill's comment: I've been exceedingly dismayed by the way naive technophilia has led so many faculty colleagues in the sciences to embrace AI. I actually suspect it's the ultimate motive behind the hostile takeover of the US government, and universities should take heed. Several observations:

1) Back in 2023, the hype surrounding AI exploded just a few months after the cryptocurrency bubble burst. That seemed very suspect to me, since AI has been around for decades.

2) Biden-era regulations on speculative investments and a lack of public enthusiasm mean that the tech bros, who invested billions in crypto, will lose their money without deregulation and a big push from the government. Similarly for AI: corporations have invested heavily, which has produced a speculative bubble, and Chinese technology is now a threat, which makes sense of Trump's protectionism and cozying up to Russia.

3) The data centers that enable AI are a horrible energy sink. According to the IMF (International Monetary Fund), in 2022 these data centers were already responsible for 2% of the world's energy consumption and 1% of global emissions, and their usage is expected to double by 2026. I bet it already has; such predictions are inevitably optimistic, and Musk rushed to build the world's largest data center, which he calls The Colossus, in 2024 before the election. Trump/Musk's rush to decimate climate policy makes sense in this context, along with their flip-flop on Ukraine (they need to corner the market on minerals).

4) The attack on universities makes ideological sense but Trump is not an ideologue, so what's going on? What is the end-game of cutting funding to the NIH and NSF? Universities serve as economic, cultural, and social anchors in American cities. In the area I live in now, if the universities were to, say, lay off half their staff, the private sector wouldn't be able to take up the slack and you'd wind up with a massive economic depression and people not being able to pay bills. I don't think Trump and Musk are that dumb. My bet (and John Rudisill's note provides indirect support for the idea) is that the federal administration will back off the draconian cuts under some pressure from the courts but say "we'll let you keep your funding on the condition that you develop AI programs/rely on Starlink/sign an exclusive license to use xAI" or some such.

What Musk wants is data and money. Exactly why remains a puzzle to me. I don't think he wants to go to Mars himself...I think it more likely that he fancies himself one of the Eloi and the rest of us Morlocks.

And if anyone needed any further persuasion that AI is being put to evil uses, check out this piece at the LRB on how senior staff at Google, Microsoft, and Amazon in Israel empowered the destruction of Gaza by greatly expanding the Israeli military's access to cloud computing and AI tools: https://www.lrb.co.uk/blog/2025/january/militarised-ai?utm_medium=email&utm_campaign=20250205BlogUSRW&utm_content=20250205BlogUSRW+CID_3af904971c3e354ae6ae7d7f46c9ee89&utm_source=LRB%20email&utm_term=Militarised%20AI

John Rudisill

As I read this I am in the midst of a struggle against administration and some colleagues (outside of my department) to stop, before it starts, a new "academic minor" in "computing (read: A.I.) for the Arts and Humanities". We are told "computing is just a tool for solving problems" and students in the humanities can benefit from the power of this tool to "solve their discipline's problems". I wish I could imagine a scenario where this goes through and the result is that, once it does, we can finally turn to the more important human endeavor, having fully turned the utilitarian bullshit over to the coders and AI prompt authors. My worry is that the acceleration is away from the capacity to even recognize the immense intrinsic value of that more important stuff we have to do.
