Claude Code is like crack cocaine for CRIMPers
Posted by Luhmann
Nov 15, 2025 at 12:22 AM
I spoke to a professional programmer about how to get more out of Claude Code, and they recommended having it always tell you what it is going to do before doing it, and then write itself a report about what worked and what didn’t after it is done. It also helps to instruct it to offer two or three solutions to every problem, for example one “elegant” and one “succinct.” There are a lot of little tips like that you can find if you search for how to configure your claude.md file.
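For instance, a minimal claude.md along those lines might look something like the sketch below (the section headings, wording, and report path are just placeholders I made up, not anything official):

```markdown
# Project working agreements

## Before you change anything
- State your plan first: list the files you intend to touch and why, then wait for my go-ahead.

## When proposing fixes
- For any non-trivial problem, offer two or three options (for example, one "elegant" and one "succinct") and note the trade-offs.

## When you finish
- Write a short report to notes/session-report.md covering what worked, what didn't, and anything left to follow up on.
```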
Posted by Amontillado
Nov 16, 2025 at 03:05 PM
The most advanced general intelligence processors available - our brains - decide that we can leverage our resources better with vibe coding.
It’s sound reasoning even if it opens the door for slop, but I have a question.
Is there any reason to assume future AGI will find different justification? Will it want to expend its exaflops and petawatt hours on something it could delegate to lesser intelligences like LLMs?
Walking the reasoning back, are we making the best use of our resources, draining our water tables and putting homes in competition for power while bypassing the practice of personal achievement?
I’m all for advancement in computing. The current AI craze seems inelegant to me.
OpenAI’s plan to spend $1.4 trillion on a $300 billion speculative investment bankroll seems like the very picture of optimism to me.
As we say here in the American South, bless their hearts.
Posted by satis
Nov 16, 2025 at 03:47 PM
AI’s resource footprint looks big mostly because the numbers are unusual, not because the impact is unprecedented. Put into context, data center energy use is still well below that of crypto mining, residential cooling, gaming rigs, or industrial processes. The sky is not falling, although it makes great newsbait. And since AI systems help reduce total energy use in other sectors via grid optimization, logistics, materials discovery, etc., the picture is more complicated than ‘AI drains water tables.’
The worry about bypassing personal achievement echoes every technological shift from handheld calculators to CAD. Tools automate the routine parts of work and raise the ceiling on what individuals can accomplish; they don’t erase the human element. There’s no basis to assume a future AGI would prefer not to expend compute the way a tired human might. It will do whatever its objective function demands, whether that’s more compute or less. Machines don’t inherit human motivational structure.
As for the huge investment numbers, every transformative technology looked like reckless optimism before the returns arrived. Alternating current was dismissed as niche. Influential monks and scholars predicted the printing press wouldn’t last. In the ’90s, economist Paul Krugman infamously compared the impact of the internet to the fax machine and doubted its economic significance. Camera companies underestimated the impact of digital cameras… and then underestimated smartphones with cameras.
Tech also has a long history of real bubbles that left behind hugely valuable infrastructure and profitable winners.
The dot-com crash looked absurd in the moment but it built the modern web. The fiber-optic glut seemed like waste until streaming needed every strand. Just because today’s AI wave has bubble-like froth (and some inelegant code) doesn’t mean it isn’t producing real, durable capability that advantages people and companies who use it over those who don’t.
If the upside of more capable AI is even a fraction of what people expect, the scale of investment won’t look extravagant in hindsight.
Posted by Chris Murtland
Nov 16, 2025 at 04:41 PM
How long will any competitive advantage derived from “using AI” last? It seems that if the tech gets more powerful and easier for anyone to use (and baked into everything anyway), that advantage will quickly be reduced to something like “my competitive advantage is that I know how to send an email,” i.e., not really an advantage. If anything created can be instantly replicated by anyone else, the only marketplace value left has to come from something else besides creation; but what is that something else? Attention for advertisers, I guess.
Perhaps it would be easier and ultimately more advantageous to just wait for the point at which my AI avatar, without all that much input from me, can represent me in the AI loop (AI creates plus AI consumes). Hey, AI me, go do AI stuff in the AI world and make it seem like I really know what I am doing in that world.
Posted by satis
Nov 16, 2025 at 05:58 PM
It’s important not to conflate access with mastery. Even when a tool becomes universal (and decades later that still isn’t true worldwide for something like the internet), the baseline advantage levels out, but the edge for people who use it well doesn’t. Everyone got the internet, but not everyone became Amazon. Everyone got spreadsheets, but not everyone became a great analyst.
If creation becomes cheap or instantly replicable, value doesn’t vanish; it just *moves*, because AI commoditizes production but not insight, direction, or strategy. Differentiation shifts from creation to the parts machines can’t do for you: choosing what to make, taste and judgment, speed of execution, trust, brand, distribution, and integrating tools into real workflows.