New lower barrier means commodification.
The vast majority of US housing construction is tract housing, which is a commodity. In the EU it's flats, which are also commodities.
https://www.nahb.org/blog/2025/08/custom-home-building-grows...
Houses/buildings are each isolated physical structures.
Software is trivially and instantly replicated, and the same software can serve millions.
Also, even in your example you're just the commodified roofer or construction worker. Not the non-commodified house.
edit: i love how this is getting downvotes but no further responses. y'all are in denial. let me ask you this: why is the most common round in interview loops a generic LC round? lolol
Skipping the lols, here's the answer to your question: doesn't matter if developers "are already commodities" to some degree.
First, because that degree is small, else developers wouldn't command such high salaries relative to other trades. So they might be commoditized compared to surgeons, but not at all compared to most office or blue collar trades.
Second, even if they are commoditized to some degree, the argument is that AI will bring further commodification. Not that it will introduce the first-ever case of commodification to the world of development.
lololol something can be a commodity and still be expensive. to wit: have you heard of this thing called oil, which has recently been very expensive?
> do you think in lols or do you ever sit and consider something more deeply?
i think deeply enough to recognize when someone's reasoning is so flawed they should've almost immediately reconsidered their claim upon conceiving of it. and then i laugh out loud (at them) when they didn't. occasionally many many times.
Lower barrier to entry means the developers are even more interchangeable than now.
Developers are scribes - we have sacred knowledge that is now being democratized because everyone can do it due to good enough tools. As a result, we won't be needed much going forward.
The ability to solve problems is what’s important. Not your ability to remember things or to hold sacred knowledge.
Like paint, it can be used as a tool, to paint your house, or as a craft and artform, to paint the Rouen Cathedral.
Is it knowing how to write a regex without a reference, or maybe implementing a distributed ec postgres cluster using bash, ooh how about writing a minimum cnn in C for edge classification ooohhh wooowee…
Ever worked construction? There are hammer swingers that need one swing per nail and never miss. Or plasterers that make chalk look like marble. How about a high voltage lineman that can switch a 20 kV oil-cooled transformer in less than 15 minutes to get the power to the school back on.
No different from any tradesman - we’re not special
My experience is that Claude starts to make quite a mess in this context, and it'll often cause as many problems as it solves unless you have the technical and domain knowledge to redirect and correct it frequently. Perhaps training will solve this, and it'll certainly get better, but I'm not sure how far it'll go and how fast.
My gut feeling is that software will only become more ambitious and interface with hardware and other systems in increasingly sophisticated ways. Things that seemed infeasible due to time and cost constraints will be on the table. It'll reveal new challenges, I think. I have a feeling it'll be humans with deep technical skills who are at the forefront of solving those challenges for a while yet.
Not claiming I have the skills and to be one of those people, just that it's where I'm pushing my career at the moment.
I'm stoked that people like this have the resources and newfound capabilities to create solutions like this. The reality is that previously, many people have been underserved due to the economics of software and inherent risks of trying things like this as a smaller business owner. So this is great. We can find more ways that software can be valuable, and people can do their jobs better in ways they've literally only imagined before. It's great.
Will this mean many will be jobless? No, they'll do other things. They'd be using this software to support society, operating at a high level. Think low-code, but incredibly complex stuff; just not raw code anymore. Instead of making circuit boards out of discrete components, you now slap a few ICs on a board with some supporting passives and the work is then all done in software. Engineers use more high-level components rather than welding and machining things from scratch; you buy T-slot profiles and bolts rather than casting and milling steel from billets.
So the job of programmer may disappear similar to how we don't have bakers anymore: baking is done in factories, operated by a small staff. Current-day programmers will then increasingly shift to whatever high-level constructs we come up with, and this high-level work will be supported by the base infrastructure built by those who still touch raw code.
I think smaller groups handling more complexity is on point. But that's because each group will build their own bespoke factory catered to their exact needs.
I fully expect a mass proliferation of custom programs rather than standardizing on a common set that groans under the weight of being so general as to support all use cases.
Expect Anthropic to want to capture more of the supply chain over time.
I'm not saying this particular individual is wrong in trying to build his solution to the market. Maybe there is some VC money to be made in this moment. But as AI in the workplace gets normalised, most people will either come up with solutions for their problems, or they will ask someone they know to help them with this.
Scale will only matter if you are explicitly building a platform. That will still require real software engineering skills.
As for hardware interfacing, if I am not mistaken, almost all companies selling hardware right now still behave like babies when it comes to users getting access to the software inside it. They void warranties, sue users, and so on. For ambitious user-driven software innovation in the hardware space, the companies would have to open up their interfaces. I don't see this happening at all, not only because of the companies' greed but also for regulatory and safety reasons.
I find that scalability is usually overblown because computers are fast now, which is not to say you shouldn't make it run fast on one computer.
It reminds me a lot of my early career spent remediating offshored PHP applications.
It does strike me as a little odd that they didn't hire a developer earlier and get the code written. Sitting back and waiting for someone to drop by and present a solution is a little naive, but it's also the world we built in the IT industry over the past 20 years. When I started my first job, we frequently had customers ask for bespoke solutions, most of which were small: one week to a few months of work. Multiple co-workers in the mid 2000s had side businesses where they did contract development, mostly these types of small one-off solutions. Most of the software companies in my area that did these types of jobs are all gone now.
If AI accidentally created an environment where people can once again solve small programming problems on their own and massively improve the workflows I'm all for it. Serves the industry right for abandoning these customers.
Maybe they didn't have the expertise to pick a software stack that would serve them in the long run, or they just didn't have the budget to hire a SWE or team full time, or their contractor team just wasn't super invested in the project.
So tech people look at "vibeslop" as unmaintainable technical debt, but they ignore that in a lot of situations their own salary is what makes the tech debt unmaintainable. Maybe that's uncharitable, but I do think many techs are very far removed from the "solve a problem and then dogfood it" cycle
It also coincided with the hollowing out and offshoring of practically all US industrial and manufacturing capability.
I will be very happy if the result of AI means we go back to how things should actually be - where technology/IT is used to support real world things and acts as a backstage enabler to get shit done. Not the main event.
I often said since the early 00’s my dream would be to have made enough money in the insanely stupid “tech for tech sake” world to go back to just being one of a few “IT guys” supporting a factory and keeping the machines running. These jobs of course exist, but due to tech salaries very few small manufacturing businesses could support hiring such a person.
There is now a generation or two of technologists who don’t understand that the job isn’t to learn the latest hot web framework or yammer on about best practices or whatever. It’s to support a business in shipping actual products to customers.
We've long made fun of excel-jockeys getting carried away with VBA, but they came into being because engaging with turgid and expensive software companies to do important but small jobs was such a pain in the ass. This is the start of a new era, and while I am sure we're going to see some wild fiascos, it is a move in the right direction for people that need to solve problems with computers.
The software industry has just abstracted every problem to the point of being unable to solve anything.
I went to college with a lot of actual engineers - mechanical, electrical, chemical, etc. In those fields you are designing products and then engineering processes to output a cog of some sort (drug, car, GPU, iPhone, etc) in the thousands to millions.
In our fields as SWEs, a lot of our job it's like the trades going into a house to install HVAC, fix a burst pipe, upgrade a circuit breaker, replace a furnace, etc. No two setups are exactly alike, no requirements are exactly alike, etc.
Even in the age of LLMs I think the industry remains more artisanal than engineering. And that's not a knock on us, I think it's because what we do is essentially automate business processes.. and no two businesses are alike. I don't think LLMs replace the role, it just makes parts of our job faster. The mindset of how you automate something doesn't generally exist in the minds of people who want the automation.
So the options are: 1. the program involved here is really trivial, 2. it hasn't evolved long enough for the agent to fail at evolution, or 3. others are not seeing what I'm seeing.
I also find Go works really well, and generally stays, if not exceptional, then at least maintainable.
I've also enjoyed using OCaml, but I will say that I found the single worst function I've ever seen in any codebase in vibe-coded OCaml.
You might just try asking - "hey I'm having trouble keeping maintainable codebases - how can I structure this project in a way where the code will be stable long term".
Sometimes getting the "software architect" role into the agent context is all it takes.
LLMs shorten that time for every application and every user, but particularly for users from professions that haven't built modeling or debugging skills because they rely on physical reality, like pipe fitting or process supervision, to weed out non-performers.
Hiring for LLM-enhanced work should focus on debugging skills in unknown situations.
AI provides access to much better tools for testing and quickly experimenting with new ideas.
The only ones who should be worried are companies that charge millions for four junior developers and an agile coach, and deliver more PowerPoints than code (I’m looking at you, Capgemini).
For example, we've built in a lot of complexity to areas like authentication. And for good reason. It's like electrical code. I'd pay good money to watch a muggle attempt to configure OIDC infrastructure. Even with the AI explaining everything to you, it's too much information to digest at once. You'd need an entire afternoon just to wrap your head around the idea of asymmetric cryptography. That's a lot of time not spent doing the thing your business is actually about.
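To give a sense of what's under the hood of those signed OIDC tokens, here's a toy sketch of the asymmetric idea in Python. This is textbook RSA with throwaway parameters I picked for illustration (p = 61, q = 53); real keys are thousands of bits long and use padding schemes, none of which appears here:

```python
# Toy textbook RSA -- purely to show the asymmetric-signature idea
# behind OIDC token verification. NOT secure, NOT production-grade.
p, q = 61, 53
n = p * q    # public modulus (3233), shared with everyone
e = 17       # public exponent, shared with everyone
d = 2753     # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def sign(message: int) -> int:
    """Only the private-key holder (the identity provider) can do this."""
    return pow(message, d, n)

def verify(message: int, signature: int) -> bool:
    """Anyone holding just (n, e) can check the signature."""
    return pow(signature, e, n) == message

sig = sign(42)
print(verify(42, sig))   # True: signature matches the message
print(verify(43, sig))   # False: a tampered message fails
```

The point a newcomer has to absorb is exactly this split: the party that signs holds a secret, while anyone can verify with the public half. That one afternoon of head-wrapping is what the comment above is pricing in.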
> His fabrication shop uses it daily, and he built the entire thing in 8 weeks. During those 8 weeks he also had to learn everything about Claude Code, the terminal, VS Code, everything.
I don't see how he can give this summary with a straight face after posting the interview that CLEARLY contradicts it.
In the interview the engineer says "When Claude Code came out almost a year ago, I started dabbling with web based tools ..." and "When it first came out I had so many ideas and tried all these different things", so he had clearly already used it extensively for a year. I would also guess the engineer was somewhat technically minded from the get-go, since he claims he was "really good with excel" before starting with Claude Code, but that is beside the point.
The interviewer later asks "How much of those 8 weeks was learning Claude Code versus actually building the thing?", and the interviewee answers "Well, I started Claude Code when it first came out so the learning curve has really gone down for me now..." and then trails off to a different subject. Which further confirms that the summary in the post is false.
It really seems like the engineer has spent the year prior learning Claude Code and then spent 8 weeks on solely building this specific application.
The interviewer also claims "This would normally have taken a developer a year to build", which seems really unsubstantiated. It's of course hard to judge without all the details, but looking at the short demo in the video, 8 weeks of regular development time from a somewhat experienced developer doesn't seem too far fetched if the objective is "don't make it pretty, just make it work".
As I said, it's a really interesting case study about a paradigm shift in how software is developed, and it's clear this app would never have existed without Claude Code. So I don't really see the need for the blatant lying.
Coding has never been the roadblock in software. Indeed, don't we experience this now with AI? Vibe code a basic idea, then discover the things we didn't consider. Try to vibe that and the codebase quickly gets out of hand. Then we all discover "spec driven development" (SDD) and in turn discover that specifying everything ourselves is an even bigger PITA.
Because this is an advert
But instead we’ve found a way to circumvent the process. Losing the understanding of your own problem and the new ideas that come off the back of it.
I’m reminded of the story that NASA had a research project to make pens that would work in space, and Roscosmos just used pencils. I always thought NASA came off worse in that anecdote, but I wonder what they learnt while making the pen…
Both agencies used pencils, but they were problematic because the graphite could break off / float around / cause shorts.
The space pen was developed by Fisher independently of NASA. NASA bought 400 of them for $2.39 each. Roscosmos later bought 100 for the same price.
Firstly, pencils in space pose serious risks. Pencils produce dust, graphite dust is conductive, and won't settle down in microgravity. They were used early on, but both space agencies phased them out when they realized the risks. After that, they first moved to grease pencils, which kind of suck for normal writing.
NASA didn't research how to make pens that work in space, an American private company did it on their own initiative and money. Then they sold pens to NASA for cheap, and marketed the same pens to people not in space for a lot of money and made a nice profit.
Today, both Roscosmos and NASA use the same pens, bought from Fisher.
Most engineers have to take at least one programming class in college.
I majored in mechanical engineering at college. We had a required programming class. A lot of people like myself already knew how to program before we took the class too. We also had a required electronics class. My experience is that most folks with CS degrees would be surprised by the breadth of what mechanical/aerospace/chemical/etc. engineers learn.
Tbh this is nothing new; we knew technical people with Claude Code would be able to program well enough that they would become business developers.
And the owners of those job shops aim for 3 shifts per worker via automation, and mash their own software with AI already. They are ruthless at cost cutting and automation and AI tools are perfect for them.
Unfortunately the vibe I get talking to them is essentially a triumphant "why would I need you, I have AI" or "yeah you're screwed".
I can't blame them for being served expensive barely functional crap SaaS or ERP software for ages, but I was not expecting to be viewed as part of the problem coming from a robotics, automation, and optimization background myself. It's just all a block of overpaid swindlers to them.
People in the trades have a ruthless pragmatism that SV has forgotten.
But before LLMs, computers couldn't understand that phrase. Now they can.
Bullshit stories always leave something like this in.
But even then, it also says 5 minutes can save days of work. "Days" is a minimum of 2 days, i.e. 16 working hours, or 960 minutes. That's not 10x faster as previously stated, but 192x faster.
So yeah, it doesn’t add up.