> I see the same thing repeated in various front end tooling too. They all claim to be _much_ faster than their counterpart.
>
> 9/10 whatever tooling you are using now will be perfectly fine
Are you working in frontend? On non-trivial webapps? Because this is entirely wrong in my experience. Performance issues are the #1 complaint of everyone on the frontend team. Be that in compiling, testing or (to a lesser extent) the actual app.
Worked on front end for years. Rarely ever heard people talking about performance issues. I was among the very few people who knew how to use the dev tools to investigate memory leaks or had even heard of memlab.
Either the team I worked on was horrible, or you are from Google/Meta/Walmart, where either everyone is smart or frontend performance is directly related to $$.
Uh, I've worked for a few years as a frontend dev, as in literal frontend dev - at that job my responsibility started at consuming and ended at feeding backend APIs, essentially.
From that I completely agree with your statement - however, you're not addressing the point he makes, which kinda makes your statement completely unrelated to his point.
99.99% of all performance issues in the frontend are caused by devs doing dumb shit at this point
The frameworks' performance benefits are not going to meaningfully impact this issue anymore, hence no matter how performant yours is, that's still going to be their primary complaint across almost all complex rwcs
And the other issue is that we've decided that complex transpiling is the way to go in the frontend (TypeScript) - without that, all build-time issues would magically go away too. But I guess that's another story.
It was a different story back when eg meteorjs was the default, but nowadays they're all fast enough to not be the source of the performance issues
Agreed. Optimistically let it resolve merge conflicts in an old complex branch. Looked fine at first but was utter slop upon further review. Duplication, wildly unnecessary complexity and all.
AI is not a bigger grift than crypto. Crypto produced basically nothing of value. If all model improvement stops today, Opus 4.5 with Claude Code is a huge leap in productivity building certain types of software.
Just one more new model bro the next one is AGI bro just give me a trillion dollars and I’ll build the datacenters and everything will be perfect bro I promise bro please
Even if it doesn't see any improvements beyond this point it wouldn't be a big deal. It's good enough for most programmers and any improvements are just a bonus.
The masters of mankind are yearning to replace expensive tech workers with this. With agentic versions of LLMs we are at a point now where they can (and should) certainly try and create a more hilarious world
probably because you just install it, then you log in and you're done. tailscale takes care of the rest. going through any more effort just so you can write some slop code is probably not worth it
That’s interesting. Why do you think this is worth taking more seriously than Musk's repeated projections for Mars colonies over the last decade? We were supposed to have one several times over by this point.
Because we know how much power it's actually going to take? Because OpenAI is buying enough fab capacity and silicon to spike the cost of RAM 3x in a month? Because my fucking power bill doubled in the last year?
Those are all real things happening. Not at all comparable to Muskian vaporware.
You do not seem to be familiar with Rob Pike. He is known for major contributions to Unix, Plan 9, UTF-8, and modern systems programming, and he has this to say about his dream setup[0]:
> I want no local storage anywhere near me other than maybe caches. No disks, no state, my world entirely in the network. Storage needs to be backed up and maintained, which should be someone else's problem, one I'm happy to pay to have them solve. Also, storage on one machine means that machine is different from another machine. At Bell Labs we worked in the Unix Room, which had a bunch of machines we called "terminals". Latterly these were mostly PCs, but the key point is that we didn't use their disks for anything except caching. The terminal was a computer but we didn't compute on it; computing was done in the computer center. The terminal, even though it had a nice color screen and mouse and network and all that, was just a portal to the real computers in the back. When I left work and went home, I could pick up where I left off, pretty much. My dream setup would drop the "pretty much" qualification from that.
I don't know his history, but he sounds like he grew up in the Unix world, where everything wanted to be offloaded to servers because it started in academic/government organizations.
Home Computer enthusiasts know better. Local storage is important to ownership and freedom.
OpenAI's internal target of ~250 GW of compute capacity by 2033 would require about as much electricity as the whole of India's current national electricity consumption[0].
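The comparison checks out on a back-of-envelope basis (a rough sketch, assuming the 250 GW figure means continuous draw, and India's annual electricity consumption of roughly 1,800-2,000 TWh):

```python
# Back-of-envelope: annual energy for 250 GW of continuous compute draw.
capacity_gw = 250
hours_per_year = 24 * 365          # 8760 hours
energy_twh = capacity_gw * hours_per_year / 1000  # GW * h -> GWh, /1000 -> TWh
print(energy_twh)                  # 2190.0 TWh/year, same ballpark as India
```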
And environmental damage. And damage to our society. Though nobody here tried to stop LLMs. The genie is out of the bottle. You can still hate it. And of course enact legislation to reduce harm.
> It was very much the opposite of Chomsky's ideology as well.
On the contrary. Chomsky was open about his civil-libertarian principles: If you are convicted, and you complete your court-ordered obligations, you have a clean slate.
Tell me, did that attitude extend to helping billionaires who are having sex with minors? Because that's what he did. Is that what this ideology stands for?