Hacker News | jeffbee's comments

It depends on what you're doing. Steam turbines are absolutely full of exotic alloys. But I tend to agree that large-scale geothermal would be an important component of our all-of-the-above energy policy, which would profit from our existing expertise in punching holes in the ground.

Yeah, if you go to CDC WISQARS you can run fatal injury reports filtered by intent (suicide) and aggregated by urban/non-urban geography. These differences are not small; they vary by large factors, or even orders of magnitude, in every state. It's not the weather.

America basically did not add any net generating capacity in the first two decades of this century, instead treading water with repowering and efficiency. This was a mistake and now that we could use the energy everyone is acting like it's impossible to expand the grid at the same rate we expanded it in the 1980s.

In many ways this mirrors the way America walked into the housing crisis with its eyes closed.


I don’t understand how it could realistically have been different. In, say, 2001, how could you possibly have made the case for very expensive grid expansion to serve future loads that hadn't been invented yet?

That's fine. If we did not need it, then we didn't need to build it out using what was at the time more expensive technology. But in 2026 we should not be pretending that the rate at which the grid expanded in the 1980s was caused by alien technology transfers. We can easily repeat or exceed that expansion. Even the most outrageous predictions for IT loads do not exceed what we did in the 1980s.

I absolutely agree with that. However, it's not so much the capability, it's the cost. In 2026 big projects cost a lot more, so who's going to pay for it? In the 80s we all paid for it, but we all roughly benefited as we got more and more electric capacity and day-to-day use cases. Today, it looks like we are all going to pay for it, but only the datacenter owners are going to benefit. That model is broken.

Well, I don't think the evidence supports that. According to two recent LBNL reports, consumer prices are lowest in states with huge demand increases (Texas) and highest in states with shrinking demand (California). The existence of large consumers tends to amortize the cost of grid upgrades.

There are tons of news articles going around about how datacenter installs are causing large local rate spikes, e.g.: https://www.consumerreports.org/data-centers/ai-data-centers...

> That same Bloomberg analysis found that areas with high concentrations of data centers saw electricity prices jump 267 percent over the past five years.

> director of Harvard Law School’s Electricity Law Initiative and co-author of a March 2025 paper exploring how the public is funding Big Tech’s power-intensive facilities. “Utilities are building infrastructure, and then we all pay for it because that’s how the utility business model has always worked,” he says.

> Residential electricity costs are also rising because the rush of new hyperscale data centers wanting to draw power from the grid is spiking demand. That drives up prices for everyone, Peskoe says


There absolutely is a narrative out there, but it's mostly unfounded. Bloomberg ran a completely absurd article about how AI was causing voltage drops in Colorado. Totally insane stuff, some of their contributors are pushing an agenda.

This Dept. of Energy analysis, which was recently updated, makes a lot more sense. https://www.sciencedirect.com/science/article/pii/S104061902...


Interesting. Though their conclusions are pretty weak:

> In some cases, spikes in load growth can result in significant, near-term retail price increases. Results from recent capacity auctions in the mid-Atlantic region prove this point, with sizable impacts on retail pricing beginning in 2025 (e.g., Howland, 2025). The duration of such impacts remains unclear, however, and will depend on the ability to build new cost-effective infrastructure to serve new loads. In other cases, utilities have argued that load growth will reduce average retail prices, consistent with our analysis of recent impacts (e.g., PG&E, 2025). Overall, our results cast doubt on the simple view that load growth will necessarily increase prices over the medium- to longer-term. Emerging evidence from 2025 suggests near-term impacts that can be either positive or negative; medium- to longer-term effects are uncertain.

Basically says “Maybe it makes retail more expensive, maybe it doesn’t”

And quite frankly I no longer fully trust the DoE. It has been politically captured by the Trump administration and directed to lie about renewables. Probably the folks writing this study are still trustworthy, but sadly I have a seed of doubt now.


Imagine believing in "regional IQ".

I don't know anything about Alabama, but in California you generally can't create off-grid developments without permission from a local authority, because it's a recognized problem that "off-grid" systems are often under-specified, leading to danger for the occupants. And nobody really wants off-grid to proliferate, because it would tend to concentrate the costs of the grid on the remaining users, who will be the ones least able to afford it.

For a place that was two miles from a power line, I would think anyone would approve of off-grid.


Lots of places that will get $150k+ quotes for electrical service too.

At that point, off grid is a no-brainer for everyone except industrial users (and those lots aren’t useful for them anyway).


In general, contractor overhead in America is obscene compared to Europe. We have a lot of regulatory capture working to keep it that way, too.

DIY is viable if you're a bit nuts (like me).

I just paid ~$35k (before the now-expired tax break) to install a grid-tied 25 kW ground-mount system. I DIY'd everything except the connection between the array and the grid, which I paid an electrician to do, and the trenching, which I paid my buddy with a mini-excavator to do.

It was a bit of a PITA, but mostly because I didn't finally make up my mind to do it until October and had to have it constructed by Dec 31st to take advantage of the expiring tax credit. If I'd given myself 6 months, it would have still been a big project, but way less stressful.

My neighbor paid the same price to a contractor for an 11 kW system.

Even at 46°N, and with relatively cheap electricity, my system should pay for itself in 6-8 years.
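
For anyone curious, the payback math is simple enough to sketch. All inputs here are illustrative assumptions (a 30% tax credit, ~15% capacity factor, $0.11/kWh of offset retail electricity), not my actual numbers:

```python
# Rough solar payback sketch. All inputs are illustrative
# assumptions: 30% tax credit, ~15% capacity factor at 46N,
# $0.11/kWh of offset retail electricity.
def payback_years(gross_cost, kw, cap_factor=0.15,
                  rate_per_kwh=0.11, tax_credit=0.30):
    net_cost = gross_cost * (1 - tax_credit)       # cost after credit
    annual_kwh = kw * 8760 * cap_factor            # yearly generation
    return net_cost / (annual_kwh * rate_per_kwh)  # years to break even

print(round(payback_years(35_000, 25), 1))  # ~6.8 years
```

Tweak the capacity factor and rate for your site; small changes swing the answer by a couple of years in either direction.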


In the EU it would be some $3k for the inverter, $5k for panels, and another $5k for cables, connectors, and mounting, and that's it if you DIY everything. Prices with VAT included.

Same here in the Philippines, and we're all buying the same Chinese materials at the end of the day, so somehow Americans are getting fleeced really hard on this equipment.

Payback time is 2-4 years.

It reminds me of healthcare and infrastructure in the US. When you really dig into why both are so expensive, it's literally every step: every link in the chain between supplier and consumer is some kind of inefficient market, or burdened by regulations, etc.

Americans are just so rich they don't care enough to see these huge margins and undercut the competition, which is what happens here and keeps markets much more efficient.


Do you have a blog or a writeup about this?

What would the cost have been if it was not DIY'd? Is this doable only in rural/semi-urban settings?


Being an honorary or actual redneck in an exurban American setting will be the sweet spot for this. Your neighbor's rusting Bobcat is not useless after all. You have the space for ground mounting. I toyed with a rooftop solar DIY project with an electrician handling the AC side, but in my urban context PG&E wanted a six-figure fee for a subterranean transformer upgrade. In 2024 the state regulator established rules that PG&E can't charge for that kind of service upgrade so maybe I should start considering it again.

It all comes back to insurance: they're used to getting crazy sums of money because nobody questions the rates.

The Common Law Theory of Everything

I am counting on physical, semi-technical contract work to pay once SWE opportunities shrink to the point where it's not worth it anymore.

Now is the time to get handy if you aren't already. Robotics/physical automation will lag information work by a good stretch.


We looked at getting some mini-split heat pumps for my mom's place and were getting quotes around $30k for two modest units (it's a tiny, well-insulated house). I don't know what the frak is wrong with this nation; this is so fantastically worrying.

Home HVAC is the most obvious regulation-caused scam in the US right now. Virginia just added an 'easier' license that 'only' requires two years of experience to receive (and 160 hours of formal training, but that's not the bad part, obviously).

Something like a mini-split, though, can literally be DIYed in under a day; with experience, a DIYer can do it in a couple of hours. They're designed to be easily installed as a complete system. Even in Japan you can get one installed for under a grand (including the unit). In China it's obviously even cheaper.

Obviously HVAC companies don't want it to be easier to get a license; they make boatloads on whole-home systems and maintenance. Being able to just replace a broken unit for $600 would kill their entire business model.

Electrical is a similar scam, though for some reason if you get enough quotes you can usually find one that isn't charging the equivalent of $1k/hr in labor, the way getting a mini-split from an HVAC company tends to go.


There indeed are plenty of mini-splits you can just buy & install.

I would too. Alas, mom lives in a northerly area, and we really would prefer something high-efficiency. There are some rebadged 37mpra units around that are 35+ SEER2, which, if the number means anything, is a colossal leap. The good stuff, though, doesn't seem to be directly purchasable. I'd be happy to lay the concrete pad, set it up, drill the walls, mount the ductless unit... Getting help actually pulling the vacuum would be good, but I could do it.

But I can't go purchase the system.

It's all deeply infuriating. It is just such a rude, awful thing that American society keeps having to put up with such deeply captured, deeply absurd base costs everywhere. These tradespeople deserve to make a living, and I don't begrudge them that, but it feels like there has to be so much more going wrong for prices to escalate like this.


You can get efficient DIY units: specifically, look for mini-splits with quick connectors and you'll find them. I installed one last year and the efficiency is actually better than it says on the box.

Show me anything that promises anywhere near that SEER2. 35 is absurdly better than anything the market has seen; high efficiency used to mean >10.

HVAC is wildly variable, even more so than other trades in my experience. Get several quotes; there will be five-digit differences between the top and bottom.

Try looking up HVAC workers on Thumbtack.

Only 1.5 Twitters. Sort of pathetic!

Great point about the storage. That is another place where the repairability meme is really not helping. Moving the storage controller up into the host SoC is a good idea and the PC world should adopt it.

Apple's storage controller is not even a PCIe peripheral internally, so it's saving power and latency cutting out that interface, even when it's active.


I'm having a tough time wrapping my head around how this could work for PCs today.

I'm guessing Intel/AMD could integrate a single SSD controller that OEMs could use for a specially socketed SSD?

I'm not familiar enough with SSD controllers, but what limits would this introduce? I'm thinking they can't be totally generic: any NAND chips, any layout, 1-4 chips, TLC or QLC NAND, any capacity, etc. It strikes me it would be limiting: you would be restricted to a small subset of SSDs, maybe not forward compatible with newer NAND chips, etc.

I'd think only a minority of PC laptops would make sense for this (ones with soldered SSDs), and I don't know of many of those. So Intel/AMD would need a big push to integrate any controller. Maybe Windows-on-ARM laptops will do this, if the controller makes a big enough difference. I'm curious now whether any Snapdragon devices are doing this already.


I think it's just a vertical-integration thing. They know what's in the machine and can make sure their suspend path puts every peripheral to sleep. Linux has no idea what's in your machine, and there may be some device in there somewhere that freaks out if the machine goes to sleep without saying goodnight. Even a 50 mW draw will destroy the suspend power budget. Chromebooks have similar vertical integration with respect to ChromeOS, and they also enjoy long sleep life. Hypothetically, an integrator like Framework could also guarantee this, but I can't vouch for it being true, and they would not have any control over Ubuntu updates after the laptop is delivered to the customer.

Just to beat my favorite dead horse, this is why the insistence on SO-DIMMs "BEcAuse it's rEpAIrAble" has wrecked the reputation of a lot of laptops. DDR on a stick is fundamentally hostile to sleep power draw. Soldered-down LPDDR memory has always been massively superior for energy savings, and LP-CAMM finally solves the issue.
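
To make the 50 mW point concrete, here's a back-of-the-envelope sketch (the battery size and standby target are made-up illustrative numbers, not from any particular laptop):

```python
# Suspend power budget, back-of-the-envelope. The 60 Wh battery
# and 30-day standby target are illustrative assumptions.
battery_wh = 60.0
target_days = 30

# Total draw the whole machine may use and still hit the target:
budget_mw = battery_wh / (target_days * 24) * 1000
print(f"budget: {budget_mw:.0f} mW")  # ~83 mW for everything

# Add one stray peripheral leaking 50 mW on top of that budget:
days_with_leak = battery_wh / ((budget_mw + 50) / 1000) / 24
print(f"standby with a 50 mW leak: {days_with_leak:.1f} days")
```

That is, a single misbehaving device can roughly halve standby life, which is why the vertically integrated players chase every last milliwatt.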


How does soldering memory help reduce sleep power consumption vs. using a socket? What is different other than how they are physically connected to the board?

It's not the form factor itself that is the problem. LPDDR is more efficient for various reasons and cannot be on a DIMM. It physically will not work with a socket. That is the problem that LP-CAMM solves: LPDDR but still removable.

You did not answer the question.

Did I not? I'm trying my best here. The question is sort of off-target, though. What I am trying to say is: 1) DDR uses more power than LPDDR; 2) LPDDR cannot work on a DIMM socket, because of its lower voltage signals, and other reasons; 3) SO-DIMMs always contain the higher power DDR; QED) if you insist on SO-DIMMs, then you have to spend more energy.

Rohansi was basically asking 'why'; you keep reiterating that DDR uses more power than LPDDR but fail to answer why that is the case. Is it clock speed? Is it voltage? Is it a protocol/specification difference? 'Various reasons' is not an answer.

There is no physics-based reason why it couldn't work. If the industry really wanted to do it, they could, but they don't. The primary reason is that LPDDR just has too many pins: a DDR5 SO-DIMM has 262 pins and is already an unwieldy beast, while LPDDR5 has 644.

LPCAMM2 really shows the trade-offs: it adds a lot of bulk and cost, and the market hasn't valued repairability highly enough to cover that overhead for most consumers. That's why Micron exited a market it played a big part in founding.

https://www.ifixit.com/News/95078/lpcamm2-memory-is-finally-...


LPDDR is very different from DDR, so I don't really feel like diving into it in this tiny box. But briefly: it has its own oscillators, so the CPU doesn't have to clock it while asleep; it adaptively refreshes less often according to temperature; and during self-refresh the cells are charged to a lower voltage that wouldn't really work for high-speed I/O but works fine for retention.

A new level of Elon Musk glazing has been unlocked. xAI has a vertical-integration advantage because Tesla once moved into an old Toyota factory, and because they once paid Panasonic to put a Tesla sign outside a Panasonic battery factory. Incredible content.

I would struggle to dislike Elon more, but this makes you sound like some kind of weird anti-Musk fanatic.
