
Tbh, we should more readily abandon GPU vendors that refuse to move with the times. If we cater to them for too long, they have no reason to adapt.



I had a relatively recent graphics card (5 years old perhaps?). I don't care about 3D or games, or whatever.

So I was sad not to be able to run a text editor (let's be honest, Zed is nice but it's just displaying text). And somehow the non-accelerated version was eating 24 cores. Just for text.

https://github.com/zed-industries/zed/discussions/23623

I ended up buying a new graphics card in the end.

I just wish everyone could get along somehow.


The fact that we need advanced GPU acceleration for a text editor is concerning.

Such is life when built-in laptop displays are now pushing a billion pixels per second; rendering anything on the CPU adds up fast.
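For a rough sense of scale (my numbers, assuming something like a 3456×2234 laptop panel at 120 Hz, which is current MacBook Pro territory):

    3456 × 2234 px × 120 Hz ≈ 0.93 billion pixels/s
    × 4 bytes per pixel     ≈ 3.7 GB/s just to touch each pixel once per frame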

Sublime Text spent over a decade tuning their CPU renderer and it still didn't cut it at high resolutions.

https://www.sublimetext.com/blog/articles/hardware-accelerat...


Most of the pixels don't change every second though. Compositors do have damage tracking APIs, so you only need to re-render what changed. Scrolling can be mostly offset transforms (browsers do that, they'd be unbearably slow otherwise).
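A minimal sketch of that idea in Rust (a hypothetical software framebuffer, not any particular compositor's API): keep a dirty rect, repaint only that region, and treat scrolling as a bulk row copy.

    // Hypothetical software framebuffer (illustration only): track a dirty
    // rect and repaint/upload just that region each frame.
    #[derive(Clone, Copy)]
    struct Rect { x: usize, y: usize, w: usize, h: usize }

    struct Framebuffer {
        px: Vec<u32>,        // 0xAARRGGBB pixels, row-major
        width: usize,
        height: usize,
        dirty: Option<Rect>, // union of everything that changed this frame
    }

    impl Framebuffer {
        // Grow the damage region; the compositor re-composites only this rect.
        fn mark_dirty(&mut self, r: Rect) {
            self.dirty = Some(match self.dirty {
                None => r,
                Some(d) => {
                    let x = d.x.min(r.x);
                    let y = d.y.min(r.y);
                    Rect {
                        x,
                        y,
                        w: (d.x + d.w).max(r.x + r.w) - x,
                        h: (d.y + d.h).max(r.y + r.h) - y,
                    }
                }
            });
        }

        // Scrolling moves every pixel, so done naively it damages the whole
        // surface; that's exactly why browsers prefer offset transforms that
        // shift the existing buffer. On the CPU it's still a single copy
        // running at memory bandwidth, with no rasterization involved.
        fn scroll_up(&mut self, rows: usize) {
            self.px.copy_within(rows * self.width.., 0);
            self.mark_dirty(Rect { x: 0, y: 0, w: self.width, h: self.height });
        }
    }

    fn main() {
        let mut fb = Framebuffer {
            px: vec![0; 800 * 600],
            width: 800,
            height: 600,
            dirty: None,
        };
        // A blinking cursor only dirties its own cell, not the whole screen.
        fb.mark_dirty(Rect { x: 120, y: 48, w: 9, h: 18 });
        fb.scroll_up(18); // scroll by one 18 px text line
    }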

That’s not the slow part. The slow part is moving any data at all to the GPU - doesn’t super matter if it’s a megabyte or a kilobyte. And you need it there anyway, because that’s what the display is attached to.

Now, the situation is that your display is directly attached to a humongously overpowered beefcake of a coprocessor (the GPU), which is hyper-optimized for calculating pixel stuff, and it can do it orders of magnitude faster than you can tell it manually how to update even a single pixel.

Not using it is silly when you look at it that way.
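To make that concrete, a rough sketch of the consequence (using the wgpu and pollster crates, ~0.19 API, purely as an illustration; Zed's actual renderer uses its own GPUI layer): whether the per-frame delta is a kilobyte or a megabyte, you pay roughly the same fixed submission overhead, so you batch everything into one upload.

    // Hypothetical sketch, wgpu ~0.19 API; not Zed's actual renderer.
    use wgpu::util::DeviceExt;

    fn main() {
        pollster::block_on(async {
            let instance = wgpu::Instance::default();
            let adapter = instance
                .request_adapter(&wgpu::RequestAdapterOptions::default())
                .await
                .expect("no adapter (this is where llvmpipe shows up on Linux)");
            let (device, queue) = adapter
                .request_device(&wgpu::DeviceDescriptor::default(), None)
                .await
                .expect("device creation failed");

            // One persistent buffer, rewritten each frame with whatever changed.
            let frame_data = vec![0u8; 1 << 20]; // say, 1 MiB of glyph quads
            let buffer = device.create_buffer_init(&wgpu::util::BufferInitDescriptor {
                label: Some("frame-data"),
                contents: &frame_data,
                usage: wgpu::BufferUsages::VERTEX | wgpu::BufferUsages::COPY_DST,
            });

            // 1 KB or 1 MB, it's still one upload and one submit per frame;
            // the fixed cost of talking to the GPU dominates either way.
            queue.write_buffer(&buffer, 0, &frame_data);
            queue.submit(std::iter::empty());
        });
    }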


I'm kinda weirded out by the fact that their renderer takes 3ms on a desktop graphics card that is capable of rendering way more demanding 3D scenes in a video game.

Sure, use it. But it very much shouldn't be needed, and if there's a bug keeping you from using it, your performance outside video games should still be fine. Your average new frame only changes a couple of pixels, and a CPU can copy rectangles at full memory speed.

I have no problem with it squeezing out the last few percent using the GPU.

But look at my CPU charts in the GitHub link upthread. I understand that maybe that's due to the CPU emulating a GPU? But from a thousand feet, that's not viable for a text editor.


Yeah, LLVMpipe means it's emulating the GPU path on the CPU, which is really not what you want. What GPU do you have, out of interest? You have to go back pretty far to find something that doesn't support Vulkan at all; it's possible that you do have Vulkan but not the feature set Zed currently expects.
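If you want to see what your setup actually exposes, vulkaninfo --summary lists the devices; or, as a hypothetical checker with the ash crate (not what Zed does internally), you can flag the software rasterizer directly, since llvmpipe reports itself as a CPU-type device:

    // Hypothetical checker using the ash crate; llvmpipe advertises
    // device_type == CPU, so if that's your only device, "GPU-accelerated"
    // rendering is really running on your cores.
    use ash::vk;
    use std::ffi::CStr;

    fn main() {
        let entry = unsafe { ash::Entry::load() }.expect("no Vulkan loader");
        let instance = unsafe {
            entry.create_instance(&vk::InstanceCreateInfo::default(), None)
        }
        .expect("failed to create Vulkan instance");

        for dev in unsafe { instance.enumerate_physical_devices() }.unwrap() {
            let props = unsafe { instance.get_physical_device_properties(dev) };
            let name = unsafe { CStr::from_ptr(props.device_name.as_ptr()) };
            println!(
                "{:?}: Vulkan {}.{}, software rasterizer: {}",
                name,
                vk::api_version_major(props.api_version),
                vk::api_version_minor(props.api_version),
                props.device_type == vk::PhysicalDeviceType::CPU,
            );
        }
        unsafe { instance.destroy_instance(None) };
    }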

It was an ASUS GeForce GT710-SL-2GD5. I see some sources putting it at 2014. That's not _recent_ recent, but it's within the service life I'd expect.

(Finger in the air, I'd expect an editor to work on 20-year-old hardware.)

Sold it ages ago. New one (Intel) works fine.

I was running Ubuntu. I forget which version.


> It was an ASUS GeForce GT710-SL-2GD5. I see some sources putting it at 2014. That's not _recent_ recent, but it's within the service life I'd expect.

That's pretty old: the underlying architecture debuted in 2012, and Nvidia stopped supporting it in their official drivers in 2021. Technically it did barely support Vulkan, but with that much legacy baggage it's not really surprising that greenfield Vulkan software doesn't work on it. In any case, you should be set for a long time with that new Intel card.

I get where you're coming from that it's just a text editor, but on the other hand what they're doing is optimal for most of their users, and it would be a lot of extra work to also support the long tail of hardware which is almost old enough to vote.


I initially misremembered the age of the card, but it was about that old when I bought it.

My hope was that they would find a higher-level place than llvmpipe to modularize the renderer, although I accept that may have been an unreasonable ask.

Once-in-a-generation technology cliff-edges have to happen. Hopefully not too often. It's just not pleasant being caught on the wrong side of the cliff!

Thanks for the insights.


Text editor developers get bored too!

> we should more readily abandon GPU vendors

This was so much more practical before the market coalesced to just 3 players. Matrox, it's time for your comeback arc! And maybe a desktop PCIe packaging for Mali?


The market is not just 3 players. These days we have these things called smartphones, and they all ship with a variety of different GPUs. And even more devices than just those include decently powerful GPUs as well. If you look at the Contributors section of the extension in the post and look at all the companies involved, you'll have a better idea.

Realistically, there are still only three players in smartphones.

ARM makes its Mali line, which vendors like MediaTek license and put straight on their chips.

Qualcomm makes its custom Adreno GPUs (derived from Radeon Mobile). They won't sell them outside Snapdragon.

Samsung likewise licenses Mali from ARM, but in their flagship Exynos chips they use AMD's GPUs. They won't sell those outside Exynos.

PowerVR makes GPUs that are so far behind on features that Pixel 10 phones can't even run some benchmarks.

And then there's Apple.


No. I remember a phone app (WhatsApp?) doggedly supporting every godforsaken phone, even the Nokias with a zillion incompatible Java versions. A developer should go where the customers are.

What does help is an industry-accepted benchmark, easily run by everyone. I remember browser CSS being all over the place until that whatsitsname benchmark (with the smiley face) demonstrated which emperors had no clothes. Everyone could surf to the test and check how well their favorite browser did. Scores went up quickly, and today CSS is in a lot better shape.


The Acid2 test is the benchmark you’re thinking of, for anyone not aware: acid2.acidtests.org

Nvidia says no new gaming GPUs in 2026, and increasing prices through 2030. They're too focused on enterprise AI machines.



