Hacker News | dlcarrier's comments

In my country, we call that an interactive shell.

Fun fact: if you run Python from a command line, with no options, it defaults to such a shell.


Most scripting languages are designed to present a REPL (read-eval-print loop) in such a scenario.
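The loop itself is simple enough to sketch in a few lines (a toy illustration only; a real REPL like Python's also handles multi-line input, history, and so on):

```python
# Minimal read-eval-print loop: try to eval each line as an expression and
# print the result; fall back to exec for statements like assignments.
def repl(lines, env=None):
    env = {} if env is None else env
    for line in lines:
        try:
            print(eval(line, env))  # the "eval" and "print" steps
        except SyntaxError:
            exec(line, env)         # statements (e.g. x = 41) have no value

repl(["1 + 1", "x = 41", "x + 1"])  # prints 2, then 42
```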

Microsoft requires a 40 TOPS NPU for Copilot co-branding, which an RTX 3050 can beat.

Aren't NPUs only designed to run small models? From what I've seen, most NPUs don't have the architecture to share workloads with a GPU or CPU any better than a GPU or CPU can share workloads with each other. (One exception being NPU instructions that are executed by the CPU, e.g. RISC-V cores with IME instructions being called NPUs, which speed up operations already happening on the CPU.)

You can share workloads between a GPU, CPU, and NPU, but it needs to be proportionally parceled out ahead of time; it's not the kind of thing that's easy to automate. Also, the GPU is generally orders of magnitude faster than the CPU or NPU, so the gains would be minimal, or completely nullified by the overhead of moving data around.

The largest advantage of splitting workloads is often taking advantage of dedicated RAM: e.g. Stable Diffusion workloads on a system with low VRAM but plenty of system RAM may move the latent image from VRAM to system RAM and perform VAE decoding there, instead of on the GPU. With unified memory, that isn't needed.
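To illustrate "proportionally parceled out ahead of time": a static split just divides the batch by assumed device throughput before dispatch (the device names and TOPS figures below are made up for the example):

```python
# Hypothetical sketch: statically splitting a batch of work items across
# devices in proportion to their assumed throughput, decided up front.
def partition(n_items, throughput):
    total = sum(throughput.values())
    shares = {dev: n_items * t // total for dev, t in throughput.items()}
    # Integer division can leave a few items over; give them to the
    # fastest device rather than re-balancing at runtime.
    leftover = n_items - sum(shares.values())
    fastest = max(throughput, key=throughput.get)
    shares[fastest] += leftover
    return shares

# With a GPU that's an order of magnitude faster, its share dominates:
print(partition(1000, {"gpu": 90, "npu": 8, "cpu": 2}))
# → {'gpu': 900, 'npu': 80, 'cpu': 20}
```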


That would probably make it take longer. A safer bet would be three really strong cups of coffee and two bran muffins.

It is the case for embedded microcontrollers. An ESP32-C series is about as cheap as you can get a WiFi controller, and it includes one or more RISC-V cores that can run custom software. The Raspberry Pi Pico and Milk-V Duo are both a few dollars and include both ARM and RISC-V cores, with all but the cheapest Duo able to run Linux.

All Duos run Linux.

And if Apple introduced it a decade ago, then it's at least five years older than that.

What's new here is the 1 Hz minimum.


It's for OLED screens, so there's no backlight, but also no persistence.

It's an LCD display.

Are you sure? Article says:

> A 1Hz panel is almost, but not quite, on the level of an e-ink panel, which isn’t the prettiest to look at. LG’s panel also uses LED technology, the mainstream panel technology that’s being overtaken at the high end by OLED panels with essentially perfect contrast.


LED backlight, I assume.

It's part of the standard technology buzzword rotation:

embedded/cloud/IoT --> AI --> quantum…

When the company originally known as C3 Energy changes their name to C3.quantum, you'll know we're on to the next buzzword.


At this point, I'm not updating anything using Python.

Not that I had the option anyway, because everything using Python breaks if you update it. You know they've given up on backward compatibility and version control when the solution is: run everything in a VM, with its own installation. Apparently it's also needed for security, but the VMs aren't really set up to be secure.
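The "VM" here is really a per-project virtual environment, which the standard library can create on its own (a minimal sketch using only the stdlib `venv` module; no third-party tooling assumed):

```python
import os
import sys
import tempfile
import venv

# Create an isolated environment in a temp directory; its interpreter and
# any packages installed into it are separate from the system Python.
target = os.path.join(tempfile.mkdtemp(), "env")
venv.create(target, with_pip=False)

# The environment gets its own bin/ (Scripts/ on Windows) directory
# holding the interpreter that projects are run with.
bindir = "Scripts" if sys.platform == "win32" else "bin"
print(os.path.isdir(os.path.join(target, bindir)))  # → True
```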

I don't get why everything math heavy uses it. I blame MATLAB for being so awful that it made Python look good.

It's not even the language itself, not that it doesn't have its own issues, or the inefficient way it's executed; it's that the ecosystem around it is built on technical debt.


Agreed. I was working on an open source package, noticed something weird, saw the size of the uv.lock file, and got a bit scared.

It's a pandemic; I'll be hardening my security and rotating my keys just in case.


Sounds like you're not familiar with https://docs.astral.sh/uv/ ...

It sounds to me like they are: `You know they've given up on backward compatibility and version control when the solution is: run everything in a VM, with its own installation.`

uv taking over basically ensures that dependencies won't become managed properly, and nothing will work without uv.


What do you mean by "basically ensures that dependencies won't become managed properly"?

Python genuinely has a pleasant syntax and is a pleasant experience. [1]

It's the closest language to pseudocode that exists.

Like every other language from 1991, it has rough edges.

[1] https://xkcd.com/353/
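For what it's worth, a short function shows the pseudocode-like feel (a hypothetical example, not from the thread):

```python
# Reads almost like the textbook description of the algorithm.
def median(values):
    ordered = sorted(values)
    middle = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[middle]
    return (ordered[middle - 1] + ordered[middle]) / 2

print(median([3, 1, 2]))     # → 2
print(median([1, 2, 3, 4]))  # → 2.5
```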


I've noticed that both American and Soviet planes used greenish colors, but the American ones are a yellowish green, while the Soviet ones are a bluish green. I've always wondered if the American yellowish green was chosen because it's similar to the color of the zinc chromate primer used on those aircraft, so the transparency of the paint wouldn't be an issue.

The yellowish-green is the zinc chromate "passivation" coating to help prevent corrosion.
