
It’s all about the software & documentation.

There are lots of “competitors” to the ESP32, but pretty much none of them except STM has the software for developers, which makes their offerings close to useless.

The only way to gain critical mass is relentless deep commitment to developer tools/libraries and documentation.

In many cases these competitors actually try to hide their documentation and keep secret how their chips work.

I’d be interested to hear if there are other chip vendors with software/documentation as impressive as Espressif’s, but I haven’t heard of any, except, as I said, STM.



Bouffalo apparently has been giving Pine64 the necessary documentation access to make their initiative for a full open software stack workable: https://www.pine64.org/2020/10/28/nutcracker-challenge-blob-...

Edit: a better reference is Bouffalo’s GitHub org showing the docs and mostly open code: https://github.com/bouffalolab


Not fully. I managed to squeeze a statement out of Bouffalo, like blood from a stone, that there is absolutely no way they are open sourcing their radio interface.


> gain critical mass is relentless deep commitment to developer tools/libraries and documentation

Which seems (IMHO) to be why the likes of Texas Instruments have failed. They've made a few attempts at jumping on the IoT bandwagon over the last decade and should have been in a perfect position to capitalize on the huge uptake, but it just didn't fit their profit margin goals, I'm guessing.


Nah, the problem with TI was their ridiculous $20-30 price point for a microcontroller with built-in WiFi.


Speaking honestly, RISC-V is so far a turn-off when picking a chip for a design.

ARM has by faaaaar the best tooling in the industry, and it is more open than not: the baseline is built on the GNU toolchain, with only the fancier debug tools being paid/locked down.

RISC-V has not yet benefited from its openness as far as tooling goes.

However, RISC-V would still be infinitely better than Cadence, who want $100k for a basic (and terrible) debugger with CoreSight-like functionality.


It just means it isn't made for hobbyists... Hardware companies don't care about the lack of good software - they just pay some embedded code monkey a lowish salary to 'just make it work'.


"embedded code monkey"

Is this really the case? :(


It's a pretty uncharitable description, but yeah. There are only two types of companies that do this, though. The first is companies operating at such a scale that spending a few extra months to years of developer time doesn't make a dent compared to the BOM savings. You're more likely to get vendor support through name recognition at these kinds of companies, so it's not as bad as the hobbyist experience. The second is where either the project or the company can't afford the capital outlay for better-documented chips (or support packages). Stay the heck away from these if you value your sanity or know your own worth.


Is it even possible to make as much as in the hype fields (webdev, data science, ML) in embedded? Are there opportunities for startups in the HW sector?


Sadly no - apart from a very tiny number of Silicon Valley companies, everyone else in embedded tech seems to pay the people who write Verilog and assembly the same as the people who make the schematics and the mechanical designs. That's sometimes only 40% of what people who write Python and Node.js get in the same city...


I think that's partly due to market forces and the value produced per developer. A few webdevs can write software that enables an entire web company in a few months. Due to the poor state of the embedded development ecosystem, the same is not true in hardware, compounded by the time/effort required to work around bugs in the hardware, plus working in error-prone C. Then embedded developers/companies don't share workarounds via open source, so every team must spend time working around already-solved issues.

Personally, I'm using the ESP32 largely because of the open-source SDK that didn't require an NDA or some crappy IDE, and then I ported Nim to it. But normally it might take 2-3x more embedded devs to create products with profits comparable to similar devs in pure software.


If you're doing automation of some kind in the bay, embedded has roughly comparable payscales to other dev jobs. Beyond that, no. You'll generally be paid slightly less than webdevs (but still far more than most engineers make).


In my humble experience, slightly less ≈ 30% to 40% less.


Wow, that's a much bigger differential than my experience. Are you comfortable sharing the general region so I can avoid it?


I have been involved in embedded systems for more than 25 years now, and I barely do "pure" embedded development these days - that is, projects that comprise only designing a specific PCB, or writing the firmware drivers and top-level application for it.

Except for a couple of years I have resided mostly outside of California, but when doing consulting I have mostly engaged with companies in the Bay Area.

In the past when I did pure embedded development I had to fight tooth and claw to get rates close to $100/hour, while at the same time iOS app developers or webdevs were easily starting at $120-$140/hour, even after the mobile app craze. That was despite my being both a hardware and a software engineer. Rates outside of California were much lower ($45-$70/hour at the time), which was one of the reasons I pushed hard to find my clients in the Bay Area instead.

These days, since I have more experience and business contacts, I have diversified into more complex projects that have embedded components and pay me better, since they belong to regulated industries. Even now I work with full-stack software programmers doing AWS/GCP cloud apps, React/Vue frameworks, modern databases and connected middleware who make around $200/hour.

For comparison, in the last few years I have been involved in the development of a medical device where I did the hardware design (MCU/FPGA, signal acquisition), HDL/firmware programming (Verilog, RTOS, Linux, BLE) and technical host tools (C++, Python, Qt), and was paid $180/hour, and lately a mobile robot project where I did the hardware architecture, integration and system control loops (MCUs) and prototyped the high-level software application (RT-Linux, ROS, Nvidia Jetson) for $150/hour. An acquaintance in the valley who was working on a similar project, but entirely in the ML/CV stack, was billing $270/hour. That may be a bit of an extreme example since ML is hot today, but nevertheless.


Is this the same in Europe or outside of the Bay Area? It really isn't fair, since you are doing the actual full stack (from F_freaking_PGA to C++/Vue on the host) while the ML type is invoking an amalgamation of libraries to do something with a 90% confidence level.


> Is there opportunities for startups in HW sector?

Yes, but certainly not in the Western hemisphere.


Starting out no, but there is plenty of demand for experienced people. If you don't have experience then build up a portfolio.


Your best bet is anti-trust:

Basically, if there are more customers but smaller orders, the wasteful overhead of customers reverse engineering the product will grow. But then there's more incentive for HW companies to overtake the competition by offering better docs.


This. If it doesn’t work with the Arduino IDE/PlatformIO, I would rather not use it. What these manufacturers should do is invest in integrating with these SDKs instead of creating their own. And they should document everything to the level that IPFS is documented (just an example of excellent documentation that comes to mind). Do those two things and you’ve got a winner even if the device itself isn’t technically the superior choice.


Because, to be frank, Arduino is a bit of a toy platform.

It is really good for beginners and for hobbyists to make embedded devices available to a wider audience, but nobody in their right mind would build an actual product on top of Arduino.


People trying to build "real" products on top of Arduino usually do so in spite of, not thanks to, the platform.

Once you start trying to build a polished experience, not a quick hack, you quickly run into limitations that are solvable only by dropping to lower levels. I ran into this when I tried to make an ESP8266 temperature sensor: the Arduino stuff didn't have working power management, so the chip would heat up and skew the measurements, nor anything like threading or coroutines for network handlers, so a single stuck network request would block all the others. Ironically, the Arduino port included a coroutines implementation under the hood, but used it in a silly way, with a single coroutine. I rebuilt the thing on top of the ESP SDK, stole the thread-switching code from Arduino, and used it to build a nice low-power version that could handle up to 4 requests at once.

I don't think I've ever seen a well-engineered, non-trivial product built on Arduino. They might've hacked on it until the user experience was good, but then you peek under the hood and it's clear the engineering isn't. And that ends up creeping into the user experience in the end, or worse. (OnlyKey comes to mind: it's built on Arduino, and its authors clearly do not have the competence to be writing security/crypto code, but they seem unwilling to accept any criticism.)


I know of a good production line that has seen improvements in efficiency and safety due to automation with Arduino. I’m sure there are better ways but in terms of bang for buck, it’s hard to see how a non-programmer could have done better.


That doesn't sound like using Arduino in a product, though?


No, they are on big bits of equipment.


What is the best open source alternative? Ideally one that works similar to PlatformIO where you aren’t tied to a specific IDE.


On things I've personally worked on, I tend to just use the basic libraries supplied by the vendor, or even just implement everything myself from reading data sheets. This is obviously realistic mainly for very small and simple systems, which is what I've usually worked with.


Well, Prusa did with their printers.


They would be one of an exceedingly small minority, then.


Almost every consumer 3D printer out there runs on top of Arduino. Sure, that's a small market compared to all electronics, but it's still a big market in absolute terms.


This was true three years ago; plenty of 3D printers are now using ARM. The Monoprice MP Mini Delta has been out for about two years now at a price of $175 US. Marlin 2.0 has support for ARM; I think it would be accurate to say "almost every consumer 3D printer out there runs on top of Marlin", although there are a few alternatives like Smoothie.


>stm

The GD32V is an STM32 clone (peripherals, memory map), except using RISC-V rather than ARM.


That is going to put a dent in ARM's revenue. Even if that particular chip isn't a big seller, it means ST now has a drop-in replacement for ARM in their other SoCs. Soon people will realize that the ISA is often irrelevant; the peripherals, libraries, and tools are what matter.


That's not ST, it's by the Chinese company GigaDevice that makes (decent) clones of the STM32 series. Your point still stands though.


I'm not so sure. About the ISA, there is this: https://news.ycombinator.com/item?id=24958423


That's one opinion. Another ARM engineer, who contributed a heck of a lot more to ARM's success (he designed the ARM7TDMI, Thumb, and Thumb-2), says RISC-V is great. See 51 minutes into this: https://www.youtube.com/watch?v=_6sh097Dk5k


Any ISA can be dissected, but ultimately RISC-V's got the mindshare and the license.


There's no solution, only tradeoffs.



