For anyone interested in the structure of the software that ran Ingenuity (and some hardware design aspects, such as the use of commercial off-the-shelf parts), there is an excellent and criminally underwatched video of Timothy Canham explaining everything:
I always thought it was crazy that NASA uses an FPGA/microcontroller/cell-phone-SOC setup, which makes total sense to me, while mundane industrial things like the traffic lights at an intersection need a giant cabinet with shelves of Siemens controllers. It feels like such overkill in comparison; you can see them on this channel: https://www.youtube.com/watch?v=udpB-en9KKM . The guy is always arguing with the commenters that it's all needed for safety. I never could figure out why some company doesn't come up with a NASA-sized solution to control the world's intersections instead.
It's not needed for safety, but for liability.
The equipment is costly because it needs to be certified, and every piece of equipment inside it needs to be certified as well.
Engineers in this field are averse to new technology because they are liable if something goes wrong.
In fact, NASA wouldn't use that sort of SOC for anything critical like the rover itself. Ingenuity was always a marginal experiment with a correspondingly high appetite for risk and ability to accept budgetary compromises.
Sure, it was a test platform, meant to prove that a less rad-hardened design built from more novel commercial parts can work. I'd imagine future rovers will definitely use a similar approach.
I'm curious: what do they use, then? Using an FPGA for sensor gathering and some guidance loops, with the microcontroller handling logic control, seems sensible. I've run into scenarios where just the sensor gathering/IO can take up a significant portion of the microcontroller's time slice.
https://www.youtube.com/watch?v=mQu9m4MG5Gc&t=7s