The physics engine I'm using is called MuJoCo. And if you're wondering why I didn't write my own physics engine, it's basically because I don't have 20 years.
It's what put MuJoCo on my radar recently! But I was surprised not to see him do any kind of gradient descent to optimize his hyperparameters. MuJoCo has a JAX backend, so it should be fairly straightforward.
I'm pretty sure he has used gradient descent in previous videos to optimize systems, maybe this time it was just easier to hand tune rather than set up an optimization feedback harness around MuJoCo.
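For what it's worth, the kind of optimization feedback harness being described can be sketched in a few lines. Everything below (the `simulate` function, the single `friction` parameter, the learning rate) is a made-up stand-in for a real MuJoCo rollout, and with the JAX backend the finite-difference gradient would be replaced by `jax.grad` on a differentiable MJX rollout:

```python
# Minimal sketch of gradient-based calibration of one simulation
# parameter against observed data. The toy simulate() below is a
# hypothetical stand-in for a physics-engine rollout.

def simulate(friction: float) -> list[float]:
    # Stand-in for a rollout: some quantity decays with friction.
    return [(1.0 - friction) ** t for t in range(10)]

def loss(friction: float, observed: list[float]) -> float:
    # Sum of squared errors between simulated and observed trajectories.
    return sum((s - o) ** 2 for s, o in zip(simulate(friction), observed))

def calibrate(observed: list[float], lr=0.01, steps=200, eps=1e-5) -> float:
    friction = 0.5  # initial guess
    for _ in range(steps):
        # Central finite-difference gradient; with MJX one would use
        # jax.grad instead of differencing.
        g = (loss(friction + eps, observed) - loss(friction - eps, observed)) / (2 * eps)
        friction -= lr * g
    return friction

observed = simulate(0.2)  # pretend these measurements came from reality
print(round(calibrate(observed), 3))  # should approach 0.2
```

The point is just that once a scalar sim-vs-reality loss exists, the calibration loop itself is small; the hard part is the loss, which is what the next comment is about.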
Though in the end he had to resort to manual calibration. He always has interesting problems for domain experts, and I'd like to see him team up with one. Also with programmers, for faster programs than self-taught Python.
In the video in question, he doesn't seem able to choose a good scoring function for the stochastic solver (even over multiple weeks), seemingly settling on a linear sum of distances between simulation and reality (see 8:50). That's a mistake not even an undergraduate should make. He needs some domain experts.
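To illustrate the complaint: a plain linear sum of distances can't distinguish one large deviation from many small ones, whereas a sum of squared distances (ordinary least squares, the usual default) penalizes the big miss much harder. The numbers below are made up for illustration:

```python
# Toy comparison of two scoring functions for sim-to-real error.

def linear_score(errors):
    # Linear sum of absolute distances.
    return sum(abs(e) for e in errors)

def squared_score(errors):
    # Sum of squared distances (least squares).
    return sum(e * e for e in errors)

many_small = [0.25] * 4  # tracks reality loosely everywhere
one_large = [1.0]        # matches reality except one big miss

print(linear_score(many_small), linear_score(one_large))    # 1.0 1.0
print(squared_score(many_small), squared_score(one_large))  # 0.25 1.0
```

The linear score rates both cases identically, so the solver has no reason to prefer one over the other; the squared score clearly favors the trajectory without the large miss.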
But the text file has some markup syntax beyond human language? Point being, LLMs are subpar at acting on formal grammars: like cracking a nut with a sledgehammer. That's why it's important that tools like 11ty and pandoc remain.
That’s somewhat true (in my case it’s laughably simple, though).
I also never said that tools like pandoc are obsolete now. Just in my case they are already overpowered and I might migrate to something simpler soon.
OTOH I might just run the current version of 11ty indefinitely and never upgrade.
A local LLM feels like the wrong tool for a file converter? LLMs shine at natural language processing, but their statistical nature doesn't fit consistent file conversion the way deterministic programs do.