Are there any good resources for learning how to render dichroic thin films with multiple internal reflections and wavelength-dependent refraction indices? Basically, I would love to render something like this [1] in real time.
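For context, the color of a single dichroic layer comes from interference between the film's internal reflections, which at one wavelength reduces to the Airy summation. A minimal sketch at normal incidence (function name and parameter choices are mine, not from any particular engine):

```python
import cmath
import math

def thin_film_reflectance(wavelength, n1, n2, n3, thickness):
    """Reflectance of a single thin film (index n2) between media n1 and n3,
    at normal incidence, via the Airy sum of the film's internal reflections.
    wavelength and thickness must be in the same units (e.g. nm)."""
    r12 = (n1 - n2) / (n1 + n2)  # Fresnel amplitude coefficients
    r23 = (n2 - n3) / (n2 + n3)
    beta = 2.0 * math.pi * n2 * thickness / wavelength  # phase across the film
    phase = cmath.exp(2j * beta)
    r = (r12 + r23 * phase) / (1.0 + r12 * r23 * phase)
    return abs(r) ** 2

# A quarter-wave film with n2 = sqrt(n1 * n3) cancels reflection at the design
# wavelength while reflecting other wavelengths more strongly -- that
# wavelength selectivity is what makes dichroic coatings look colorful.
```

Sampling this per wavelength (instead of once for RGB) is what produces the shifting rainbow sheen.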
For a very simple approach, sample a specific wavelength with each ray, and use wavelength-dependent refraction indices (e.g., from the Sellmeier equation) whenever you compute refraction or total internal reflection. Then tint the ray by the color and perceived intensity of its wavelength when averaging it into the image pixel.
A naive spectral path tracer can be less complex (code-wise) than a "normal" one: you trace each photon with a single wavelength instead of tracing RGB (or some other full-spectrum representation) per path. Since this abstraction is closer to the physics, it makes a lot of code easier to reason about. Wavelength-dependent refraction is straightforward: you just use the index of refraction for the wavelength of the photon you are tracing.
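For example, the refraction step in such a tracer is just Snell's law in vector form, fed with the per-wavelength IOR. A minimal sketch (vectors as plain tuples; a real tracer would use its own vector type):

```python
import math

def refract(d, n, eta_i, eta_t):
    """Refract unit direction d through a surface with unit normal n
    (pointing against d), going from IOR eta_i into eta_t.
    Returns the refracted direction, or None on total internal reflection.
    In a spectral tracer, eta_i/eta_t come from n(lambda) of the current ray."""
    eta = eta_i / eta_t
    cos_i = -(d[0] * n[0] + d[1] * n[1] + d[2] * n[2])
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: spawn a reflected ray instead
    cos_t = math.sqrt(1.0 - sin2_t)
    k = eta * cos_i - cos_t
    return (eta * d[0] + k * n[0],
            eta * d[1] + k * n[1],
            eta * d[2] + k * n[2])
```

Because the ray is monochromatic, there is no spectrum to split here; dispersion emerges for free from tracing many rays at different wavelengths.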
I find this code base extremely easy to read despite not being a C++ programmer (it has that Carmackian quality). Pretty easy to port to the language of your choice.
https://github.com/TomCrypto/Lambda
To get the kind of effect you see in the YouTube video you'd render diamond or something else with extreme refraction and dispersion.
This is far from real time, though, of course. Tracing one wavelength per ray converges roughly an order of magnitude slower than tracing full-spectrum (RGB) rays, since each path carries only a single spectral sample. So doing it in real time would probably work the way real-time subsurface scattering does: you just have to cheat. Perhaps the real thing can provide some inspiration, though.
I'm just an amateur who has played around with rendering in various ways, ray-traced and rasterized. To get results similar to the video (without hacks), you're gonna need caustics, refraction, and reflections, all together with realistic lighting. You're not gonna get that looking nice without a physically accurate render engine, which will most likely use ray tracing. And ray tracing (compared to rasterization) is not gonna run anywhere near real time (more like at least seconds per frame, if you really ramp down the samples, at the cost of a noisier image).
[1] https://www.youtube.com/watch?v=BVQ34NlTi-E&mute=1