cross-posted from: https://programming.dev/post/15448730

To learn how to program holograms, I'd like to gather some sources in this post.

The linked paper describes an already-optimised way of rendering holograms. I'd like to find a naive implementation of a hologram, e.g. in ShaderToy, that uses interferometric processing of stored interference patterns the way a physical hologram works (I guess). I also want this post to be a resource for learning how laser holograms work in real life.
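To make the idea concrete, here is a rough NumPy sketch of the recording step as I understand it, not taken from the linked paper: points sampled on a sphere act as scatterers, a tilted plane wave serves as the reference beam, and the plate stores the intensity of their interference. The wavelength, plate size, reference tilt and point-scatterer object model are all arbitrary placeholder assumptions.

```python
import numpy as np

wavelength = 633e-9                     # HeNe-style laser, metres (arbitrary choice)
k = 2 * np.pi / wavelength
N = 1024                                # hologram resolution in pixels per side
plate = 2e-3                            # physical plate size, metres

# Hologram plane at z = 0, sampled on a regular grid
x = np.linspace(-plate / 2, plate / 2, N)
X, Y = np.meshgrid(x, x)

# Object: points sampled on a small sphere some distance in front of the plate
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
pts *= 0.5e-3                           # sphere radius 0.5 mm
obj_dist = 20e-3                        # sphere centre 20 mm from the plate

# Object wave at the plate: sum of spherical waves scattered from each point
obj = np.zeros((N, N), dtype=np.complex128)
for px, py, pz in pts:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + (obj_dist + pz) ** 2)
    obj += np.exp(1j * k * r) / r
obj /= np.abs(obj).max()                # roughly balance object and reference beams

# Reference wave: plane wave tilted by 1 degree (off-axis geometry)
theta = np.deg2rad(1.0)
ref = np.exp(1j * k * np.sin(theta) * X)

# Recorded intensity is the interference pattern; quantise it like an 8-bit texture.
# The pixel pitch (~2 µm here) must stay below half the finest fringe spacing,
# which is what limits the usable viewing angles in practice.
hologram = np.abs(ref + obj) ** 2
hologram = (hologram / hologram.max() * 255).astype(np.uint8)
```

The resulting `hologram` array is exactly the kind of texture step 1 below asks for; a GLSL version would evaluate the same sum of spherical waves per pixel.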

To create an introductory project on holographic rendering, the following steps will be required:

  1. Store the interference pattern of a sphere or a cube in a texture (the recording sketch above simulates this step). This will serve as the model of our physically correct hologram. Note: if this step requires saving thousands of textures, we should limit the available viewing angles (if that helps)
  2. Load the rendered patterns as a texture or an array of textures into a WebGL program
  3. Create a shader that does the interferometric magic to reconstruct the sphere/cube from the hologram model (a NumPy sketch of this reconstruction follows the list)
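For step 3, the shader would evaluate the same arithmetic per output pixel. As a hedged offline stand-in, the sketch below continues the recording sketch above (reusing `wavelength`, `k`, `N`, `plate`, `theta`, `X`, `obj_dist` and `hologram` from it): it replays the stored pattern with the reference beam and back-propagates it to the object plane using the angular spectrum method. That is one standard reconstruction approach, not necessarily what the linked paper does.

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a sampled complex field a distance z (metres) along the optical axis."""
    k = 2 * np.pi / wavelength
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)            # spatial frequencies in cycles/m
    FX, FY = np.meshgrid(fx, fx)
    arg = k ** 2 - (2 * np.pi * FX) ** 2 - (2 * np.pi * FY) ** 2
    kz = np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)        # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

pitch = plate / N
# Replay: illuminate the stored (real-valued) pattern with the reference beam
replay = hologram.astype(np.float64) * np.exp(1j * k * np.sin(theta) * X)
# Back-propagate to the plane the sphere was recorded at; the object term of the
# hologram refocuses into points there, while the DC background and the twin
# image do not focus at that plane.
image = np.abs(angular_spectrum(replay, wavelength, pitch, -obj_dist)) ** 2
```

In the actual project, `hologram` would be the texture loaded in step 2 and the replay/propagation would run in a WebGL fragment shader instead of NumPy.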

The performance of the solution is irrelevant. Even if it takes an hour to generate the data and a minute to render one frame at low resolution, that's fine.

Note: The goal is not to create something that merely looks like a cool hologram, nor to render 3D objects with volume as with SDFs or volume rendering. It's all about creating a basic physical simulation of viewing a real hologram.

  • Pawel@programming.dev (OP) · 4 months ago

    Thanks! Finally something concrete. Once I return to this to write a POC I’ll revisit your tips here.