Visualizing a Black Hole

A real-time simulation of the visual appearance of a Schwarzschild Black Hole.

Go to live demo

(This definitely needs a modern browser with WebGL support. It also works on many smartphones.)

(I've also written a more sophisticated raytracer, if you prefer prettier pictures that move less.)

What am I looking at?

Physicists/astronomers can skip this.

A black hole is an extreme astronomical object that is the final stage of the evolution of some stars. Its gravitational field is so intense that it bends the path of light rays, giving rise to visible distortions. This is a simulation of what you would see from a close distance to such an object, against the backdrop of the Milky Way.

The most prominent feature is the black, featureless disc. This is the Event Horizon, or rather the image of it; the EH is the so-called "surface of no return", a membrane that can only be crossed by ingoing objects. Since nothing, including light, can escape the EH, its image appears pitch black.

Around the EH, the whole sky is visible, distorted. Multiple images of the same objects can be seen. Occasionally, when the viewer, the BH, and a distant star are suitably aligned, the star's image is distorted into a thin ring of light called an Einstein ring.

Details

Controls

The simulation is interactive. This is what you can do:

Mathematical background

Our black hole is the simplest of its kind, a Schwarzschild BH, with zero charge and angular momentum:

\[ ds^2 = \Big(1-\frac{1}{r}\Big) dt^2 - \Big(1-\frac{1}{r}\Big)^{-1} dr^2 - r^2 d\Omega^2 \]

I'm using, of course, units where \( c = 1 \) and \(r_S = 1 \). From here, one could derive the Christoffel symbols, write down the geodesic equation, and integrate it for every pixel. Easy, right? Of course not. This is a massive computation I have no intention of ever performing in my life. Other people have done full-on geodesic raytracing (this guy is pretty awesome, but it's orthogonal to real-time; this one was off to a fantastic start, doing real-time 4D raytracing on the GPU, but got the Schwarzschild metric wrong when converting to Cartesian coordinates! And this thing here is fun to play with, but seems quite inaccurate, especially after taking a look at the code), but that brute-force approach is prohibitively expensive, and wasteful for a spacetime as symmetric as Schwarzschild's. Isometries of spacetime correspond to constants of motion for free-fall geodesics, which means the geodesic equation takes on a much simpler form.

In particular, we center our reference frame on the (temporarily massive) particle's orbital plane (\( \theta = \pi/2 \)). Notice that by doing so we have fixed one of the constants of motion, namely the direction of the pseudo-angular momentum. We have two additional constants, which we can read off directly from the metric: \[ \left( 1 - \frac{1}{r} \right) \frac{dt}{d\tau} = e \] \[ r^2 \frac{d\phi}{d\tau} = l \] These are recognized as the pseudo-energy and pseudo-angular momentum associated with the time-translation and \(\phi\)-rotation Killing vectors. Plugging these conservation laws back into the metric yields the equation for the trajectory: \[ \left(\frac{dr}{d\phi}\right)^2 = \frac{r^4}{b^2} - \left( 1 - \frac{1}{r} \right) \left( \frac{r^4}{a^2} + r^2 \right) \] where \(b = l/e\) and \(a = l\); however, inspired by Newtonian tradition, we perform the very useful change of coordinates \(r \rightarrow u = \frac{1}{r} \): \[\left(\frac{du}{d\phi}\right)^2 = \frac{1}{b^2} - (1-u)\left(a^{-2} + u^2\right) \] Finally, we send \(a \rightarrow \infty \) to obtain the trajectory for a massless particle such as a photon. We encounter no difficulties with this limit, since we have factored proper time out of our equation.
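Explicitly, in this photon limit the conservation law becomes: \[ \left(\frac{du}{d\phi}\right)^2 = \frac{1}{b^2} - u^2 \left( 1 - u \right) \] with \(b = l/e\) playing the role of the impact parameter.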

And here, I recognized that the equation above is of the form of an energy conservation equation for a standard, Newtonian mechanical system, with \(\frac{du}{d\phi}\) in the role of velocity: the LHS is the kinetic energy, the RHS is minus the potential. The corresponding Newton's equation is: \[ u''(\phi) = - u \left( 1 - \frac{3}{2} u \right) \] which is satisfyingly simple, and took a relatively minuscule amount of work to obtain. This second-order equation is preferable to the first-order equation above because the latter is a numerical nightmare, rich with square roots, inflection points, and ambiguities. The \(u''\) equation must be understood as the real equation of motion for \(u(\phi)\), while the \(u'^2 \) one is akin to a Hamiltonian conservation law, the implicit equation of orbits in the \((u,\pi)\) plane, \(\pi\) being the conjugate momentum to \(u\).
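For the record, this follows by differentiating the photon conservation law with respect to \(\phi\) and dividing out the common factor \(2u'\): \[ 2\,u'u'' = \left( -2u + 3u^2 \right) u' \quad \Rightarrow \quad u'' = \frac{3}{2}u^2 - u \]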

Strong emphasis must be put on the fact that the equation's simplicity is dependent on the spacetime's symmetries.

Circular Orbit

Circular orbits of massive objects are possible at any radius above the photon sphere \(\left(\frac{3}{2} r_S \right)\), though they're only stable when \(r>3 r_S\); still, we'll allow our observer to force himself onto these unstable orbits too. I define the orbital speed \(\beta\) as the velocity measured locally by a hovering (static) Schwarzschild observer: \[ \beta = \frac{r}{\sqrt{1-\frac{1}{r}}} \frac{d\phi}{dt} = \frac{1}{\sqrt{2}} (r-1)^{-\frac{1}{2}} \] The derivation of the latter result is sketched, for example, here. This actually is the relative velocity between a fixed-Schwarzschild hovering observer and a circularly orbiting observer when they pass through the same event.
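As a sanity check, at the photon sphere \( r = \frac{3}{2} \) the formula gives \[ \beta = \frac{1}{\sqrt{2}} \left( \frac{1}{2} \right)^{-\frac{1}{2}} = 1 \] i.e. the orbit there is lightlike, consistent with the fact that only photons can circle at that radius.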

This ultimately implies that we can obtain the view of the orbiting observer just by correcting the data we already acquired for the static Schwarzschild observer with the local Lorentz transformation. Namely, we only need to apply aberration, or relativistic beaming (redshift is ignored for clarity). Incoming photons in our eyes have their 4-momenta transformed as (in local inertial Cartesian coordinates): \[ k^\mu = (k^0,\vec k) \rightarrow k'^\mu = \Lambda^\mu{}_\nu k^\nu = \left(\gamma \left( k^0 + \beta k^1 \right), \gamma \left(\beta k^0 + k^1\right), k^2, k^3\right) \] (I assumed the left-right direction, the one tangent to the orbit, is the x direction). The change in energy is the redshift/blueshift that we ignore. The change in momentum, instead, is aberration, and affects the direction from which we see the light ray come. To compute the final momentum we need the original energy, but since these are photons, we know that \[ k^\mu k_\mu = 0 \Rightarrow k^0 = \left| \vec k \right| \] (We can use \( g_{\mu\nu} = \eta_{\mu\nu} \) since we are expressing \(k\) and \(k'\) in local inertial Cartesian coordinates.) This allows us to correct for aberration.
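A minimal sketch of this correction, in Python rather than in the actual GLSL, assuming a boost with speed \(\beta\) along the local x axis (the function name is mine, purely illustrative):

```python
import numpy as np

def aberrate(k, beta):
    """Apply relativistic aberration to an incoming photon's spatial momentum.

    k    : 3-vector (kx, ky, kz) in the local inertial frame of the static observer
    beta : orbital speed of the observer; the boost is along the x axis

    Returns the unit direction seen by the boosted (orbiting) observer.
    The change in energy (redshift) is discarded, as in the text.
    """
    k = np.asarray(k, dtype=float)
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    k0 = np.linalg.norm(k)                # photon: k^0 = |k|, since k^mu k_mu = 0
    kx = gamma * (beta * k0 + k[0])       # boosted x component: k'^1 = gamma*(beta*k^0 + k^1)
    k_prime = np.array([kx, k[1], k[2]])  # y and z components are unchanged
    return k_prime / np.linalg.norm(k_prime)
```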

Implementation

I'll be very schematic:

The raytracer

A Python/NumPy/SciPy program takes lists of initial conditions for rays and, after converting them to \((u,u')\) phase points, uses the odeint() function to integrate the differential equation numerically. It's fully multicore and computes ray paths in parallel. It outputs the full set of raw light paths around the black hole in the form \(u(\phi) \).
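The core integration step might look something like this (a minimal sketch, not the module's actual API; trace_ray is an illustrative name):

```python
import numpy as np
from scipy.integrate import odeint

def rhs(y, phi):
    """The photon equation u'' = (3/2) u^2 - u, as a first-order system in (u, u')."""
    u, up = y
    return [up, 1.5 * u**2 - u]

def trace_ray(u0, up0, phi_max=4 * np.pi, steps=2000):
    """Integrate a single light ray from the initial phase point (u, u')."""
    phis = np.linspace(0.0, phi_max, steps)
    path = odeint(rhs, [u0, up0], phis)
    return phis, path[:, 0]               # u as a function of phi
```

For example, trace_ray(1.0 / 10.0, 0.02) would trace a ray launched from r = 10 Schwarzschild radii.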

The texture generator

A Python script queries the raytracer for a large set of initial conditions, at various radii (from 1 to 10 Schwarzschild radii) and viewing angles. It searches the traced paths for zeroes of \(u(\phi) \): these correspond to asymptotes of scattering orbits, and the angle at which they occur is thus indicative of the final direction of the ray. The program generates a 512x512 RGB texture, where x is the viewing radius and y is the viewing angle, and the colour information encodes the deflection angle.
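The asymptote search could be done along these lines (a sketch; the sign-change detection and linear interpolation are my assumptions, not necessarily the script's exact method):

```python
import numpy as np

def asymptote_angle(phis, u):
    """Return the angle phi at which u(phi) first crosses zero,
    i.e. the direction of the outgoing asymptote (r -> infinity)."""
    sign_change = np.nonzero(np.signbit(u[1:]) != np.signbit(u[:-1]))[0]
    if len(sign_change) == 0:
        return None                       # no zero: the ray was captured by the BH
    i = sign_change[0]
    # linear interpolation between the two samples bracketing the zero
    t = u[i] / (u[i] - u[i + 1])
    return phis[i] + t * (phis[i + 1] - phis[i])
```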

The WebGL application

The only part that runs on your computer is this WebGL applet. Its shader is fed the texture above, which it uses as a lookup table. For each pixel in the viewport, it first inverts the effect of aberration according to your orbital velocity. Then it computes the viewing angle and reads the predicted deflection from the texture. With a bit of vector algebra it calculates the final 3-vector pointing to the region of sky the photon originated from, and reads the colour of that point from a skysphere texture of the Milky Way.
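The "bit of vector algebra" amounts to rotating the de-aberrated viewing direction by the deflection angle, in the plane containing the view direction and the black hole. A Python sketch of the idea (the actual shader does the equivalent in GLSL; the function name is mine):

```python
import numpy as np

def sky_direction(view_dir, bh_dir, deflection):
    """Rotate the (de-aberrated) viewing direction by the deflection angle,
    in the plane spanned by the view direction and the black hole direction,
    to find where on the sky sphere the photon originated."""
    v = np.asarray(view_dir, dtype=float)
    v /= np.linalg.norm(v)
    b = np.asarray(bh_dir, dtype=float)
    b /= np.linalg.norm(b)
    axis = np.cross(v, b)                 # normal to the ray's orbital plane
    axis /= np.linalg.norm(axis)          # (degenerate when looking straight at the BH)
    c, s = np.cos(deflection), np.sin(deflection)
    # Rodrigues' rotation formula
    return v * c + np.cross(axis, v) * s + axis * np.dot(axis, v) * (1.0 - c)
```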

[Figure] Photon orbits starting at the photon sphere (r = 1.5) with varying directions. Note how the circular orbit at the photon sphere is unstable.
[Figure] Scattering of incoming photons from infinity with various impact parameters. Note how the "cross section" for photon absorption is much larger than the event horizon. Consequently, the image of the horizon appears more than twice as big as the horizon itself, and all of its surface is visible simultaneously.
[Figure] The deflection angle map. The colour bands are due to the encoding scheme (one colour channel is not precise enough).

F.A.Q.

"Can you add an accretion disk like in Interstellar? Can you add objects?"

Sadly, no. The GLSL shader knows nothing about the photon paths beyond the deflection angle, and does not do raytracing itself (that's how it can run in your phone's browser). You need the full path to compute intersections with objects that aren't at infinity. Still, one could use the Python module I wrote to build a true raytracer with standard global illumination capabilities, and render some really cool pictures (overnight, not real-time). This was the original idea for which I developed the module, but I ran into various difficulties and the render times were disappointing. Maybe in the future. UPDATE: it is now the future!

"What are the weird ripples? What are the strange bands when zooming in close to the edge of the event horizon's image?"

They're graphical artifacts, probably due to texture colour depth precision, that I've been trying to debug for days. They're not genuine physical phenomena.

"Why is the BH not in the center of the screen? Why is it not circular?"

You're looking straight at the BH, but the aberration of light due to your orbital velocity distorts and shifts its image.

"Is time dilation considered?"

No, timescales are arbitrary. Doing otherwise would make the visualization incomprehensible, and the message would be lost on most viewers. Interpret this as meaning that there is no correspondence between visualization time and the proper time of the observer.

"Are you sure time dilation doesn't affect the visual appearance?"

Yes, provided you assume everything you observe (the BH and the sky) is eternal and stationary in Schwarzschild coordinates. Then the coordinate time at which the light ray you observe now was emitted does not affect the colour of the light ray. This is why I was able to factor time out of the equations and solve them with reasonable computational resources. If there were moving objects, instead, I would have to correct for the delay by intersecting the past worldline of the light ray with the "world-tube" of the object; this is the classical problem in electromagnetism of intersecting past light cones with charged particles' worldlines to find the retarded time, only in curved spacetime. If you are curious about what this kind of thing would look like, I urge you to visit the fantastic homepage of Prof. Andrew Hamilton, from the last millennium and still the most amazing website in history. It includes various accurate and schematic raytraced animations (.gif animations from so long ago that it was written GIF, and they were accompanied by a warning about their size in KBs) of black holes, thoroughly explained.

"Can you make it so that you can move around?"

Not sure. I probably could, but I won't do it right now.

"Can I fork you on Github?"

You are encouraged to.

"Can you do other spacetimes?"

I can do some other very symmetric geometries, but each would need a distinct, albeit similar, program. Maybe flat FLRW (the Big Bang and the expanding Universe).