
Colormap distortions Panel app

The notebook in this repo served two goals:

  1. As a playground for continuing to learn Panel (following James Bednar's awesome PyData tutorial, Panel: Dashboards for PyData, Austin 2019)
  2. To share an app demonstrating the effect of colormaps on perception (and on the ability to see fault edges on a seismic horizon)

The first version of the app was presented as a lightning talk at the Transform 2020 virtual conference organized by Software Underground; you can watch a video recording of the presentation here.

Conda setup

To create the conda environment for this tutorial run:

conda env create -f environment.yml

To run the notebook interactively with Binder, click on the button below:

Binder

How to Run on Binder (Step-by-Step)

Don't worry if you're new to this — it's easier than it looks!

Step 1: Launch Binder

  • Click the "launch binder" badge above (the rectangular button with the orange logo)
  • A new browser tab will open showing a loading screen
  • Be patient! The first time may take 1-3 minutes while Binder builds the environment
  • You'll see a progress log — this is normal

Step 2: Wait for JupyterLab to Open

  • Once ready, you'll see JupyterLab — a coding environment that runs in your browser
  • The notebook file (Demonstrate_colormap_distortions_interactive_Panel.ipynb) should already be open
  • If not, double-click on the notebook file in the left sidebar to open it

Step 3: Run the Notebook

  • Look at the menu bar at the top of the screen
  • Click on Run (in the menu bar)
  • Select Run All Cells from the dropdown menu
  • Alternatively, you can use the keyboard shortcut: hold Shift and press Enter repeatedly to run cells one by one

Step 4: Wait for the App to Load

  • The notebook will execute each code cell from top to bottom
  • You may see some output appearing as cells run
  • Scroll down to the bottom of the notebook — the interactive app will appear there
  • This may take 15-30 seconds after running all cells

Step 5: Use the App!

  • You'll see dropdown menus to select different colormaps
  • Change the colormap selection and watch the plots update
  • Compare how different colormaps affect the visualization (a minimal sketch of this dropdown-plus-plot pattern in Panel follows right after these steps)
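
If you are curious how this kind of dashboard is wired together in Panel, here is a minimal sketch of the widget-plus-plot pattern. It is not the actual code in the notebook; the data array and the colormap options are placeholders:

```python
import numpy as np
import matplotlib.pyplot as plt
import panel as pn

pn.extension()

# Placeholder array standing in for the seismic horizon used in the notebook
data = np.random.rand(100, 100)

# Dropdown widget to pick a colormap (the options here are arbitrary examples)
cmap_select = pn.widgets.Select(
    name="Colormap", options=["viridis", "jet", "rainbow", "cubehelix"]
)

def plot(cmap):
    """Redraw the image with the currently selected colormap."""
    fig, ax = plt.subplots(figsize=(4, 4))
    ax.imshow(data, cmap=cmap)
    ax.set_axis_off()
    plt.close(fig)  # avoid a duplicate static figure in the notebook output
    return fig

# pn.bind re-runs plot() whenever the widget value changes;
# displaying the Column in a notebook cell renders the interactive app
pn.Column(cmap_select, pn.bind(plot, cmap_select))
```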

Troubleshooting

  • If Binder takes too long: Sometimes Binder servers are busy. Try again in a few minutes.
  • If you see errors: Try clicking Kernel → Restart Kernel and Run All Cells from the menu
  • If the app doesn't appear: Make sure you scrolled all the way to the bottom of the notebook

How to use the app

If you would like some background, please read Crameri et al., 2020, The misuse of colour in science communication, Nat Commun 11, 5444, and my Society of Exploration Geophysicists tutorial Evaluate and compare colormaps.

Available Colormap Collections

The app includes 5 colormap collections to explore:

| Collection | Description | Examples |
|---|---|---|
| matplotlib | Standard Matplotlib colormaps | viridis, plasma, jet, rainbow, cubehelix |
| colorcet | Peter Kovesi's perceptually uniform colormaps | cet_fire, cet_rainbow, cet_bgy |
| mycarta | My custom perceptual colormaps (background) | matteo_cube, matteo_cubeYF, matteo_linear_L |
| crameri | Fabio Crameri's scientific colormaps (perceptually uniform, colorblind-friendly) | batlow, roma, vik, hawaii, oslo |
| cmocean | Kristen Thyng's oceanography colormaps (perceptually uniform) | thermal, haline, deep, solar, ice |
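
For context, here is one way these collections could be pulled into Python. This is only a sketch, assuming the packages are available in the conda environment; the mycarta colormaps ship as files in this repo rather than as a published package:

```python
import matplotlib.pyplot as plt
import colorcet   # registers its colormaps with matplotlib under 'cet_*' names
import cmocean    # Kristen Thyng's oceanography colormaps
from cmcrameri import cm as crameri_cm  # Fabio Crameri's scientific colormaps

viridis = plt.get_cmap("viridis")    # matplotlib collection
cet_fire = plt.get_cmap("cet_fire")  # colorcet collection
thermal = cmocean.cm.thermal         # cmocean collection
batlow = crameri_cm.batlow           # crameri collection

# The mycarta colormaps (matteo_cube, matteo_cubeYF, matteo_linear_L) are
# custom and would be built into matplotlib colormap objects by the notebook.
```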

The idea behind this app is to allow comparing any of a wide variety of colormaps to a good perceptual benchmark. As such:

  1. In the top row, I chose grayscale as the reference perceptually uniform colormap. (N.B. not all grayscale colormaps are truly 100% perceptually uniform when you plot Lightness, but this is a decent approximation.)

  2. The left column is purely a visual reference: the data with grayscale (top) vs. the data with the selected colormap (bottom). (N.B. I plan at some point to add an option to show a deuteranope simulation as an extra.)

  3. The real comparison is done in the middle column (it should be fairly intuitive): it shows only intensity, with a monochromatic palette, for the grayscale (top) and colormapped (bottom) versions; a rough code sketch of this intensity comparison and of the edge detection in item 4 appears after this list. A perceptual colormap with uniform incremental contrast at the bottom would look like the benchmark at the top. Below I am showing an example for Viridis:

Notice that the intensity plots are virtually indistinguishable. A perceptual colormap with uniform but lower total intensity contrast (from least to most intense) will still look similar to the reference, just a bit more washed out due to the shorter range. As an aside: I use intensity here for simplicity, as it may be a more familiar idea to most users; I will eventually switch to Lightness. Below is an example with one of my colormaps, CubeYF:

  4. The right column uses Sobel edge detection to enhance the visibility of potential artifacts caused by non-perceptual colormaps. Additionally, edge detection is typically an interpretation product, so it is a good way to show what to expect and how artifacts affect a real-world workflow. Below is an example with nipy_spectral, highlighting scarps (continuous arrow) and plateaus (dashed arrow):

  5. As further evidence, please compare the hillshade versions with contours:

  6. The interesting thing to me is that, according to this intensity-based tool, even a perceptual version of the rainbow (cet_rainbow) still has some issues. They are subtle, but definitely there: for example, the thin white strips (caused by hard yellow edges), indicated by the yellow arrows, and the artificial decrease in intensity of the red (indicated by the purple arrow), which gives the impression of lows where there should be highs:
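
To make the middle and right columns more concrete, here is a rough sketch of the kind of computation involved: apply a colormap to the data, collapse the resulting RGB image to intensity, and run Sobel edge detection on it. It is only an illustration of the idea, not the notebook's actual code; it uses a simple Rec. 709 luminance formula for intensity (the app's exact formula may differ, and as noted above the plan is to eventually switch to Lightness):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import ndimage

def intensity_and_edges(data, cmap_name):
    """Colormap the data, reduce it to an intensity image, and detect edges."""
    # Normalize the data to [0, 1] and apply the colormap -> RGBA array
    norm = (data - np.nanmin(data)) / (np.nanmax(data) - np.nanmin(data))
    rgba = plt.get_cmap(cmap_name)(norm)

    # Collapse RGB to a single intensity channel (Rec. 709 luminance here;
    # any similar weighting would do for this illustration)
    intensity = (0.2126 * rgba[..., 0]
                 + 0.7152 * rgba[..., 1]
                 + 0.0722 * rgba[..., 2])

    # Sobel gradient magnitude highlights edges, including artificial ones
    # introduced by a non-perceptual colormap
    edges = np.hypot(ndimage.sobel(intensity, axis=0),
                     ndimage.sobel(intensity, axis=1))
    return intensity, edges

# Comparing a candidate colormap to the grayscale benchmark:
horizon = np.random.rand(200, 200)  # placeholder for the seismic horizon
bench_intensity, bench_edges = intensity_and_edges(horizon, "gray")
test_intensity, test_edges = intensity_and_edges(horizon, "viridis")
```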

Further Reading

For more background and insights on colormap perception effects, check out my blog post: Busting bad colormaps with Python and Panel

License

Creative Commons License
This work is licensed under a CC BY Creative Commons License, with the exception of the data used (a seismic horizon from the Penobscot 3D which is covered by a CC BY-SA Creative Commons License).
