Scientific Computing

Matplotlib ValueError on LogNorm plots

Matplotlib log10-normalized plots are enabled by passing a norm plotting option:

pcolormesh(dat, norm=matplotlib.colors.LogNorm(vmin=max(dat.min(), LOGMIN)))

Recent Matplotlib versions disallow passing vmin= alongside norm=, so set vmin inside LogNorm() itself.

This option also works for appropriate 2-D plots from pandas.DataFrame.plot() and xarray.DataArray.plot().

Log(0) bounds error

Setting vmin=0, whether explicitly as a plot option or implicitly from data with a minimum of zero, in a log-norm pcolormesh() plot will cause errors like

ValueError: Data has no positive values, and therefore can not be log-scaled.

or

ZeroDivisionError: float division by zero

Fix

Choose a minimum plot value LOGMIN appropriate for plotting the data.

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

LOGMIN = 0.1  # arbitrary lower bound, as appropriate for log-scaled data display

dat = np.random.rayleigh(1., (50,50))

dat[0,0] = 0.  # forcing the ValueError to occur with LogNorm

fg = plt.figure(figsize=(12, 5), layout='constrained')
ax = fg.subplots(1, 2)
ax[0].pcolormesh(dat, norm=LogNorm(vmin=max(dat.min(), LOGMIN)))
# vmin inside LogNorm averts the ValueError by ensuring a positive color-data minimum.

ax[0].set_title('log')

ax[1].pcolormesh(dat)
ax[1].set_title('linear')

plt.show()

Matlab / GNU Octave

The equivalent code in Matlab / GNU Octave does not give an error.

dat = raylrnd(1., [50,50]);

dat(1,1) = 0;

pcolor(log10(dat))

Majority of new Python work is Python 3

(This post was originally from June 2017).

Supporting Python < 3.6 requires considerable additional effort, particularly when using concurrency and other performance features of Python with current versions of common modules like NumPy, Xarray, Matplotlib, etc.


Python 3 is used by a large and growing majority of new and active Python developers in science, engineering, medical research and education. Python 3 was released in December 2008. While initially there was pushback over backward incompatibilities, the performance, efficiencies and features of Python 3 have won out.

The most popular Python packages have supported Python 3 for some time now, as have cloud platforms such as Amazon AWS and Google Cloud Platform.

The main holdouts are of the same nature as those that hang on to COBOL systems: those with static, unchanging requirements in large proprietary codebases that few people are familiar with. Some programmers thrive and make a decent living servicing those legacy COBOL and Python environments. The majority of STEM coders, analysts and educators have been writing Python 3 code. The Python 3 objections were mostly written before 2016, and almost all before 2017. Some of those complaints were addressed in Python 3.6 (released December 23, 2016).

A main point of contention over Python 3 is the separation between bytes and strings. IoT and embedded systems applications distinguish between bytes and strings, so I appreciate that separation in Python 3. For the global audience I write for, I appreciate that strings are Unicode in Python 3.

Python 3 saves programmer time through more efficient syntax. The Python 3 core itself is as much as 10% faster in several aspects, with some standard modules like re processing regular expressions as much as 20x faster. The modernize tool and the six and __future__ modules smooth over most of these changes to make backward-compatible code. Some of the most enticing changes from Python ≥ 3.6 are not easily backportable. These features simplify syntax and make the code more self-documenting.

asyncio brings core Python features that used to require Tornado, twisted, etc. Asynchronous execution is required for applications that need to scale massively. IoT applications where remote sensors report in are a perfect use case for asyncio. asyncio is a convenient, relatively safe way to thread baked right into Python.
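As a minimal sketch of that pattern (the sensor names and delays below are invented for illustration), coroutines can be run concurrently with asyncio.gather:

```python
import asyncio

async def sensor_report(name: str, delay: float) -> str:
    # simulate a remote sensor that takes `delay` seconds to respond
    await asyncio.sleep(delay)
    return f"{name}: ok"

async def main() -> list:
    # gather() runs all reports concurrently: total wall time is roughly
    # the longest single delay, not the sum of the delays
    return await asyncio.gather(
        sensor_report("sensor-1", 0.2),
        sensor_report("sensor-2", 0.1),
        sensor_report("sensor-3", 0.15),
    )

results = asyncio.run(main())
print(results)
```

asyncio.run() itself requires Python ≥ 3.7; earlier Python 3 needed explicit event loop management.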

Function result caching (memoization) is enabled by a simple decorator. For example, suppose you have a computationally expensive function that is sometimes called with the same arguments, but you don't want to bother tracking the output by saving it in a variable. Caching is as simple as:

import functools

@functools.cache  # Python >= 3.9; use @functools.lru_cache(maxsize=None) on older versions
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print([fib(n) for n in range(16)])

print(fib.cache_info())

results in:

[0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610]

CacheInfo(hits=28, misses=16, maxsize=None, currsize=16)

Pip uses pyproject.toml to completely describe package metadata for installation.
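As a sketch (the package name, version and dependency here are made up for illustration), a minimal pyproject.toml describing both build and runtime metadata might look like:

```toml
[build-system]
requires = ["setuptools>=61.0.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my_package"        # hypothetical package name
version = "1.0.0"
dependencies = ["numpy"]   # runtime requirements, installed by pip
```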

Python type hinting is used by some IDEs to give code warnings, while not actually enforcing strict typing (unless you want it to).

import math

def two_sinxy(x: float, y: float) -> float:
    return 2*math.sin(x*y)

This function will not crash if fed an int on most interpreters, but IDEs such as PyCharm can emit warnings when conflicting variable types are passed in or out.

Python argument unpacking expands an iterable into a function call requiring multiple arguments, via *iterable. Multiple iterables can be unpacked into a single function call.
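For instance (the function and values here are made up for illustration):

```python
def volume(length, width, height):
    return length * width * height

dims = (2, 3, 4)
print(volume(*dims))  # → 24, same as volume(2, 3, 4)

# multiple iterables unpacked into one call
xy = [2, 3]
print(volume(*xy, *(4,)))  # → 24
```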

ipaddress is a useful standard library feature for manipulating IP addresses, used in the findssh program to scan for active servers in IP address ranges.
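A small example of what ipaddress provides, using the RFC 5737 documentation range 192.0.2.0/29:

```python
import ipaddress

net = ipaddress.ip_network("192.0.2.0/29")
print(net.num_addresses)  # 8 addresses in a /29

hosts = list(net.hosts())  # usable hosts, excluding network & broadcast addresses
print(hosts[0], hosts[-1])  # 192.0.2.1 192.0.2.6

addr = ipaddress.ip_address("192.0.2.5")
print(addr in net)  # membership test: True
```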

Object-oriented pathlib is standard library and replaces most os.path functions.
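A brief sketch of pathlib replacing os.path idioms (the file names here are hypothetical):

```python
from pathlib import Path

p = Path("data") / "results.csv"  # replaces os.path.join("data", "results.csv")

print(p.suffix)              # .csv     (os.path.splitext(p)[1])
print(p.stem)                # results  (basename without extension)
print(p.with_suffix(".h5"))  # handy for deriving output file names
```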

f-strings allow f'This is {weight} kg for {price} dollars.' instead of 'This is {} kg for {} dollars.'.format(weight, price)

Python 3.7 adds several more compelling reasons to upgrade.


Patreon transitioned from PHP → Python 3 in 2015. Key appeals for Patreon to switch to Python 3 included:

  • future-proofing
  • appeal to developer hiring, working with latest language
  • lower risk than porting to static typed language like Scala

Instagram runs fully on Python 3 as noted at the 2017 PyCon keynote at the 13 minute mark.


Starting in 2010, Arch Linux defaulted to Python 3. Ubuntu 17.10 defaulted to Python 3. Ubuntu 18.04 requires Python 3 for programs in the main Ubuntu repository, with Python 3.6 as the default. The goal is to demote Python 2 out of the main repository.

Executable Python scripts should continue to have the first line

#!/usr/bin/env python

so that users can configure their desired Python version. Many users install a third-party Python distribution such as Anaconda Python or Intel Python that includes high-performance math libraries, or add-on packages such as the CUDA-based CuPy.


  • Very detailed notes from Python Software Foundation Fellow Nick Coghlan on why, when, what, how, where of Python 3 transition with fascinating historical notes.
  • ActiveState saw majority of downloads being Python 3 since January 2017.

Upgrade Windows with dual-boot Linux

For a dual-boot Windows / Linux PC, set BIOS / UEFI to boot to the Windows hard drive, especially if using Windows BitLocker. If Windows won’t boot, the Windows HDD boot sector may need repair. Windows error 0x800703ed may occur if a dual boot system tries to start Windows from Grub instead of directly via BIOS / UEFI selection. It is best to use a separate hard drive for Windows.

Backup the PC to an external hard drive or the cloud. Create bootable Windows USB drive via Windows Media Creation Tool. Use Rufus to write the ISO to USB. Boot from USB and select a partition to install Windows on. Don’t delete Recovery partitions or you may lose your Windows OEM license.

Wget HSTS database

HSTS can enhance security, so normally we’d like to have HSTS working. If the Wget HSTS database file permissions are incorrect, wget may emit messages like:

Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
could not open HSTS store at '~/.wget-hsts'. HSTS will be disabled.

Fix: make the .wget-hsts file have normal file permissions:

chmod 644 ~/.wget-hsts

Detect tmpfs shmem RAM or HDD

/dev/shm/ and /run/shm/ map shmem shared memory to a RAM drive on typical Linux systems, useful for IPC. They are also present in Windows Subsystem for Linux (WSL). On WSL1 they write to the hard drive instead of RAM. It's easy to tell whether RAM or HDD is being used for shared memory, since RAM has GB/sec speeds vs. MB/sec for an HDD.

Note the free space in df -kh /dev/shm and free -h. Then write 1 GB to shmem:

dd if=/dev/zero of=/dev/shm/blah bs=10M count=100

Recheck df -kh /dev/shm and free -h to see which one has 1 GB more space used. This tells whether the hard drive or RAM was used.

For programs doing heavy writes to shared memory, an HDD-backed /dev/shm means:

  1. order(s) of magnitude slower /dev/shm/ operations than with RAM
  2. wear on a solid state drive, if /dev/shm points there

Example data bandwidths:

  • SSD: 100s of MB/sec
  • HDD: 10s of MB/sec
  • RAM: 1000s of MB/sec
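A rough benchmark makes the comparison concrete. This sketch writes to the system temporary directory for portability; on Linux, point directory at /dev/shm to measure shared memory. It fsyncs so the OS page cache does not inflate the number, though the result is still only indicative:

```python
import os
import time
import tempfile
from pathlib import Path

def write_bandwidth(directory: Path, size_mb: int = 50) -> float:
    """Crude write-bandwidth estimate (MB/sec) for a file in `directory`."""
    block = b"\0" * (1024 * 1024)  # 1 MB block of zeros
    target = directory / "bandwidth_test.bin"
    tic = time.monotonic()
    with target.open("wb") as f:
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data to the device, not just page cache
    elapsed = time.monotonic() - tic
    target.unlink()
    return size_mb / elapsed

# e.g. write_bandwidth(Path("/dev/shm")) on Linux to test shared memory
mbps = write_bandwidth(Path(tempfile.gettempdir()))
print(f"{mbps:.0f} MB/sec")
```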

macOS RAM drive

macOS doesn’t have tmpfs shmem enabled by default, but a macOS RAM drive can be created.

Switch Visual Studio project version

Even minor point releases of Visual Studio can cause significant behavior changes and ABI breakages. The full range of defaults that changes inside Visual Studio for a major release may be more than can be accommodated with user options. Switching the Visual Studio “platform toolset” version may be of help: in Visual Studio, click Project → Properties → Platform Toolset. If the desired toolset is not present, use Visual Studio Installer to obtain it.
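The toolset can also be set by editing the project file directly. A sketch of the relevant .vcxproj fragment (the configuration condition and toolset value here are illustrative):

```xml
<!-- .vcxproj excerpt: platform toolset per configuration -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
  <!-- v141 = VS 2017, v142 = VS 2019, v143 = VS 2022 -->
  <PlatformToolset>v142</PlatformToolset>
</PropertyGroup>
```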

The standards-enforcing /permissive- flag can be turned off, though that may not be enough to compile projects from older MSVC versions. The /permissive- flag is under Project → Properties → C/C++ → Language → Conformance mode.

pip install on offline systems

Offline (non-Internet-connected) systems may experience failures on installing Python packages. The package authors may not have CI setup for an offline test case, so they don’t realize it’s an issue. In general, Python packages can use pyproject.toml more effectively in a Python-standard way to overcome these issues.

Instead of telling users to manually install a package such as Numpy, use pyproject.toml rather than setuptools setup_requires. setup_requires assumes the computer is internet-connected, and the install will fail on an offline system even if the package is already installed.

To ensure a package like Numpy is installed first, for example where f2py is used, have a pyproject.toml file including:

[build-system]
requires = ["setuptools>=61.0.0", "wheel", "numpy"]
build-backend = "setuptools.build_meta"

This will auto-install the prerequisites before the install begins. When including this pyproject.toml parameter, do not omit "setuptools" or "wheel", or the package may fail to install.

Best practices for Matplotlib plots

The object-oriented Matplotlib API is slightly more verbose, but more robust than the state-machine API.

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2*np.pi, 100)  # example data
y = np.sin(x)

f1 = plt.figure(layout='constrained')
a1 = f1.gca()
a1.plot(x, y)

a1.set_title('fun plot')
a1.set_xlabel('x [in]')
a1.set_ylabel('y [out]')

plt.show()

The OO interface avoids accidentally updating the wrong plot, unlike the state-machine interface, which updates whichever plot currently has focus.

import matplotlib.pyplot as plt

x = [0, 1, 2, 3]  # example data
y = [0, 1, 4, 9]

plt.figure()
plt.plot(x, y)
plt.title('title for figure')
plt.xlabel('x [in]')

plt.show()

“Effective Matplotlib” reference guide for moderately advanced Matplotlib graphs.

Related: datetime in Matplotlib

pip install downloaded .whl file

Pip is a widely-used, complex Python package installer program with a lot of legacy baggage. The Python Software Foundation recognizes the critical need to update Pip, putting $116K to sponsor a senior dev to modernize Pip.

Sometimes, pip install fails to realize a .whl binary wheel is available. Thus pip tries to download and install a package from source code. In the case of a large package like SciPy, OpenCV or Pillow on an embedded system like the Raspberry Pi Zero, it could take hours or even days to compile, probably failing numerous times due to missing prerequisite binary libraries.

A workaround to Pip not automatically finding the desired .whl binary wheel is to download and install the .whl directly. The binary wheels are often available at PyPI from the package download page, for example SciPy. For embedded systems such as Raspberry Pi, there may be non-PyPI sites such as PiWheels.

Download the file, then pip install from the file:

wget https://www.piwheels.org/simple/scipy/scipy-1.3.2-cp37-cp37m-linux_armv7l.whl

pip install scipy-1.3.2-cp37-cp37m-linux_armv7l.whl

If the wheel binary is not compatible with the system, it will fail to import or run. In that case, simply pip uninstall my_package_name and try something else.