Fix wget HSTS database error

If the permissions of the wget HSTS database file change, using wget may produce an error like:

Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '~/.wget-hsts'. HSTS will be disabled.

HSTS can enhance security, so normally we’d like to have HSTS working.


Make the .wget-hsts file have normal file permissions by:

chmod 644 ~/.wget-hsts
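As a sketch, the fix can be verified on a scratch file (standing in for ~/.wget-hsts), assuming GNU coreutils stat:

```shell
# Use a scratch file as a stand-in for ~/.wget-hsts
f=$(mktemp)
chmod 666 "$f"      # world-writable: wget would refuse to use this
chmod 644 "$f"      # restore normal (non-world-writable) permissions
stat -c '%a' "$f"   # prints: 644
```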

Install and use Intel MKL Scalapack library

Intel MKL includes Scalapack, which is installed when you select “Cluster Support” for C and/or Fortran in the Intel Compiler / Intel MKL installer. The MKL Scalapack library and include files are under the MKLROOT environment variable on Linux and Windows, like $MKLROOT/{include,lib/intel64}


Several more link flags (including for MPI) are needed to make Intel MKL Scalapack work. The exact link flags and compile flags for a particular system are obtained from Intel Link Line Advisor.
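As an illustrative sketch only (the exact libraries depend on the MKL version, MPI flavor and interface layer; verify against Intel Link Line Advisor output for your system), a dynamic link line for GCC + OpenMPI on Linux might look like:

```shell
mpifort myprog.f90 -m64 -I"${MKLROOT}/include" \
  -L"${MKLROOT}/lib/intel64" -lmkl_scalapack_lp64 -lmkl_gf_lp64 \
  -lmkl_sequential -lmkl_core -lmkl_blacs_openmpi_lp64 \
  -lpthread -lm -ldl
```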



The Intel MKL Scalapack examples are compiled by:

cd $MKLROOT/examples/

tar xf examples_cluster_f.tgz

cd scalapackf

make sointel64 mpi=openmpi compiler=gnu mpidir=/usr/

The resulting “exe” files are under scalapackf/_results/.

For Windows, corresponding .zip examples are available in %MKLROOT%\examples\


Netlib Scalapack examples are at


non-MKL Netlib Scalapack is available using Meson from

Intel compiler on Windows bug workarounds

There is a tension between Visual Studio point releases and the Intel compilers on Windows. Visual Studio updates frequently, and the Intel compiler falls behind in matching the Visual Studio ABI. This can cause strange and unpredictable errors when compiling a C program, for example:

C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.23.28105\include\vcruntime_string.h(18): error: expected an attribute name
  _NODISCARD _Check_return_


Intel support notes that a workaround is to force C++ compiler to be used with the /Tp<source_filename> option.


In Meson, this workaround can be applied in meson.build like:

project('foo', 'c')

cc = meson.get_compiler('c')

if cc.get_id() == 'intel-cl'
  message('applying workaround for Intel ICL bug with MSVC')
  add_project_arguments('/TP', language: 'c')
endif


In CMake, this workaround can be applied in CMakeLists.txt like:

cmake_minimum_required(VERSION 3.0)
project(foo C)

if(CMAKE_C_COMPILER_ID STREQUAL "Intel" AND WIN32)
  message(STATUS "applying workaround for Intel ICL bug with MSVC")
  add_compile_options(/TP)
endif()

AppVeyor for Linux, MacOS and Windows CI

Related: f2py on AppVeyor Windows

AppVeyor’s Linux-based CI and macOS CI work well, like the free Windows-based CI AppVeyor is well known for. Travis CI is another prime choice for macOS.

Minimal Examples

Minimal working examples of .appveyor.yml for both Windows and Linux using Python, C++ and/or Fortran are given below. Where needed, use distinct per-line commands for each OS. The other way is to use matrix:, but unique per-line OS commands are more intuitive for many use cases.

  • For Linux, the stack: stanza picks software/version for a set of common programs–here, Python 3.
  • Note the use of cmd: for Windows-exclusive commands, and sh: for Linux-exclusive commands.


Note: f2py on AppVeyor Windows has its own page–importing Fortran libraries from Python.

This appveyor.yml runs Python 3.7 on the latest MacOS, Windows 10 and Ubuntu 18.04:

image:
- macos
- Visual Studio 2017
- Ubuntu1804

stack: python 3.7

environment:
  PY_DIR: C:\Python37-x64

clone_depth: 25

build: off

install:
- cmd: set PATH=%PY_DIR%;%PY_DIR%\Scripts;%PATH%
- python --version
- python -m pip install -e .[tests,full]

test_script:
- python -m pytest -r a -v


This appveyor.yml builds and tests a CMake-based C++ and/or Fortran project:

image:
- macos
- Visual Studio 2017
- Ubuntu1804

environment:
  MINGW_DIR: C:\mingw-w64\x86_64-7.2.0-posix-seh-rt_v5-rev1\mingw64\bin

clone_depth: 25

build: off

install:
- cmd: set PATH=%MINGW_DIR%;%PATH%

build_script:
- cd bin
- cmd: cmake -G "MinGW Makefiles" -DCMAKE_SH="CMAKE_SH-NOTFOUND" ..
- sh: cmake ..
- cmake --build . --target install

test_script:
- ctest -V

Manual Linux install

Install additional Linux programs in AppVeyor via commands like:

- sh: sudo apt -yq update > /dev/null
- sh: sudo apt install -yq --no-install-suggests --no-install-recommends bwbasic > /dev/null


Command “python egg_info” failed with error code 1

Try using Miniconda3 at C:\Miniconda3-x64 instead.

AppVeyor Python 3.8 temporarily broken

As other CIs have done, AppVeyor updated to Python 3.8 in the default Linux, MacOS and Windows images. Since Python library wheels are often not immediately available for recently-released Python versions, this can lead to CI test nuisance errors at setup. Many projects will experience AppVeyor errors on Numpy setup as AppVeyor attempts to compile Numpy from source, since the wheel does not yet exist for the just-released Python version.

Select Python version

AppVeyor has several Python versions installed. The AppVeyor Python version selection method is OS-dependent.

The examples are shown specific to Windows and Linux for didactic clarity. They would normally be merged to a single .appveyor.yml.

Windows builds

Select Python version on AppVeyor Windows CI by setting .appveyor.yml to the desired AppVeyor Windows Python version directory:

image:
- Visual Studio 2017

environment:
  PY_DIR: C:\Python37-x64

install:
- cmd: set PATH=%PY_DIR%;%PY_DIR%\Scripts;%PATH%

Linux builds

Select Python version on AppVeyor Linux CI by setting .appveyor.yml to the desired AppVeyor Linux Python version directory:

image:
- Ubuntu1804

stack: python 3.7

Switch Visual Studio project version

Even minor point releases of Visual Studio can cause significant behavior changes and ABI breakages. A major Visual Studio release may change more defaults than user options can accommodate. Switching the Visual Studio “platform toolset” version may help.

Switch Visual Studio platform toolset version

In Visual Studio, click Project → Properties → Platform Toolset. If the desired toolset is not present, use Visual Studio Installer to obtain it.

/permissive- VS2019 defaults

The standards-enforcing /permissive- flag is the default in VS2019, but just turning that flag off may not be enough to compile projects from older MSVC versions. The /permissive- flag is under Project → Properties → C/C++ → Language → Conformance Mode.

Ensure required packages present for pip install on offline systems

Offline (non-Internet-connected) systems may experience failures on installing Python packages. The package authors may not have CI set up for an offline test case, so they don’t realize it’s an issue. In general, Python packages can use setup.cfg and pyproject.toml more effectively, in a Python-standard way, to overcome these issues.

Install prerequisite packages first

Instead of telling users to manually install a package such as Numpy that’s required before setup.py runs, use pyproject.toml instead of setuptools setup_requires. This is because setuptools currently assumes the computer is internet-connected, and will fail on an offline system even if the package is already installed.

good example (most reliable and standard)

To ensure a package like Numpy is installed before setup.py runs, for example where f2py is used, have a pyproject.toml file including:

[build-system]
requires = ["setuptools", "wheel", "numpy"]

This will auto-install the prereqs before setup.py runs.

When including this pyproject.toml parameter, do not omit “setuptools” or “wheel”, or the package may fail to install. Unless a pyproject.toml says otherwise, PEP 517 assumes:

[build-system]
requires = ["setuptools", "wheel"]

bad example (don’t do this)

These may fail on non-internet-connected systems, even if Numpy is already installed. In setup.py:

setuptools.setup(..., setup_requires=['numpy'])

or in setup.cfg:

[options]
setup_requires =
  numpy

pip install downloaded .whl file

Pip is a widely-used, complex Python package installer with a lot of legacy baggage. The Python Software Foundation recognizes the critical need to update Pip, putting US$116K toward sponsoring a senior developer to modernize Pip.

Sometimes, pip install fails to realize a .whl binary wheel is available. Thus pip tries to download and install a package from source code. In the case of a large package like SciPy, OpenCV or Pillow on an embedded system like the Raspberry Pi Zero, it could take hours or even days to compile, probably failing numerous times due to missing prerequisite binary libraries.

Manual install wheel

A workaround to Pip not automatically finding the desired .whl binary wheel is to download it manually and install the .whl directly. The binary wheels are often available at PyPI from the package download page, for example SciPy. For embedded systems such as Raspberry Pi, there may be non-PyPI sites such as PiWheels.

Download the file, then pip install from the file like:

pip install my_package_name.whl

If the wheel binary is not compatible with the system, it will fail to import or run. In that case, simply pip uninstall my_package_name and try something else.

Homebrew binary bottle download

Downloading binary Homebrew bottles without installing Homebrew can be useful to check the location of the bottle contents. This is useful when developing Meson native-files or CMake Find*.cmake modules.

Since 2015, Homebrew distributes bottles from Bintray. For example, an HDF5 bottle may be inspected by:

tar --list -f hdf5-1.10.5_1.catalina.bottle.tar.gz
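As a self-contained sketch (the archive contents here are fabricated to mimic a bottle’s prefix layout), tar --list reveals the paths a bottle would unpack to, which is exactly what a Meson native-file or Find*.cmake module needs:

```shell
# Build a scratch archive mimicking a bottle's layout, then list it
d=$(mktemp -d)
mkdir -p "$d/hdf5/1.10.5_1/include"
touch "$d/hdf5/1.10.5_1/include/hdf5.h"
tar -czf "$d/bottle.tar.gz" -C "$d" hdf5
tar --list -f "$d/bottle.tar.gz"
```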

No Homebrew install is necessary for inspection, but actually using the libraries and binaries is of course best done by installing Homebrew.

MyPy Python type checking single file or files

Usually it’s desired to check Python type annotations for an entire project. However, when introducing type checking to a large Python project, it can be beneficial to do so gradually, to avoid massive refactoring all at once. You will almost certainly find old code bugs revealed by type checking that require refactoring, which introduces risk; checking files incrementally mitigates that risk.

MyPy single file options

MyPy is a strongly recommended type checker, though others exist. We typically use a per-project mypy.ini to set appropriate MyPy default behavior. To tell MyPy to check only certain files, use the MyPy --follow-imports= option like:

mypy --follow-imports=skip myproj/foo.py myproj/bar.py

Only certain directories can be checked like:

mypy --follow-imports=skip myproj/sub1/ myproj/sub2/

Once the type checks pass via mypy --follow-imports=skip, we recommend trying

mypy --follow-imports=silent

to improve robustness of the type check for those files / directories.
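The per-project mypy.ini mentioned above might look like the following minimal sketch (option values are illustrative, not prescriptive):

```ini
[mypy]
# treat missing third-party library stubs leniently while
# type checking is being introduced gradually
ignore_missing_imports = True
# files/directories checked by default when running bare "mypy"
files = myproj/
```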