Conda mingw-w64 path conflicts

On Windows, Anaconda/Miniconda can install GCC runtime libraries such as libstdc++ and libgfortran under miniconda3/Library/mingw-w64/bin. A fresh Anaconda/Miniconda installation puts this path on Path by default via “conda activate”, though no files are installed there until certain packages are installed. These libraries are built for a particular GCC version and are not necessarily ABI compatible with the GCC already on the system, for example from MSYS2.

This problem can manifest as Windows error code 139 when running other executables that need, say, libstdc++, because Anaconda’s path overrides the desired GCC directory appearing lower on Path.

The solution to this problem could be either:

  • uninstall the conda Python package that installed those libraries
  • rename Library/mingw-w64/bin, e.g. to bin-backup, and see if the needed Python packages still work
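To see whether the conda directory is shadowing another toolchain, one can scan Path for duplicate copies of a library. A minimal sketch using only the Python standard library (the function name and the libstdc++ DLL glob pattern are my own; adjust the pattern for the library in question):

```python
import os
from pathlib import Path

def libstdcxx_dirs() -> list[Path]:
    """List PATH entries containing a libstdc++ DLL, in search order.

    The first entry is what Windows will load; if it's the conda
    mingw-w64 directory, it shadows any later GCC (e.g. from MSYS2).
    """
    hits = []
    for entry in os.environ.get("PATH", "").split(os.pathsep):
        d = Path(entry)
        if d.is_dir() and any(d.glob("libstdc++*.dll")):
            hits.append(d)
    return hits

if __name__ == "__main__":
    for d in libstdcxx_dirs():
        print(d)
```

If more than one directory prints, the first one wins; that tells you whether the rename or uninstall above is needed.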

Detect WSL from Python in Windows

It’s also straightforward to detect if

  • running inside WSL
  • WSL is available

These functions use only Python standard library modules.

Detect if inside WSL

This function detects if Python is running in WSL.

from platform import uname

def in_wsl() -> bool:
    # WSL 2 kernels report e.g. "...-microsoft-standard" in the release string
    return 'microsoft-standard' in uname().release

Detect WSL from Windows (outside WSL)

import os
import shutil
import subprocess

def wsl_available() -> bool:
    """
    Heuristic to detect if Windows Subsystem for Linux is available.

    Uses presence of /etc/os-release in the WSL image to say Linux is there.
    This is a de facto file standard across Linux distros.
    """
    if os.name == "nt":
        wsl = shutil.which("wsl")
        if not wsl:
            return False
        # can't read this file or test with
        # pathlib.Path('//wsl$/Ubuntu/etc/os-release').
        # A Python limitation?
        ret = subprocess.run(["wsl", "test", "-f", "/etc/os-release"])
        return ret.returncode == 0

    return False
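Once WSL is detected, Linux commands can be driven from Windows Python through the wsl command via subprocess. A hypothetical helper (the function name and None-on-failure behavior are my own, not from the original):

```python
import os
import shutil
import subprocess
from typing import Optional

def run_in_wsl(cmd: list[str]) -> Optional[str]:
    """Run a Linux command via "wsl" from Windows Python, returning stdout.

    Returns None if not on Windows, WSL is missing, or the command fails.
    """
    if os.name != "nt" or not shutil.which("wsl"):
        return None
    ret = subprocess.run(["wsl", *cmd], capture_output=True, text=True)
    return ret.stdout if ret.returncode == 0 else None

# example: run_in_wsl(["uname", "-r"]) gives the WSL kernel release, or None
```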

CMake Ninja Multi-Config generator

We generally recommend using Ninja with CMake, especially for large projects. Ninja speeds up rebuild times significantly and avoids erratic problems seen with the slower GNU Make. CMake also has the Ninja Multi-Config generator

cmake -G "Ninja Multi-Config"

that allows building multiple build configuration types (e.g. Debug, Release) without CMake regenerating the build*.ninja files for each build type and without maintaining totally distinct build root directories.

These Ninja Multi-Config generator options are transparently ignored by older CMake versions and other generators. The default CMake generator can be set with environment variable:

CMAKE_GENERATOR="Ninja Multi-Config"

NOTE: CMAKE_CONFIGURATION_TYPES was broken in CMake 3.22.0, where it was introduced.

To make the default multi-config build “Release” while making the Debug config also available, set environment variable CMAKE_CONFIGURATION_TYPES:

CMAKE_CONFIGURATION_TYPES="Release;Debug"
Because Release is listed first, the default build type is Release. With those settings, one can quickly build Debug and Release and test like:

cmake -B build

# Release default per above settings
cmake --build build

ctest --test-dir build -C Release

Then quickly build and test Debug without regenerating CMake/Ninja build files by:

cmake --build build --config Debug

ctest --test-dir build -C Debug

CMake version recommendations and install

CMake ≥ 3.19 is strongly recommended for general users for more robust and easy syntax. For project developers, we recommend CMake ≥ 3.22, as the new features make debugging CMake and setting up CI considerably easier.

Downloading the latest release of CMake is usually easy. For Linux and Mac, admin/sudo is NOT required.

There is an unofficial PyPI CMake package:

python -m pip install cmake

For platforms where CMake binaries aren’t easily available, use build_cmake.cmake.
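Whichever install method is used, it may help to verify that the cmake found on PATH meets the recommended minimum. A sketch using only the Python standard library (the function name is my own; it parses the “cmake version X.Y.Z” line printed by cmake --version):

```python
import re
import shutil
import subprocess
from typing import Optional, Tuple

def cmake_version() -> Optional[Tuple[int, int, int]]:
    """Return (major, minor, patch) of the cmake on PATH, or None if absent."""
    exe = shutil.which("cmake")
    if not exe:
        return None
    out = subprocess.run([exe, "--version"], capture_output=True, text=True).stdout
    m = re.search(r"cmake version (\d+)\.(\d+)\.(\d+)", out)
    if not m:
        return None
    major, minor, patch = map(int, m.groups())
    return (major, minor, patch)
```

For example, a build script could warn when cmake_version() is None or below (3, 19, 0).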

Key features added

The priority of these features is subjective; we write from the scientific computing perspective.

CMake 3.22 adds several CMake environment variables that are generally useful. CMAKE_BUILD_TYPE sets the default build type for single-configuration build systems. CMAKE_CONFIGURATION_TYPES sets the default available configurations for multi-config build systems like Ninja Multi-Config (broken in the initial CMake 3.22.0 release). CMAKE_INSTALL_MODE enables installing via symlinks with copy fallback, a good choice for installing programs from CMake. For CTest, the new ENVIRONMENT_MODIFICATION test property makes modifying environment variables for tests much easier.

CMake 3.21 adds more preset features, including making “generator” optional; the default CMake behavior is used to determine the generator. The cmake --install-prefix option can be used instead of the more cumbersome cmake -DCMAKE_INSTALL_PREFIX=. PROJECT_IS_TOP_LEVEL and <PROJECT-NAME>_IS_TOP_LEVEL identify whether a project is at the top of the project hierarchy. ctest --output-junit gives test output in a standard tooling format.

CMake 3.20 adds support for the Intel NextGen LLVM compiler and the NVIDIA HPC compiler. ExternalProject_Add() learned CONFIGURE_HANDLED_BY_BUILD, which avoids CMake commanding a reconfigure on each build. try_compile(WORKING_DIRECTORY) was added. CMake presets in CMakePresets.json now cover configure, build and test, allowing many parameters to be declared with inheritance in JSON. CMake presets are a key feature for CI as well as user configurations. The ctest --test-dir build option avoids the need to manually cd build. cmake_path allows path manipulation and introspection without actually touching the filesystem.

CMake 3.19 added support for the ISPC language. string(JSON GET|SET) parsing is very useful to avoid hard-coding parameters. FindPython / find_package accepts version ranges. Intel oneAPI works with CMake ≥ 3.19.6. CMake 3.19 emits a deprecation warning for cmake_minimum_required VERSION less than 2.8.12. CMakePresets.json enables configure parameter declarations in JSON.

CMake 3.18 adds CMake Profiler

cmake -B build --profiling-output=perf.json --profiling-format=google-trace

It also adds the REQUIRED parameter to find_program(), plus file(ARCHIVE_CREATE) and file(ARCHIVE_EXTRACT).

CMake 3.17 adds the Ninja Multi-Config generator. cmake --debug-find shows what find_*() is doing. Eliminates the Windows “sh.exe is on PATH” error. Recognizes that Ninja ≥ 1.10 correctly works with Fortran.

CMake 3.16 adds precompiled headers, unity builds, many advanced project features.

CMake 3.15 adds the CMAKE_GENERATOR environment variable that works like the global -G option. Enhances Python interpreter finding. Adds the cmake --install command instead of “cmake --build build --target install”. Added Zstd compression.

CMake 3.14: I created check_fortran_source_runs(). FetchContent was enhanced with simpler syntax. Transitive link resolution was considerably enhanced in CMake 3.14: projects that fail at link time with CMake < 3.14 just work with CMake ≥ 3.14.

We don’t recommend use of the older CMake versions below as they take significantly more effort to support.

CMake 3.13 adds ctest --progress and better Matlab compiler support. Many new linking options are added, along with fixes for Fortran submodule bugs. The very convenient cmake -B build incantation and target_sources() with absolute paths are also added. It’s significantly more difficult to use CMake older than 3.13 with medium to large projects.

CMake 3.12 adds transitive library specification (from outside the same directory) and full Fortran submodule support. get_property(_test_names DIRECTORY . TESTS) retrieves test names in the current directory.

CMake 3.11 allows specifying targets initially without sources. FetchContent is added, allowing fast hierarchies of CMake and non-CMake projects.

The versions of CMake below have been deprecated as of CMake 3.19.

CMake 3.10 added Fortran Flang (LLVM) compiler and extensive MPI features.

CMake 3.9 added further C# and Cuda support, first added in CMake 3.8.

CMake 3.8 added initial Cuda support

CMake 3.7 added the LESS_EQUAL and GREATER_EQUAL comparisons, including their VERSION_ equivalents. Initial Fortran submodule support was added.

CMake 3.6 gave better OpenBLAS support.

CMake 3.5 enhanced the FindBoost targets with automatic Boost prerequisites.

CMake 3.4 added if(TEST) to see if a test name exists.

CMake 3.3 added list operations such as IN_LIST.

CMake apply patch file

Patching files with GNU Patch has been a common task for decades. Due to technical issues with potential patch library implementation, CMake does not include a native patch function. We use CMake to detect if “patch” is available, which it virtually always is on Unix-like systems. For Windows, we use MSYS2 or WSL.

The CMake patch file repository example uses standard GNU Patch files across operating systems. We put the patched file into the binary directory to avoid ambiguity. The patching occurs only when the source file changes, using a canonical CMake approach.

With slight modification, this patching can occur on files obtained via CMake FetchContent, header files, or other resources.

Python 3 Microsoft Visual C++ 14.0 is required

Fix Python on Windows:

error Microsoft Visual C++ 14.0 is required

by installing Microsoft Build Tools for Visual Studio.

Select: Workloads → Desktop development with C++, then for Individual Components, select only:

  • Windows SDK
  • C++ x64/x86 build tools

The build tools allow using MSVC “cl.exe” C / C++ compiler from the command line.

Microsoft made the Visual Studio Build Tools no longer C++ specific in late 2017. Thus newer Visual Studio versions work in place of older versions.

Windows Python needs Visual C++ libraries installed via the SDK to build code, such as via setuptools.extension.Extension or numpy.distutils.core.Extension. For example, building f2py modules in Windows with Python requires Visual C++ SDK as installed above. On Linux and Mac, the C++ libraries are installed with the compiler.

Python / Visual Studio build matrix

Python Visual C++ 10.0 vcvarsall.bat

Python 2.7 on Windows:

Microsoft Visual C++ 10.0 is required (unable to find vcvarsall.bat)

was formerly fixed by installing Microsoft Visual C++ Compiler for Python 2.7. However, this download is NO LONGER AVAILABLE. It’s recommended in general to use a recent Python 3 version instead.

Visual Studio is used for Python on Windows whenever C, C++, Fortran or other compiled language extension modules are used from Python. For example, a C library may be used for better performance, seamlessly called from Python.

Related: Fix Python3 error Visual C++ 14.0 required

Matlab code on Raspberry Pi

Matlab offers embedded system support including most Raspberry Pi models.

A downside of this prototyping method is reliance on an opaque proprietary toolchain. If your prototype has a corner case Mathworks didn’t anticipate you may have to recode in another language such as Python that runs directly on the embedded computer.

Another alternative is running the Matlab script directly on the embedded system using GNU Octave. We have achieved 30+ fps batch processing on video motion streams on the Raspberry Pi Zero with GNU Octave. Popular Raspberry Pi operating systems can easily install a recent GNU Octave. We suggest installing Octave with

apt install --no-install-recommends octave

We suggest first installing Octave on your laptop to verify that the particular Matlab script works with Octave. If the script doesn’t work with Octave and making it work seems too difficult, consider transitioning to Python.

CMake MacOS Framework link

MacOS frameworks are linked with the -framework flag. CMake has special handling for these flags, so target_link_libraries should be used for them. The quoting in the example below is necessary for CMake to handle the flags correctly.

add_executable(main main.c)

target_link_libraries(main PRIVATE
  "-framework Foundation"
  "-framework IOKit"
)

Alternatively, use find_library() with the framework name.

Python minimal package

The optional “setup.py” for setuptools-based Python packages can be reduced to a one-line file for simple Python packages, by putting the project metadata in setup.cfg. The example setup.cfg file below is associated with a setup.py containing merely:

from setuptools import setup; setup()

This is installed as usual like:

python -m pip install -e .

It can be most effective to put all project configuration, including Python package prerequisites, in setup.cfg instead of setup.py. setup.cfg is human-readable and machine-parseable without first installing the package. Putting as many parameters as possible into setup.cfg instead of setup.py is important and beneficial for reasons including:

  • reproducible results
  • security risk mitigation
  • getting package prerequisite tree list

This is an example of best practices (since 2016) of a minimal setup.py using setup.cfg. It does not use requirements.txt.

setup.cfg holds the machine-readable configuration, easy for humans too. The “version” is contained in the file “mypkg/__init__.py” as Python code:

__version__ = "1.2.3"

setup.cfg file contains:

[metadata]
name = mypkg
author = Joe Smith
author_email =
description = My awesome program prints cool messages
version = attr: mypkg.__version__
url =
keywords =
  cool printing
classifiers =
  Development Status :: 4 - Beta
  Intended Audience :: Science/Research
  Programming Language :: Python :: 3
  Topic :: Scientific/Engineering
license_files =

[options]
python_requires = >= 3.8
packages = find:
zip_safe = False
install_requires =
#  numpy

[options.extras_require]
tests =

[options.entry_points]
console_scripts =
#  joesprog = mypkg.__main__:cli

“console_scripts” expects a file “mypkg/__main__.py” with the function “cli()” designed to accept command line input, perhaps using argparse.ArgumentParser.
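A minimal mypkg/__main__.py matching that entry point might look like the following sketch (the program name, argument, and message are illustrative; returning the message as well as printing it eases testing):

```python
import argparse
from typing import List, Optional

def cli(argv: Optional[List[str]] = None) -> str:
    """Console-script entry point; argv defaults to sys.argv[1:]."""
    p = argparse.ArgumentParser(prog="joesprog", description="print cool messages")
    p.add_argument("name", nargs="?", default="world", help="who to greet")
    args = p.parse_args(argv)
    msg = f"Hello, {args.name}!"
    print(msg)
    return msg

if __name__ == "__main__":
    cli()
```

After pip install, the “joesprog” console script calls cli() directly from the command line.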

The companion to setup.cfg is pyproject.toml. pyproject.toml is used in general for Python project settings, including ones not using setuptools. The standard way to communicate that setuptools is used for a Python project is with pyproject.toml containing:

[build-system]
requires = ["setuptools", "wheel"]

PEP8 checking via flake8 is configured in .flake8:

[flake8]
max-line-length = 132
exclude = .git,__pycache__,doc/,docs/,build/,dist/,archive/
per-file-ignores =

MyPy type hint checking is configured via .mypy.ini. MANIFEST.in is used to specify external files installed.


  • install_requires cannot read requirements.txt, since file: is not among the accepted input types for install_requires in setup.cfg
  • python_requires parameter avoids user confusion if they have an old Python version. Instead of them opening a GitHub issue or just not using your program at all, they’ll get a useful error message.

Classifiers are optional, but help your project be indexed better in PyPI (and hence search engines). Classifiers must be from this official classifiers list or they will fail when uploading your package to PyPI.

Python can easily import Fortran code using f2py; see this f2py example. setuptools ≥ 38.6 and wheel ≥ 0.31 support a Markdown README. Pip ≥ 10 brought pyproject.toml support, necessary for clean handling of Python extension modules via NumPy, as well as setup.cfg support.

Related: Upload user Python module to PyPI