Supporting Python < 3.6 requires considerable extra effort, especially when writing concurrent or performance-oriented code with current releases of common modules like NumPy, Xarray, Matplotlib, etc.
Python 3 is used by a large and growing majority of new and active Python developers in science, engineering, medical research and education.
Python 3 was released in December 2008.
While initially there was pushback over backward incompatibilities, the performance, efficiencies and features of Python 3 have won out.
The main holdouts are of the same nature as those that hang on to COBOL systems: organizations with static, unchanging requirements in large proprietary codebases that few people are familiar with.
Some programmers thrive and make a decent living servicing those legacy COBOL and Python environments.
The majority of STEM coders, analysts and educators have been writing Python 3 code.
Most published objections to Python 3 were written before 2016, and almost all before 2017.
Some of their complaints were addressed in Python 3.6 (released December 23, 2016).
A main point of contention with Python 3 is the separation between bytes and strings.
IoT and embedded applications must distinguish between bytes and strings, so I appreciate that Python 3 makes the separation explicit.
For the global environment I write for, I appreciate that strings are Unicode in Python 3.
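A small sketch of the bytes/str separation in practice (the payload and text values here are made up for illustration):

```python
# Sketch of Python 3's strict bytes/str separation (example values are made up).
payload = bytes([0x01, 0xFF])        # raw bytes, e.g. read from a serial port
text = "temperature: 25\u00b0C"      # str is Unicode in Python 3

# Mixing the two raises TypeError instead of silently mis-decoding:
try:
    combined = text + payload
except TypeError:
    combined = text + payload.hex()  # explicit conversion is required
print(combined)                      # temperature: 25°C01ff
```

In Python 2 the concatenation would have silently succeeded with an implicit ASCII decode, corrupting non-ASCII data; Python 3 forces the programmer to choose an encoding.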
Python 3 also saves programmer time through more concise, efficient syntax.
The Python 3 core itself is as much as 10% faster in several aspects, with some standard modules like re processing regular expressions as much as 20x faster.
The modernize tool, along with the six and __future__ modules, smooths over most of these changes to produce backward-compatible code.
Some of the most enticing changes from Python ≥ 3.6 are not easily backportable.
These features simplify syntax and make the code more self-documenting.
asyncio brings into core Python capabilities that used to require Tornado, Twisted, etc.
Asynchronous execution is required for applications that need to scale massively.
IoT applications where remote sensors report in are a perfect use case for asyncio.
asyncio is a convenient, relatively safe way to achieve concurrency, baked right into Python.
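A minimal asyncio sketch of the IoT use case above, where several hypothetical sensors are polled concurrently (sensor names and delays are made up; asyncio.run() requires Python >= 3.7):

```python
# Minimal asyncio sketch: concurrently "poll" several hypothetical IoT sensors.
import asyncio

async def read_sensor(name: str, delay: float) -> str:
    await asyncio.sleep(delay)       # stands in for real network I/O
    return f"{name}: ok"

async def main():
    # All three "reads" wait concurrently, not sequentially.
    return await asyncio.gather(*(read_sensor(f"sensor{i}", 0.1) for i in range(3)))

results = asyncio.run(main())
print(results)                       # ['sensor0: ok', 'sensor1: ok', 'sensor2: ok']
```

The three 0.1-second waits overlap, so the whole gather finishes in about 0.1 seconds rather than 0.3.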
Least-recently-used (LRU) caching is enabled by a function decorator.
For example, suppose you have a computationally expensive function that is sometimes called with the same arguments, but you don’t want to bother tracking the output by saving it in a variable.
LRU caching is as simple as:
import functools

@functools.cache  # on Python < 3.9, use @functools.lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print([fib(n) for n in range(16)])
print(fib.cache_info())
This function would not crash on most interpreters if fed a mismatched argument type, but the PyCharm IDE and others can emit warnings when conflicting variable types are passed in or out.
Python argument unpacking expands an iterable into the multiple arguments of a function with *iterable.
Multiple iterables can be unpacked into a single function call.
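A short unpacking sketch (the function and values are made up for illustration):

```python
# Argument unpacking sketch: *iterable expands into positional arguments.
def volume(length, width, height):
    return length * width * height

dims = (2, 3, 4)
print(volume(*dims))                 # 24

# PEP 448 (Python >= 3.5): several iterables can be unpacked in one call.
print(volume(*[2], *(3, 4)))         # 24
```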
ipaddress is a useful standard-library module for manipulating IP addresses, used in the findssh program to scan IP address ranges for active servers.
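A sketch of iterating a subnet with ipaddress, similar to what a scanner like findssh does (the subnet here is chosen arbitrarily):

```python
# Enumerate usable host addresses in a subnet with the stdlib ipaddress module.
import ipaddress

net = ipaddress.ip_network("192.168.1.0/30")
hosts = [str(h) for h in net.hosts()]   # excludes network and broadcast addresses
print(hosts)                            # ['192.168.1.1', '192.168.1.2']
```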
The object-oriented pathlib is in the standard library and replaces most os.path functions.
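A few os.path idioms and their pathlib equivalents (the paths are illustrative; a temporary directory is used as a sandbox):

```python
# pathlib replacing common os.path idioms.
from pathlib import Path
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    p = Path(tmp) / "data" / "notes.txt"         # os.path.join
    p.parent.mkdir(parents=True)                 # os.makedirs
    p.write_text("hello")                        # open / write / close in one call
    print(p.suffix, p.exists(), p.read_text())   # .txt True hello
```

The / operator on Path objects builds paths portably, and methods like .suffix, .stem, and .glob() replace scattered os.path and glob calls.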
f-strings allow f'This is {weight} kg for {price} dollars.' instead of 'This is {} kg for {} dollars.'.format(weight, price)
Python 3.7 adds several more compelling reasons to upgrade.
Patreon transitioned from PHP → Python 3 in 2015.
Key appeals for Patreon to switch to Python 3 included:
future-proofing
appeal for developer hiring: devs want to work with the latest language
lower risk than porting to a statically typed language like Scala
Starting in 2010, Arch Linux defaulted to Python 3.
Ubuntu 17.10 defaulted to Python 3.
Ubuntu 18.04 requires Python 3 for programs in the main Ubuntu repository, with Python 3.6 as the default.
The goal is to demote Python 2 out of the main repository.
Executable Python scripts should continue to have the first line
#!/usr/bin/env python
so that users can configure their desired Python version.
Many users install a third-party Python distribution such as Anaconda Python, PyCharm, Intel Python, etc., which bundle high-performance math libraries such as CUDA-based CuPy.
Python Software Foundation Fellow Nick Coghlan has very detailed notes on the why, when, what, how, and where of the Python 3 transition, with fascinating historical notes.
ActiveState has seen the majority of its downloads be Python 3 since January 2017.
For a dual-boot Windows / Linux PC, set BIOS / UEFI to boot to the Windows hard drive, especially if using Windows BitLocker.
If Windows won’t boot, the Windows HDD boot sector may need repair.
Windows error 0x800703ed may occur if a dual boot system tries to start Windows from Grub instead of directly via BIOS / UEFI selection.
It is best to use a separate hard drive for Windows.
Backup the PC to an external hard drive or the cloud.
Create a bootable Windows USB drive via the Windows Media Creation Tool.
Use Rufus to write the ISO to USB.
Boot from USB and select a partition to install Windows on.
Don’t delete Recovery partitions or you may lose your Windows OEM license.
Using the LabVIEW Shared Variable Server with remote computers and devices requires allowing certain network ports to pass through the Windows Firewall.
As compared to LabVIEW Network Streams, Shared Variables can be better for polling of variable states.
LabVIEW Shared Variables use network ports.
HSTS can enhance security, so normally we’d like to have HSTS working.
If the Wget HSTS database file permissions are incorrect, wget may emit messages like:
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
could not open HSTS store at '~/.wget-hsts'. HSTS will be disabled.
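A likely fix, assuming the HSTS store exists but has overly broad permissions, is to make it a regular file that only the user can write:

```shell
# Ensure the HSTS store is a regular, non-world-writable file.
touch ~/.wget-hsts
chmod 644 ~/.wget-hsts
```

With world-write permission removed, wget will trust and use the HSTS database again.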
/dev/shm/ and /run/shm/ map shmem shared memory to a RAM drive on typical Linux systems, useful for interprocess communication (IPC).
They are also present in Windows Subsystem for Linux (WSL).
On WSL, they write to HDD instead of RAM.
It’s easy to tell if RAM vs. HDD is being used for shared memory, since RAM has GB/sec speeds vs. HDD having MB/sec speeds.
Note free space in df -kh /dev/shm and free -h
Write 1 GB to shmem:
dd if=/dev/zero of=/dev/shm/blah bs=10M count=100
Recheck df -kh /dev/shm and free -h to see which shows 1 GB more used; this tells whether the hard drive or RAM was used.
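The steps above can be run as one sequence (a smaller 50 MB write is shown here, since container environments often cap /dev/shm; scale up to bs=10M count=100 for the full 1 GB test on a real system):

```shell
# Does /dev/shm hit RAM or disk? dd reports throughput on completion:
# GB/s indicates RAM-backed tmpfs, MB/s indicates it went to the hard drive.
df -k /dev/shm                                # free space before
dd if=/dev/zero of=/dev/shm/blah bs=1M count=50
df -k /dev/shm                                # tmpfs usage grew if RAM-backed
free -h
rm /dev/shm/blah                              # clean up
```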
For programs using shared memory for heavy writing operations:
order(s) of magnitude slower /dev/shm/ operations when backed by HDD instead of RAM
wear on a solid-state drive if /dev/shm points to it
Even minor point releases of Visual Studio can cause significant behavior changes and ABI breakages.
The full range of defaults that changes inside Visual Studio for a major release may be more than can be accommodated with user options.
Switching the Visual Studio “platform toolset” version may be of help: in Visual Studio, click Project → Properties → Platform Toolset.
If the desired toolset is not present, use Visual Studio Installer to obtain it.
The standards-enforcing /permissive- flag can be turned off, but that may not be enough to compile projects from older MSVC versions.
The /permissive- flag is under Project → Properties → C/C++ → Language → Conformance mode.
Offline (non-Internet-connected) systems may experience failures on installing Python packages.
The package authors may not have CI set up for an offline test case, so they don’t realize it’s an issue.
In general, Python packages can use pyproject.toml more effectively in a Python-standard way to overcome these issues.
Instead of telling users to manually install a package such as NumPy, use pyproject.toml instead of setuptools setup_requires.
setup_requires assumes the computer is internet-connected; even if the package is already installed, the install will fail on an offline system.
To ensure a package like Numpy is installed first, for example where f2py is used,
have a pyproject.toml file including:
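A minimal sketch of such a pyproject.toml (NumPy is listed because f2py needs it at build time; adjust the requirements to your package):

```toml
[build-system]
requires = ["setuptools", "wheel", "numpy"]
build-backend = "setuptools.build_meta"
```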
This will auto-install the prereqs before the install begins.
When including this pyproject.toml parameter, do not omit “setuptools” or “wheel”, or the package may fail to install.
Pip is a widely-used, complex Python package installer program with a lot of legacy baggage.
The Python Software Foundation recognizes the critical need to update pip, putting $116K toward sponsoring a senior dev to modernize pip.
Sometimes, pip install fails to realize a .whl binary wheel is available.
Thus pip tries to download and install a package from source code.
In the case of a large package like SciPy, OpenCV or Pillow on an embedded system like the Raspberry Pi Zero,
it could take hours or even days to compile, probably failing numerous times due to missing prerequisite binary libraries.
A workaround to Pip not automatically finding the desired .whl binary wheel is to download and install the .whl directly.
The binary wheels are often available at PyPI on the package download page, for example SciPy.
For embedded systems such as Raspberry Pi, there may be non-PyPI repositories such as PiWheels.
Download the file, then pip install from the file like:
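For example (the wheel filename here is hypothetical; substitute the actual file you downloaded for your platform and Python version):

```shell
# Install directly from a downloaded wheel file instead of from PyPI.
pip install ./my_package_name-1.2.3-cp36-cp36m-linux_armv6l.whl
```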
If the wheel binary is not compatible with the system, it will fail to import or run.
In that case, simply pip uninstall my_package_name and try something else.