I previously wrote a post about shipping a PyGame app to users on macOS. It’s now substantially updated for the new Notarization requirements in Catalina. I hope it’s useful to somebody!
Notarize your Python apps for macOS Catalina.
It’s October, and we’re all getting ready for Halloween, so allow me to tell you a horror story, in Python:
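The tale, as it unfolds at any interactive prompt:

```python
# binary floating point cannot represent 0.1, 0.2, or 0.3 exactly,
# so the "obvious" arithmetic fails
print(0.1 + 0.2 == 0.3)  # False
print(0.1 + 0.2)         # 0.30000000000000004
```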
Some of you might already be familiar with this chilling tale, but for those who might not have experienced it directly, let me briefly recap.
In Python, the default representation of a number with a decimal point in it is something called an “IEEE 754 double precision binary floating-point number”. This standard achieves a generally useful trade-off between performance and correctness, and is widely implemented in hardware, making it a popular choice for numbers in many programming languages.
However, as our spooky story above indicates, it’s not perfect. 0.1 + 0.2 is very slightly less than 0.3 in this representation, because it is a floating-point representation in base 2.
If you’ve worked professionally with software that manipulates money1, you typically learn this lesson early; it’s quite easy to smash head-first into the problem with binary floating-point the first time you have an item that costs 30 cents and for some reason three dimes doesn’t suffice to cover it.
There are a few different approaches to the problem; one is using integers for everything, and denominating your transactions in cents rather than dollars. A strategy which requires less weird unit-conversion2 is to use the built-in decimal module, which provides a base-10 floating-point representation rather than the standard base-2 one, and therefore doesn’t have any of these weird glitches surrounding numbers like 0.1.
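With decimal, the three dimes reliably add up:

```python
from decimal import Decimal

# base-10 floating point represents one tenth exactly, so three
# dimes really do make thirty cents
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
print(Decimal("0.10") * 3)                                # 0.30
```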
This is often where a working programmer’s numerical education ends; don’t use floats, they’re bad, use decimals, they’re good. Indeed, this advice will work well up to a pretty high degree of application complexity. But the story doesn’t end there. Once division gets involved, things can still get weird really fast:
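For instance, at the decimal context’s default precision of 28 significant digits:

```python
from decimal import Decimal

# 1/7 has no finite representation in base 10 either, so it gets cut
# off at the context's precision (28 significant digits by default)...
seventh = Decimal(1) / Decimal(7)
# ...and the truncation error surfaces when we multiply back up:
print(seventh * 14)       # 2.000000000000000000000000001
print(seventh * 14 == 2)  # False
```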
The problem is the same: before, we were working with 1/10, a value that doesn’t have a finite (non-repeating) representation in base 2; now we’re working with 1/7, which has the same problem in base 10.
Any time you have a representation of a number which uses digits and a decimal point, no matter the base, you’re going to run into some rational values which do not have an exact representation with a finite number of digits; thus, you’ll drop some digits off the (necessarily finite) end, and end up with a slightly inaccurate representation.
But Python does have a way to maintain symbolic accuracy for arbitrary rational numbers -- the fractions module:
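Fractions keep everything as exact ratios of integers:

```python
from fractions import Fraction

# exact rational arithmetic: nothing is ever truncated
third = Fraction(1, 3)
print(third + third + third == 1)                           # True
print(Fraction(1, 7) * 14 == 2)                             # True
print(Fraction(1, 10) + Fraction(2, 10) - Fraction(3, 10))  # 0
```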
You can multiply and divide and add and subtract to your heart’s content, and still compare against zero and it’ll always work exactly, giving you the right answers.
So if Python has a “correct” representation, which doesn’t screw up our results under a basic arithmetic operation such as division, why isn’t it the default? We don’t care all that much about performance, right? Python certainly trades off correctness and safety in plenty of other areas.
First of all, while Python’s willing to trade off some storage or CPU efficiency for correctness, precise fractions rapidly consume huge amounts of storage even under very basic algorithms, like consuming gigabytes while just trying to maintain a simple running average over a stream of incoming numbers.
But even more importantly, you’ll notice that I said we could maintain symbolic accuracy for arbitrary rational numbers; but, as it turns out, a whole lot of interesting math you might want to do with a computer involves numbers which are irrational: like π. If you want to use a computer to do it, pretty much all trigonometry3 involves a slightly inaccurate approximation unless you have a literally infinite amount of storage.
As Morpheus put it, “welcome to the desert of the ℝ”.
I’m a little annoyed at my Apple devices right now.
Time to complain.
“Trust us!” says Apple.
“We’re not like the big, bad Google! We don’t just want to advertise to you all the time! We’re not like Amazon, just trying to sell you stuff! We care about your experience. Magical. Revolutionary. Courageous!”
But I can’t hear them over the sound of my freshly-updated Apple TV — the appliance which exists solely to play Daniel Tiger for our toddler — playing the John Wick 3 trailer at full volume automatically as soon as it turns on.
For the aforementioned toddler.
I’m aware of the preferences which control autoplay on the home screen; it’s disabled now. I’m aware that I can put an app other than “TV” in the default spot, so that I can see ads for other stuff, instead of the stuff “TV” shows me ads for.
But the whole point of all this video-on-demand junk was supposed to be that I can watch what I want, when I want — and buying stuff on the iTunes store included the implicit promise of no advertisements.
At least Google lets me search the web without any full-screen magazine-style ads popping up.
Launch the app store to check for new versions?
I can’t install my software updates without accidentally seeing HUGE ads for new apps.
Launch iTunes to play my own music?
I can’t play my own, purchased music without accidentally seeing ads for other music — and also Apple’s increasingly thirsty, desperate plea for me to remember that they have a streaming service now. I don’t want it! I know where Spotify is if I wanted such a thing, the whole reason I’m launching iTunes is that I want to buy and own the music!
On my iPhone, I can’t even launch the Settings app to turn off my WiFi without seeing an ad for AppleCare+, right there at the top of the UI, above everything but my iCloud account. I already have AppleCare+; I bought it with the phone! Worse, at some point the ad glitched itself out, and now it’s blank, and when I tap the blank spot where the ad used to be, it just shows me this:
I just want to use my device, I don’t need ad detritus littering every blank pixel of screen real estate.
Knock it off, Apple.
The life-changing magic of a minimal standard library.
Prompted by Amber Brown’s presentation at the Python Language Summit last month, Christian Heimes has followed up on his own earlier work on slimming down the Python standard library, and created a proper Python Enhancement Proposal PEP 594 for removing obviously obsolete and unmaintained detritus from the standard library.
PEP 594 is great news for Python, and in particular for the maintainers of its standard library, who can now address a reduced surface area. A brief trip through the PEP’s rogues’ gallery of modules to deprecate or remove1 is illuminating. The Python standard library contains plenty of useful modules, but it also hides a veritable necropolis of code, a towering monument to obsolescence, threatening to topple over on its maintainers at any point.
However, I believe the PEP may be approaching the problem from the wrong
direction. Currently, the standard library is maintained in tandem with, and
by the maintainers of, the CPython python runtime. Large portions of it are
simply included in the hope that it might be useful to somebody. In the
aforementioned PEP, you can see this logic at work in defense of the colorsys module: why not remove it? “The module is useful to convert CSS colors between coordinate systems. [It] does not impose maintenance overhead on core development.”
There was a time when Internet access was scarce, and it was helpful to pre-load Python with lots of stuff, so that it could all be pre-packaged with the Python binaries on the CD-ROM you had when you first started learning.
Today, however, the modules you need to convert colors between coordinate
systems are only a
pip install away. The bigger core interpreter is just
more to download before you can get started.
Why Didn’t You Review My PR?
So let’s examine that claim: does a tiny module like colorsys really “not impose maintenance overhead on core development”?
The core maintainers have enough going on just trying to maintain the huge and ancient C codebase that is CPython itself. As Mariatta put it in her North Bay Python keynote, the most common question that core developers get is “Why haven’t you looked at my PR?” And the answer? It’s easier to not look at PRs when you don’t care about them. This from a talk about what it means to be a core developer!
One might ask whether Twisted has the same problem. Twisted is a big collection of loosely-connected modules too; a sort of standard library for networking. Are clients and servers for SSH, IMAP, HTTP, TLS, et al. all a bit much to try to cram into one package?
I’m compelled to reply: yes. Twisted is monolithic because it dates back to a similar historical period as CPython, where installing stuff was really complicated. So I am both sympathetic and empathetic towards CPython’s plight.
At some point, each sub-project within Twisted should ideally become a separate
project with its own repository, CI, website, and of course its own more
focused maintainers. We’ve been slowly splitting out projects already, where
we can find a natural boundary. Some things that started in Twisted, like incremental, have been split out; others are in the process of getting that treatment as well. Other projects absorbed
into the org continue to live separately, like
treq. As we
figure out how to reduce the overhead of setting up and maintaining the CI and
release infrastructure for each of them, we’ll do more of this.
But is our monolithic nature the most pressing problem, or even a serious problem, for the project? Let’s quantify it.
As of this writing, Twisted has 5 outstanding un-reviewed pull requests in our review queue. The median time a ticket spends in review is roughly four and a half days.2 The oldest ticket in our queue dates from April 22, which means it’s been less than 2 months since our oldest un-reviewed PR was submitted.
It’s always a struggle to find enough maintainers and enough time to respond to pull requests. Subjectively, it does sometimes feel like “Why won’t you review my pull request?” is a question we do still get all too often. We aren’t always doing this well, but all in all, we’re managing; the queue hovers between 0 at its lowest and 25 or so during a bad month.
By comparison to those numbers, how is core CPython doing?
Looking at CPython’s keyword-based review queue, we can see that there are 429 tickets currently awaiting review. The oldest PR awaiting review hasn’t been touched since February 2, 2018, making it almost 500 days old.
How many are interpreter issues and how many are stdlib issues? Clearly review latency is a problem, but would removing the stdlib even help?
For a quick and highly unscientific estimate, I scanned the first (oldest) page of PRs in the query above. By my subjective assessment, on this page of 25 PRs, 14 were about the standard library, 10 were about the core language or interpreter code; one was a minor documentation issue that didn’t really apply to either. If I can hazard a very rough estimate based on this proportion, somewhere around half of the unreviewed PRs might be in standard library code.
So the first reason the CPython core team needs to stop maintaining the standard library is that they literally don’t have the capacity to maintain the standard library. Or to put it differently: they aren’t maintaining it, and what remains is to admit that and start splitting it out.
It’s true that none of the open PRs on CPython are in
colorsys3. It does
not, in fact, impose maintenance overhead on core development. Core
development imposes maintenance overhead on it. If I wanted to update the
colorsys module to be more modern - perhaps to have a
Color object rather
than a collection of free functions, perhaps to support integer color models -
I’d likely have to wait 500 days, or more, for a review.
As a result, code in the standard library is harder to change, which means its users are less motivated to contribute to it. CPython’s unusually infrequent releases also slow down the development of library code and decrease the usefulness of feedback from users. It’s no accident that almost all of the modules in the standard library have actively maintained alternatives outside of it: it’s not a failure on the part of the stdlib’s maintainers. The whole process is set up to produce stagnation in all but the most frequently used parts of the stdlib, and that’s exactly what it does.
New Environments, New Requirements
Perhaps even more important is that bundling CPython together with the definition of the standard library privileges CPython itself, and the use-cases that it supports, above every other implementation of the language.
Podcast after podcast after podcast after keynote tells us that in order to keep succeeding and expanding, Python needs to grow into new areas: particularly web frontends, but also mobile clients, embedded systems, and console games.
These environments require one or both of:
- a completely different runtime, such as Brython, or MicroPython
- a modified, stripped down version of the standard library, which elides most of it.
In all of these cases, determining which modules have been removed from the
standard library is a sticking point. They have to be discovered by a process
of trial and error; notably, a process completely different from the standard
process for determining dependencies within a Python application. There’s no
install_requires declaration you can put in your
setup.py that indicates
that your library uses a stdlib module that your target Python runtime might
leave out due to space constraints.
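In the absence of any such metadata, library code that must also run on slimmed-down runtimes falls back to trial and error at import time. A hypothetical sketch (the helper name is made up):

```python
# trial and error at import time: the only portable way to discover
# whether the target runtime shipped a given stdlib module
try:
    import colorsys  # may be absent on space-constrained runtimes
except ImportError:
    colorsys = None

def hls_for_css(r, g, b):
    # hypothetical helper; fail loudly when the runtime's stdlib
    # omitted the module we depend on
    if colorsys is None:
        raise RuntimeError("this Python runtime does not ship colorsys")
    return colorsys.rgb_to_hls(r, g, b)
```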
You can have this problem even if all you ever use is the standard python on your Linux installation. Even server- and desktop-class Linux distributions have the same need for a more minimal core Python package, and so they already chop up the standard library somewhat arbitrarily. This can break the expectations of many Python codebases, and result in bugs where even pip install won’t work.
Take It All Out
How about the suggestion that we should do only a little a day? Although it sounds convincing, don’t be fooled. The reason you never seem to finish is precisely because you tidy a little at a time. [...] The ultimate secret of success is this: If you tidy up in one shot, rather than little by little, you can dramatically change your mind-set.
“The Life-Changing Magic of Tidying Up”
While incremental slimming of the standard library is a step in the right direction, incremental change can only get us so far. As Marie Kondō says, when you really want to tidy up, the first step is to take everything out so that you can really see everything, and put back only what you need.
It’s time to thank those modules which do not spark joy and send them on their way.
We need a “kernel” version of Python that contains only the most absolutely
minimal library, so that all implementations can agree on a core baseline that
gives you a “python”, and applications, even those that want to run on web browsers or microcontrollers, can simply state their additional requirements in their requirements.txt.
Now, there are some business environments where adding things to your
requirements.txt is a fraught, bureaucratic process, and in those places, a
large standard library might seem appealing. But “standard library” is a purely
arbitrary boundary that the procurement processes in such places have drawn,
and an equally arbitrary line may be easily drawn around a binary distribution.
So it may indeed be useful for some CPython binary distributions — perhaps even
the official ones — to still ship with a broader selection of modules from
PyPI. Even for the average user, in order to use it for development, at the
very least, you’d need enough stdlib stuff that
pip can bootstrap itself, to
install the other modules you need!
It’s already the case, today, that
pip is distributed with Python, but
isn’t maintained in the CPython repository. What the default Python binary
installer ships with is already a separate question from what is developed in
the CPython repo, or what ships in the individual source tarball for the interpreter.
In order to use Linux, you need bootable media with a huge array of additional programs. That doesn’t mean the Linux kernel itself is in one giant repository, where the hundreds of applications you need for a functioning Linux server are all maintained by one team. The Linux kernel project is immensely valuable, but functioning operating systems which use it are built from the combination of the Linux kernel and a wide variety of separately maintained libraries and programs.
The “batteries included” philosophy was a great fit for the time when it was created: a booster rocket to sneak Python into the imagination of the programming public. As the open source and Python packaging ecosystems have matured, however, this strategy has not aged well, and like any booster, we must let it fall back to earth, lest it drag us back down with it.
New Python runtimes, new deployment targets, and new developer audiences all present tremendous opportunities for the Python community to soar ever higher.
But to do it, we need a newer, leaner, unburdened “kernel” Python. We need to dump the whole standard library out on the floor, adding back only the smallest bits that we need, so that we can tell what is truly necessary and what’s just nice to have.
I hope I’ve convinced at least a few of you that we need a kernel Python.
Now: who wants to write the PEP?
Thanks to Jean-Paul Calderone, Donald Stufft, Alex Gaynor, Amber Brown, Ian Cordasco, Jonathan Lange, Augie Fackler, Hynek Schlawack, Pete Fein, Mark Williams, Tom Most, Jeremy Thurgood, and Aaron Gallagher for feedback and corrections on earlier drafts of this post. Any errors of course remain my own.
chunk are my personal favorites. ↩
Yeah, yeah, you got me, the mean is 102 days. ↩
Well, as it turns out, one is on
colorsys, but it’s a documentation fix that Alex Gaynor filed after reviewing a draft of this post so I don’t think it really counts. ↩
A quick and dirty guide to getting that little PyGame hack you did up and running on someone else’s Mac.
In honor of Eevee’s delightful Games Made Quick???, I’d like to help you package your games even quicker than you made them.
Who is this for?
About ten years ago I made a prototype of a little PyGame thing which I wanted to share with a few friends. Building said prototype was quick and fun, and very different from the usual sort of work I do. But then, the project got just big enough that I started to wonder if it would be possible to share the result, and thus began the long winter of my discontent with packaging tools.
I might be the only one, but... I don’t think so. The history of PyWeek, for example, looks to be a history of games distributed as GitHub repositories, or, at best, apps which don’t launch. It seems like people who participate in game jams with Unity push a button and publish their games to Steam; people who participate in game jams with Python wander away once the build toolchain defeats them.
So: perhaps you’re also a Python programmer, and you’ve built something with PyGame, and you want to put it on your website so your friends can download it. Perhaps many or most of your friends and family are Mac users. Perhaps you tried to make a thing with py2app once, and got nothing but inscrutable tracebacks or corrupt app bundles for your trouble.
If so, read on and enjoy.
If things didn’t work for me when I first tried to do this, what’s different now?
- the packaging ecosystem in general is far less buggy, and py2app’s dependencies, like setuptools, have become far more reliable as well. Many thanks to Donald Stufft and the whole PyPA for that.
- Binary wheels exist, and the community has been getting better and better at building self-contained wheels which include any necessary C libraries, relieving the burden on application authors to figure out gnarly C toolchain issues.
- The PyGame project now ships just such wheels for a variety of Python versions on Mac, Windows, and Linux, which removes a whole huge pile of complexity both in generally understanding the C toolchain and specifically understanding the SDL build process.
- py2app has been actively maintained and many bugs have been fixed - many thanks to Ronald Oussoren et al. for that.
- I finally broke down and gave Apple a hundred dollars so I can produce an app that normal humans might actually be able to run.
There are still weird little corner cases you have to work around — hence this post – but mostly this is the story of how years of effort by the Python packaging community have resulted in tools that are pretty close to working out of the box now.
Step 0: Development Setup
Get a good Python.
My recommendation is to use an official build from python.org; these are already compiled in such a way that they will run on a wide range of Macs, both new and old. Use a recent Python 3 version if you can; there are a variety of low-level improvements which make it better for redistribution.
- My previous recommendation was to use Homebrew; this was wrong. Don’t use Homebrew: it might build from source, and if it does, it might do so in a way which doesn’t work on a lot of the Macs out there. If you’re going to compile your own Python from source, you need to familiarize yourself with a bunch of tips and tricks to make sure you don’t enable CPU-specific optimizations, too-recent SDK requirements, and so on.
- This goes for pyenv too; it can accidentally configure Python in ways that are not good for redistributables.
- Definitely don’t use the system Python. Probably nothing will work.
You probably also want to use a
virtualenv for development. This post is
about how to build a for-real thing that other people can download, but part of
the magic of Python is the interactive, real-time dynamic nature of everything.
Running the full build pipeline every time you change a file or an asset is
slow and annoying. However, there’s a weird thing where certain parts of the
macOS GUI won’t work right (in PyGame’s case, mostly keyboard focus) unless
your code appears to be in an application bundle.
I made a dumb little tool, venvdotapp, which lets you fake out enough of this that the OS won’t hassle you: run pip install venvdotapp; venvdotapp inside the virtualenv where you’re making your pygame app.
Then pip install all your requirements into your virtualenv.
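Concretely, a development setup along these lines (the environment name is a placeholder):

```shell
# create and activate a development virtualenv
python3 -m venv game-env
. game-env/bin/activate

# install your dependencies, plus the shim
pip install pygame py2app venvdotapp

# make the venv masquerade as an app bundle so keyboard focus works
venvdotapp
```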
Step 1: Make an icon
All good apps need an icon, right?
When I was young, one would open up Resorcerer (or MPW, or CodeWarrior, or Project Builder, or Icon Composer, or, these days, Xcode) and create a new ICON resource (or cicn resource, or, these days, an .icns file). Nowadays there are some weird opaque xcassets files and Contents.json and “Copy Bundle Resources” in the default Swift and Objective C project templates, and honestly I can’t be bothered to keep track of what’s going on with this nonsense any more.
Luckily the OS ships with the macOS-specific “scriptable image processing system”, which can helpfully convert an icon for you. Make yourself a 512x512 PNG file in your favorite image editor (with an alpha channel!) that you want to use as your icon, then run it something like this:
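For instance, with assumed filenames:

```shell
# sips ships with macOS; convert the 512x512 PNG into an .icns
sips -s format icns icon.png --out icon.icns
```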
somewhere in your build process, to produce an icon in the appropriate format.
There’s also one additional wrinkle with PyGame: once you’ve launched the
game, PyGame helpfully assigns the cute, but ugly, default PyGame icon to
your running process. To avoid this, you’ll need these two lines somewhere in
your initialization code, somewhere before pygame.display.init (or, for that matter, pygame.display.set_mode):
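One hedged sketch of the idea (the icon filename is an assumption, and this uses pygame’s own set_icon rather than any particular pair of original lines), already wrapped in a platform check:

```python
import sys

def install_custom_icon():
    # hypothetical helper; "icon.png" is an assumed filename, and the
    # conditional keeps this Mac-specific workaround out of the way
    # on other platforms
    if sys.platform == "darwin":
        import pygame
        # must run before pygame.display.set_mode, or the default
        # PyGame icon will already be showing
        pygame.display.set_icon(pygame.image.load("icon.png"))
```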
Obviously this is pretty Mac-specific, so you probably want this under some kind of platform-detection conditional.
Step 2: Include All The Dang Files, I Don’t Care About Performance
Unfortunately py2app still tries really hard to jam all your code into a zip file, which breaks the world in various hilarious ways. Your app will probably have some resources you want to load, as will PyGame itself. In theory, packages=["your_package"] in your setup.py should address this, and py2app comes with a “pygame” recipe, but neither of these things worked for me.
Instead, I convinced py2app to splat out all the files by using the
not-quite-public “recipe” plugin API:
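A rough sketch of the shape of such a recipe (the class and attribute names here are made up; py2app’s recipe hook is a check(cmd, modulegraph) callable whose returned hints, like a "packages" list, cause those packages to be copied in as loose files):

```python
# Hypothetical sketch, not an exact recipe: py2app consults each
# recipe's check() while collecting modules; listing a package in the
# returned {"packages": [...]} makes py2app include it as a real
# directory of files instead of stuffing it into the zip archive.

class LooseFilesRecipe:
    def check(self, cmd, modulegraph):
        # name your own package and pygame so both get splatted out
        return {"packages": ["your_package", "pygame"]}

# registered, roughly, by poking it into the py2app.recipes namespace
# before calling setup():
#     import py2app.recipes
#     py2app.recipes.loose_files = LooseFilesRecipe()
```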
This is definitely somewhat less efficient than py2app’s default of stuffing the code into a single zip file, but, as a counterpoint to that: it actually works.
Step 3: Build it
Hopefully, at this point you can do
python setup.py py2app and get a shiny
new app bundle in
dist/$NAME.app. We haven’t had to go through the hell of code-signing yet, so it should launch at this point. If it doesn’t, sorry :-(.
You can often debug more obvious fail-to-launch issues by running the
executable in the command line, by running
./dist/$NAME.app/Contents/MacOS/$NAME. Although this will run in a slightly
different environment than double clicking (it will have all your shell’s env
vars, for example, so if your app needs an env var to work it might
mysteriously work there) it will also print out any tracebacks to your
terminal, where they’ll be slightly easier to find than in Console.app.
Once your app at least runs locally, it’s time to...
Step 4: Code sign it
All the tutorials that I’ve found on how to do this involve doing Xcode project goop where it’s not clear what’s happening underneath. But despite the fact that the introductory docs aren’t quite there, the underlying model for codesigning stuff is totally common across GUI and command-line cases. However, actually getting your cert requires Xcode, an apple ID, and a credit card.
After paying your hundred dollars, go into Xcode, go to Accounts, hit “+”, “Apple ID”, then log in. Then, in your shiny new account, go to “Manage Certificates”, hit the little “+”, and (assuming, like me, you want to put something up on your own website, and not submit to the Mac App Store) choose Developer ID Application. You probably think you want “Mac App Distribution” because you want to distribute a Mac app! But you don’t.
Next, before you do anything else, make sure you have backups of your certificate and private key. You really don’t want to lose the private key associated with that cert.
Now quit Xcode; you’re done with the GUI.
You will need to know the identifier of your signing key though, which should be output from the command:
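That command, presumably, is this one; it lists identities valid for code signing:

```shell
# copy the long hex identifier from the
# "Developer ID Application: ..." line of the output
security find-identity -v -p codesigning
```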
You probably want to put that in your build script, since you want to sign with
the same identity every time. Further commands here will assume you’ve copied one of the lines of results from that command and done
export IDENTITY="..." with it.
Step 4a: Become Aware Of New Annoying Requirements
Update for macOS Catalina: In Catalina, Apple has added a new code-signing requirement; even for apps distributed outside of the app store, they still have to be submitted to and approved by Apple.
In order to be notarized, you will need to codesign not only your app itself, but to also:
- add the hardened-runtime exception entitlements that allow Python to work, and
- directly sign every shared library that is part of your app bundle.
So the actual code-signing step is now a little more complicated.
Step 4b: Write An Entitlements Plist That Allows Python To Work
One of the features that notarization is intended to strongly encourage1 is the “hardened runtime”, a feature of macOS which opts in to stricter run-time behavior designed to stop malware. One thing that the hardened runtime does is to disable writable, executable memory, which is used by JITs, FFIs ... and malware.
Unfortunately, both Python’s built-in ctypes module and various popular bits of 3rd-party stuff, such as pyOpenSSL, require writable, executable memory to work. Furthermore, py2app actually imports ctypes during its bootstrapping phase, so you can’t even get your own code to start running to perform any workarounds unless this is enabled. So this is simply a requirement if you want to use Python at all, whether or not your project itself uses ctypes directly.
To make this long, sad story significantly shorter and happier, you can create an entitlements property list that enables the magical property which allows this to work. It looks like this:
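A reconstruction of that plist, assuming the standard hardened-runtime entitlement that re-enables writable, executable memory:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
          "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
  <dict>
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>
  </dict>
</plist>
```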
Subsequent steps assume that you’ve put this into a file called
entitleme.plist in your project root.
Step 4c: SIGN ALL THE THINGS
Notarization also requires that all the executable files in your bundle, not
just the main executable, are properly code-signed before submitting. So
you’ll need to first run the
codesign command across all your shared
libraries, something like this:
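A sketch, using the $IDENTITY and entitlements file from above (the paths are assumptions):

```shell
# sign every shared object in the bundle, enabling the hardened
# runtime and our entitlements on each one
find "dist/$NAME.app" -iname '*.so' -or -iname '*.dylib' |
    while read -r libfile; do
        codesign --sign "$IDENTITY" \
                 --entitlements entitleme.plist \
                 --force \
                 --options runtime \
                 "$libfile"
    done
```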
Then finally, sign the bundle itself.
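A similar sketch for the bundle:

```shell
# sign the app bundle itself, with the same identity and entitlements
codesign --sign "$IDENTITY" \
         --entitlements entitleme.plist \
         --deep \
         --force \
         --options runtime \
         "dist/$NAME.app"
```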
Now, your app is code-signed.
Step 5: Archive it
The right way to do this is probably to use dmgbuild or something like it, but what I promised here was quick and dirty, not beautiful and best practices.
You have to make a Zip archive that preserves symbolic links. There are a couple of options for this:
- open dist/, then in the Finder window that comes up, right-click on the app and “Compress” it
- cd dist; zip -yr $NAME.app.zip $NAME.app
Most importantly, if you use the
zip command line tool, you must use the
-y option. Without it, your downloadable app bundle will be somewhat
mysteriously broken even though the copy you had before you zipped it will be fine.
Step 6: Actually The Rest Of Step 4: Request Notarization
Notarization is a 2-step process, which is somewhat resistant to full automation. You submit to Apple, then they email you the results of the notarization; if that email indicates that your notarization succeeded, you can “staple” the successful result to your bundle.
The thing you notarize is an archive, which is why you need to do step 5 first. Then, you need to do this:
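With the Catalina-era tooling, the submission step was an altool invocation, roughly like this (the username and bundle ID are placeholders):

```shell
# submit the zipped app for notarization; Apple emails you the result
xcrun altool --notarize-app \
     --primary-bundle-id "YOUR_BUNDLE_ID" \
     --username "developer@example.com" \
     --file "dist/$NAME.app.zip"
```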
Be sure that
YOUR_BUNDLE_ID matches the
CFBundleIdentifier you told py2app
about before, so that the tool can find your app bundle inside the archive.
You’ll also need to type in the iCloud password for your Developer ID account here.2
Step 6a: Wait A Minute
Anxiously check your email for an hour or so. Hope you don’t get any errors.
Step 6b: Finish Notarizing It, Finally!
Once Apple has a record of the app’s notarization, their tooling will recognize it, so you don’t need any information from the confirmation email or the previous command; just make sure that you are running this on the exact same .app directory you just built and archived, and not a version that differs in any way.
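The stapling step itself, assuming the same dist layout as before:

```shell
# attach ("staple") the notarization ticket to the app bundle so it
# can be validated offline
xcrun stapler staple "dist/$NAME.app"
```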
Finally, you will want to archive it again, exactly as you did in step 5.
Step 7: Download it
Ideally, at this point, everything should be working. But to make sure that code-signing and archiving and notarizing and re-archiving went correctly, you should have either a pristine virtual machine with no dev tools and no Python installed, or a non-programmer friend’s machine that can serve the same purpose. They probably need a relatively recent macOS - my own experience has shown that apps made using the above technique will definitely work on High Sierra (and later) and will definitely break on Yosemite (and earlier); they probably start working at some OS version between those.
There’s no tooling that I know of that can clearly tell you whether your mac
app depends on some detail of your local machine.
Even for your
dependencies, there’s no auditwheel for macOS.
Updated 2019-06-27: It turns out there is an auditwheel-like thing for macOS: delocate. In fact, it predated and inspired auditwheel! Thanks to Nathaniel Smith for the update (which he provided in, uh, January of 2018, and I’ve only just now gotten around to updating...).
Nevertheless, it’s always a good idea to check your final app build on a fresh computer before you announce it.
If you were expecting to get to the end and download my cool game, sorry to disappoint! It really is a half-broken prototype that is in no way ready for public consumption, and given my current load of personal and professional responsibilities, you definitely shouldn’t expect anything from me in this area any time soon, or, you know, ever.
But, from years of experience, I know that it’s nearly impossible to summon any motivation to work on small projects like this without the knowledge that the end result will be usable in some way, so I hope that this helps someone else set up their Python game-dev pipeline.
I’d really like to turn this into a 3-part series, with a part for Linux (perhaps using flatpak? is that a good thing?) and a part for Windows. However, given my aforementioned time constraints, I don’t think I’m going to have the time or energy to do that research, so if you’ve got the appropriate knowledge, I’d love to host a guest post on this blog, or even just a link to yours.
If this post helped you, if you have questions or corrections, or if you’d like to write the Linux or Windows version of this post, let me know.
The hardened runtime was originally required when notarization was introduced. Apparently this broke too much software and now the requirement is relaxed until January 2020. But it’s probably best to treat it as if it is required, since the requirement is almost certainly coming back, and may in fact be back by the time you’re reading this. ↩
You can pass it via the --password option, but there are all kinds of security issues with that, so I wouldn’t recommend it. ↩