- New more consistent layout on many screens, with improved accessibility.
- Fixed game simulation result not displaying on Game Setup screen.
- Pass light/dark theme to release notes.
There are some cool game changes in development, but they aren’t in this version. This version does, however, contain significant changes to ads.
- Added Google User Messaging Platform consent flow for ads.
- Removed non-rewarded interstitial ads.
- Fixed campaign not saving after loading a complete game.
A minor update - the site now supports dark mode on devices that are set to dark mode. A later version of the app will also pass through your custom theme setting when opening site pages such as release notes.
Actual change to policy
Analytics is no longer opt-out. For users subject to the GDPR, it is now made explicit that this data is collected on the basis of legitimate interest rather than consent.
There is a summary section that is intended to be as concise and human-readable as possible.
Firstly, a simple change: interstitial video ads have been removed. These could previously appear at the end of each game; they no longer will.
The bigger change is the introduction of Google’s User Messaging Platform to manage consent. This combines the previously separate dialogs asking for consent for analytics and consent for displaying ads into a single flow.
There are advantages and disadvantages to this, which I’ve expanded on over on my personal blog.
For the last few versions I’ve been implementing the ability to share game modes. Actually triggering a share was made user-facing in 0.20, but the app could load shared modes (with varying degrees of success) before then.
It was a multi-step process because it more or less requires a website to exist. For example, this page includes a link to “try opening in the app”. If you click that link on desktop, it will take you back to the same place, since this page is the fallback for the sharing link.
An additional feature I developed at the same time as sharing is the ability to embed little game mode widgets in the site.
Another thing I’ve been doing that uses the sharing link is creating YouTube videos of interesting game modes featuring the AI.
- Added new options to modify game to allow a variable number of human players.
- Prevent AI from playing while popups are displayed.
- Fixed games with no human players not showing results.
- Fixed AI not understanding misère.
- Updated UI icons.
- Added settings sharing.
- Changing to a non-bounded topology now automatically enables “allow cell reuse”.
- Added some network information to local network screen.
I have tried to be aggressive about keeping the size of the app down as it has grown. I made a specific decision early on to avoid image assets, and I went through quite a lot of hassle to enable build features that keep the code small.
A while ago Google introduced a new way to package and distribute Android apps called Android App Bundle.
Previously when distributing an Android app, the developer would produce an Android Application Package (often called an APK, based on its standard file extension) that included everything needed to run on a variety of Android devices. This could include things like assets at different resolutions and code for different device architectures that were not all needed on any single device.
It was possible to produce different, more specialized APKs, but it was quite a bit more effort to set up a build pipeline to produce and distribute them. With App Bundles, Google Play does this for you. An App Bundle contains the same wide array of content for different devices, but Google Play generates a device-specific APK for each device that downloads it.
The result is quite staggering - much more than I expected. The graph here shows the size of the app in MB over time, with the conspicuous drop at the end when I switched to using App Bundles. (One interesting thing to note is that the line splits into two at the end. The size is now slightly variable based on device and the split represents the smallest and largest APKs possible).
The Android version is now distributed using an App Bundle on Google Play.
0.20.1 includes updated logic for “fairness calculations”. This has already been pushed to the server, so it is in fact available in all versions when running with cloud simulation enabled.
This came about mainly because of my efforts to try and generate interesting game settings. My idea was to sort all the setting combinations by fairness, generate new ones based on them (via some minimal genetic algorithm), and then repeat until some good results fall out.
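The “minimal genetic algorithm” isn’t spelled out in detail, but the score-sort-mutate loop it describes might look something like this sketch (the `evolve`, `fitness`, and `mutate` names and parameters are my assumptions, not the app’s actual code):

```python
import random

def evolve(settings_pool, fitness, mutate, generations=10, keep=10):
    """Minimal genetic loop: score candidates, keep the fittest,
    and refill the pool with mutated copies of the survivors."""
    pool = list(settings_pool)
    for _ in range(generations):
        # Sort current candidates by fitness, best first.
        pool.sort(key=fitness, reverse=True)
        survivors = pool[:keep]
        # Breed mutated copies of random survivors to restore pool size.
        pool = survivors + [mutate(random.choice(survivors))
                            for _ in range(len(pool) - len(survivors))]
    return max(pool, key=fitness)
```

In the real setting the candidates would be whole game-setting combinations and the fitness function would be the fairness score from simulation.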
My first attempts did not work well. The game simulation works by playing a few turns using the normal AI, and then seeing what it thinks its chances of winning from the current position are. I defined “fairness” as: take the chance of winning of the best player and the chance of winning of the worst player, and then calculate the ratio of the two. Here are some of the types of games it generated at first:
- Games with lots of players. Since nobody can win, it is perfectly fair.
- Games with a win line that is too big to reach.
- Games with a target score that is too big to reach.
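The fairness metric described above could be sketched as below; note how the “nobody can win” case (all win chances zero) comes out as perfectly fair, which is exactly the loophole those degenerate modes exploited. The function name and the worst/best direction of the ratio are my assumptions:

```python
def fairness(win_chances):
    """Ratio of the worst player's win chance to the best player's.

    1.0 means perfectly fair; values near 0 mean one player dominates.
    """
    best, worst = max(win_chances), min(win_chances)
    if best == 0:
        # Degenerate case: nobody can win at all, so the players are
        # "equal" and the metric trivially reports perfect fairness.
        return 1.0
    return worst / best
```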
I fixed some of this by doing basic processing on settings before simulating. If the win line won’t fit, or there aren’t enough unique win lines to achieve the target score, I just reject the combination. Also, for now, I limited generation to just two players.
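The pre-simulation checks could look something like this sketch for a plain rectangular board (the board model, function name, and parameters are my assumptions; the real game has more topologies than this, so the win-line count here is only an approximation):

```python
def plausible(width, height, win_length, target_score, players):
    """Reject setting combinations that can never produce a winner."""
    # The win line must fit on the board in at least one direction.
    if win_length > max(width, height):
        return False
    # Count horizontal and vertical placements of the win line
    # (a simplification: real boards may also have diagonals,
    # wrapping topologies, and so on).
    horizontal = height * max(0, width - win_length + 1)
    vertical = width * max(0, height - win_length + 1)
    if horizontal + vertical < target_score:
        return False
    # For now, generation is limited to two-player games.
    return players == 2
```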
The results were better, but basically just a more complex version of the above: games in which the AI did not manage to actually achieve a win in its simulations.
Refining the AI
One possible problem I identified was that the “chances of winning” calculation isn’t really about winning. It’s actually a kind of fuzzy number combining winning and tying for first. In this case, tying for first is something I want to avoid. To fix this I parameterized the core AI engine, so the way it assigns a score to outcomes can be varied. There were initially two options:
- Original AI: Winning scores 1, tying for first scores 0.5, everything else scores 0
- “Win only” AI": Winning scores 1, everything else scores 0
This means I can now look for games that have a high fairness and in which the AI has a reasonably high chance of actually winning.
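The two scoring schemes above could be expressed as interchangeable functions plugged into the parameterized engine; the outcome labels here are hypothetical stand-ins for whatever the engine actually reports:

```python
def original_score(outcome):
    """Original AI: wins count fully, ties for first count half."""
    if outcome == "win":
        return 1.0
    if outcome == "tie_first":
        return 0.5
    return 0.0

def win_only_score(outcome):
    """'Win only' AI: anything short of an outright win scores nothing."""
    return 1.0 if outcome == "win" else 0.0
```

Because the engine only needs a function from outcome to score, swapping schemes doesn’t require touching the search itself.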
First derived game mode
Final updated AI
Separately from my goal of generating interesting game modes, I also took this opportunity to improve the AI itself. For games with more than two players, the (normal) AI now takes relative positions into account. For example, with three players the following outcomes all have unique scores:
- Finishing first, with no ties.
- Tying for first.
- Finishing second, with no ties.
- Tying for second.
- Finishing third.
This hopefully stops the situation in which a multiplayer game ends up with several of the AIs “giving up” because they’ve determined they can’t possibly finish first.
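One way to give every distinct finishing position a unique score is to rank players by final score and step each tie halfway down towards the next position. The exact values below are an illustration of the ordering in the list above, not the app’s real numbers:

```python
def outcome_score(final_scores, player):
    """Score an outcome so every distinct finishing position ranks uniquely:
    outright first > tie for first > outright second > tie for second > ...

    Assumes at least two players.
    """
    n = len(final_scores)
    mine = final_scores[player]
    better = sum(1 for s in final_scores if s > mine)      # players ahead of us
    tied = sum(1 for s in final_scores if s == mine) - 1   # players tied with us
    # Base value from position, minus a half-step penalty per tie partner.
    position_value = (n - 1 - better) / (n - 1)
    return position_value - tied * 0.5 / (n - 1)
```

With three players this yields 1.0, 0.75, 0.5, 0.25, and 0.0 for the five outcomes listed above, and with two players it collapses back to the original win/tie/loss scheme of 1, 0.5, 0.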