I’m a queer Creative Programmer, with a side dish of design and consultancy, and a passion for research and artistic applications of technology. I work as a Technical Director at Flammable Penguins Games on an unannounced title.
I've had a long career in games and I still love them; I also spent a few years building creative tools at Adobe.
Love living in London.
When I'm not programming, playing games, roleplaying, learning, or reading, you can typically find me skating or streaming on Twitch.
Recently, I attended GodotCon and gave a talk at Develop North. GodotCon was eye-opening, and at Develop North, I discussed using Godot professionally. I shouldn't post the talk here, as it's better suited for a conference environment, but much of the information is already on my website in articles.
I want to address a phenomenon being discussed in many circles: how does a professional with two decades of experience differentiate themselves from a university student, or even a teenager, using Unreal 5? Many of the initial indicators we used to rely on are no longer reliable. The accessibility and power of middleware, AI code tools, and future AI generation tools make it harder to spot the talent or effort behind a project at first glance.
Of course, a seasoned veteran can examine code, art, or anything they're familiar with and judge its quality. However, most consumers can't, and even seasoned veterans struggle to tell a vertical slice from a student team apart from one made by an experienced team working fast.
This leads to a problem I faced with saving and loading in our current game. Serialization is an interesting topic, often involving level replays, networking, and various state serialization issues. However, if you're using Godot, Unity, or Unreal, it's almost trivial to serialize a section of the scene graph or complex objects to JSON, save it, and load it back.
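To make that concrete, here's a minimal sketch of the "just dump it to JSON" approach, written in Python with an invented bit of game state rather than any engine's actual API:

```python
import json

# Hypothetical game state: an 8x8 board, a move list, and a score.
state = {
    "board": [[0] * 8 for _ in range(8)],        # mostly empty squares
    "moves": [{"from": [0, 1], "to": [2, 2]}],
    "score": 1200,
}

# Saving is one line to serialize, one to write.
with open("save.json", "w") as f:
    json.dump(state, f)

# Loading is just as trivial.
with open("save.json") as f:
    state = json.load(f)
```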
Many developers do this—probably more than I'd like to admit. I expressed my frustration about this in the cafeteria at Tentacle Zone, and a mobile app developer told me, "Of course, that's what you do, just do that and move on."
I responded that I cared about loading times, that I wanted to use platform leaderboards' binary attachment systems (which are very limited in size) for replays, and that I wanted to support infinite undo. I had reasons. His response: “very expensive…”. Before I dive deeper into that “why” argument, let's talk technical for a moment.
I'd been working on compressing the save data and optimizing its entropy. To clarify what this means: entropy optimization involves reducing redundancy in data. Think of it like this: if you have a game board where most squares are empty, storing "empty, empty, empty, piece, empty, empty" is inefficient. Instead, you could store "3 empty squares, then 1 piece, then 2 empty". This is run-length encoding, a simple but effective compression technique. Alternatively, you might just store the positions of the non-empty squares.
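A minimal sketch of run-length encoding over a made-up flat board list (0 meaning empty, 1 meaning a piece):

```python
def rle_encode(squares):
    """Collapse a flat list of squares into (value, count) runs."""
    runs = []
    for value in squares:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return runs                       # e.g. [[0, 3], [1, 1], [0, 2]]

def rle_decode(runs):
    return [value for value, count in runs for _ in range(count)]

board = [0, 0, 0, 1, 0, 0]
assert rle_decode(rle_encode(board)) == board
```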
Blind, naive use of binary storage isn't necessarily good either; domain-specific knowledge matters. For example, moves in memory need undo/redo actions pre-cached for responsiveness, but you don't need to save those to a file. Moves can be compressed by inferring the possible moves and storing a selection index, as sketched below. Batch operations and run-length encoding can further reduce size.
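Here's a rough sketch of that selection-index idea, with a toy rule set standing in for the real game's move generator; the only requirement is that the list of legal moves is generated in the same deterministic order on save and on load:

```python
def legal_moves(state):
    # Toy rules: from square n you may step to n+1 or jump to n+2, in that order.
    return [("step", state, state + 1), ("jump", state, state + 2)]

def encode_move(state, move):
    # Store only the index into the legal-move list: a small integer,
    # which is easy to bit-pack later.
    return legal_moves(state).index(move)

def decode_move(state, index):
    return legal_moves(state)[index]

move = ("jump", 4, 6)
assert decode_move(4, encode_move(4, move)) == move
```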
We often store values in 16- or 32-bit integers for convenience, but saved state values might have a limited discrete range, allowing for bit-packing. Some data might be temporary and not need serialization at all. The result is a tight but fragile data format that takes longer to write.
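A small illustration of bit-packing; the field widths here are made up for the example, not our actual save layout:

```python
def pack_fields(fields):
    """fields: list of (value, bit_width) pairs; returns the packed bytes."""
    acc = 0
    total_bits = 0
    for value, width in fields:
        acc |= (value & ((1 << width) - 1)) << total_bits
        total_bits += width
    return acc.to_bytes((total_bits + 7) // 8, "little")

# A piece type 0-3 (2 bits), a board index 0-63 (6 bits), a flag (1 bit):
packed = pack_fields([(2, 2), (42, 6), (1, 1)])
print(len(packed))  # 2 bytes instead of three multi-byte integers
# Unpacking is the same walk in reverse: shift off each field's width in order.
```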
People might look at a JSON save file and dismiss its size, but network transfer and storage restrictions (like for leaderboard replays) make compression important. It enables new features and keeps the game snappy with fast loading and saving. Performance is a feature!
The fragility of that format can lead to lost development time, so I opted for MessagePack, a compromise between flexibility and compactness. MessagePack is essentially a binary version of JSON: it keeps JSON's flexibility but with significantly smaller files. For example, where JSON stores a 32-bit integer as text (up to ten characters plus punctuation), MessagePack stores it in at most five bytes. It's like JSON's more efficient cousin, offering about a 50% size reduction while staying self-describing, so it remains easy to inspect and evolve during development.
Of course, we tend to zip JSON, and MessagePack can be zipped as well. You should still apply domain-specific and binary compression methods where they pay off, but this wrapper lets us store things at a high level, with version and metadata fields, in an order-insensitive manner.
Easy-to-use libraries exist for it; I wrote one for Godot and might share it in a future blog post. In our case, this brought file sizes down by 60% while keeping load times under 100ms – a sweet spot between pure JSON and fully custom binary formats.
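To show the size difference, here's a rough sketch using Python's third-party msgpack package (pip install msgpack), not the Godot library I mentioned; the state structure is invented for the example:

```python
import json
import zlib
import msgpack

state = {"version": 3, "score": 1200, "moves": [[1, 4], [2, 7], [0, 3]] * 50}

as_json = json.dumps(state).encode("utf-8")
as_msgpack = msgpack.packb(state)

print(len(as_json), len(as_msgpack))        # MessagePack is noticeably smaller
print(len(zlib.compress(as_json)),          # and both still compress further
      len(zlib.compress(as_msgpack)))

assert msgpack.unpackb(as_msgpack) == state # round-trips just like JSON does
```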
Sharing this middleware sacrifices some of my specialist advantage. Then again, I’m building on the strengths of others, and frankly the quality of game design, art, and animation among juniors is significantly higher than when I entered the industry, thanks to these tool improvements. Paradoxically, programming skills are significantly worse, yet that seems to matter less by the day.
However, the domain-specific knowledge and problem breakdown showcased here are part of craftsmanship and lead to a better product. Would a user notice this within the first two hours on Steam? Maybe not, but I believe attention to detail contributes to long-term quality and reputation.
In the age of big data, achieving widespread recognition is difficult. However, you can develop a reputation within your niche (turn-based strategy games, werewolf romance, solitaire card games, etc.). If that means a portion of your established user base will try any title you release, that's valuable. But they often attach to a studio or brand, not to an individual artisan.
So back to the why question. Does it matter? Will decades of honing our craft remain valuable in the future? I believe so, and here's why: while one-line JSON serialization might work for many cases, understanding WHEN AND HOW to optimize – and more importantly, when not to – comes from experience. This knowledge lets us build games that not only work but excel in ways that matter to our specific audience. It's not about using complex solutions for their own sake, but about knowing exactly when they'll make a meaningful difference.
The trade-off between development time and optimization isn't just about today's release – it's about building a foundation for future features and maintaining the kind of performance that keeps players coming back. That's the real value of experience in an age of accessible tools.
Though maybe this is wishful thinking from a middle-aged lady who knows there is no damn good reason for your indie game to have a loading screen, but it still does. Maybe well-crafted isn't the hallmark of quality and success in the future.
I’m still left with questions, and I have some larger ideas on the state of the industry I want to write about in the near future, but that is for another time.
Social Bits and Bobs
Website: Claire Blackshaw
Flammable Penguins: Small Press Publishing
Mastodon: @kimau@mastodon.gamedev.place
Twitter: @EvilKimau
YouTube: YouTube
LinkedIn: Claire Blackshaw
Twitch: Kimau
Github: Kimau
TikTok: @EvilKimau
Tumblr: Forest of Fun
Book list: Good Reads