Some games die quietly. They get delisted, lose their multiplayer servers, fade into the digital void. Others get remastered by the original authors with slightly better graphics and a battle pass.
And then there’s the third way: you open the binary in Ghidra and start naming functions.
I played this top-down shooter from 2003 with my roommates, and recently I could recall the gameplay but not the name. Today I stumbled upon a recreation of it, complete with an incredible knowledge base covering its mechanics and implementation details.
If you have uv installed, you can run it right now with a single command: uvx crimsonland@latest. If that’s not amazing enough, the recreation was completed in only two weeks with the help of GPT-5.2, and the source code is available.
During my slow switch from macOS to Arch Linux, I’ve been frustrated with how the Apple Magic Mouse behaves, so this driver is a first step toward improving my experience with it. Currently it addresses my most common frustrations by preventing scrolling while the mouse is moving, resetting scroll tracking on mouse movement, and decreasing the maximum scroll acceleration.
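The first of those behaviors is easy to picture: watch for pointer motion and drop scroll deltas until the mouse has been still for a moment. Here is a minimal userspace sketch of that idea using python-evdev; the device path, the cooldown value, and the userspace approach are my own assumptions, not necessarily how the driver itself works.

```python
# Sketch only: suppress scroll events while the pointer is moving.
# Assumes python-evdev, read access to the Magic Mouse event node,
# and write access to /dev/uinput. The device path below is hypothetical.
import time
from evdev import InputDevice, UInput, ecodes

SOURCE = "/dev/input/event5"   # hypothetical Magic Mouse event node
MOVE_COOLDOWN = 0.15           # seconds of stillness before scrolling resumes
SCROLL_CODES = {ecodes.REL_WHEEL, ecodes.REL_HWHEEL}

dev = InputDevice(SOURCE)
dev.grab()                     # take exclusive access to the raw events
ui = UInput.from_device(dev, name="filtered-magic-mouse")

last_move = 0.0
for event in dev.read_loop():
    if event.type == ecodes.EV_REL and event.code in (ecodes.REL_X, ecodes.REL_Y):
        last_move = time.monotonic()   # pointer is moving: note when
    if event.type == ecodes.EV_REL and event.code in SCROLL_CODES:
        if time.monotonic() - last_move < MOVE_COOLDOWN:
            continue                   # drop scroll deltas during and just after motion
    ui.write_event(event)              # forward everything else, including SYN events
```

The real driver presumably works at a lower level than this, but the filtering has the same shape: motion timestamps gate whether scroll input is allowed through.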
Ruby developers deserve glamorous terminals too. I want Ruby developers to build terminal applications so beautiful that even people who “don’t like CLIs” find themselves captivated.
I’ve recently been using AI to write CLI utilities in Go with Charm, but I know Ruby much better than Go, so this is very exciting news. I’m even more excited about the upcoming work to make the libraries “feel more Ruby-like and idiomatic”.
Ruby 3.4.8 has been released as a routine update that includes bug fixes. The full details are available in the release notes on GitHub.
It’d be nice if the Ruby team maintained the Docker image now that Ruby has a release schedule. It always feels odd to see a new release and then have to wait for an image update before I can test a deployment.
The release also features the Ministral 3 series—three edge-focused models (3B, 8B, and 14B parameters) designed for superior cost-to-performance efficiency. These smaller models include multimodal and multilingual capabilities, making them suitable for edge deployments.
I’m excited to see small models continuing to advance. While I’m all in on Claude Code for code-related tasks, I enjoy using local models for simple copywriting tasks or for tasks requiring complete privacy.