For the last few months I've been getting back into a game project I initially started some two years(!) or so ago. It's a puzzle game in which you solve picross-like puzzles in 3D space. If you're unfamiliar, a nonogram is traditionally a 2D puzzle in which you create an image from clues along the edges of a grid. These have historically appeared in both print and video games (Nintendo has a number of Picross titles beginning on the Game Boy), but (almost) always presented in a 2D manner.
I can't pinpoint when I got the idea, but I'm sure it was inspired in part by the puzzle elements in Half-Life: Alyx. I really enjoyed the minigames in which you use the multitool to solve some sort of spatial puzzle to open a locker or upgrade station. It's the sort of gameplay that I feel only works in VR. I wanted to pursue that and see what I could come up with. One idea was this puzzle game, which I've decided to run with for the time being.
As I write this, I'm also getting ready for Buffalo Game Space's first "Maker's March", a month-long event encouraging community members to complete a milestone defined at the start of the event. It's a level of accountability that I find exciting, as I am notoriously bad at meeting my own deadlines, so I will be participating for sure. So rather than just posting about it out of context, I wanted to get a blurb up first to set the stage (and also kick off devlogs on here in general).
And for a quick visual, the video below shows the state the project is (mostly) in. It's a tour of the puzzle editor, showing off saving, loading, and editing of puzzle data. Given the focus on puzzles, I'm going to need to make a lot of them, and building in-game is much easier than manually entering voxel data.
Excited to share more with folks. I plan on posting the actual milestone here once I actually have something semi-solid in place, so expect that probably tomorrow or Friday.
Printing Planners
Tuesday, February 28, 2023
Earlier this month I had a lot of little todos piling up and a bunch of deadlines approaching. Historically I haven't been the most organized when it comes to personal tasking, and I could see this problem rearing its ugly head again with the oncoming deluge of things-to-do™.
I've always kept a notebook lying around, so I started listing everything I knew I needed to tackle in it so I would be less likely to forget it all. While it worked, I found my notes went from an assortment of doodles and random thoughts to nothing but bulleted lists. While it was good I was getting shit done, I kinda wanted my notebook to be a bit more flexible in its use. So, I decided to look into picking up a planner. I wanted something with the following:
A calendar
A timeline of things happening in the day-to-day
Space for all my TODOs for the day
Extra space for random notes/journaling/etc.
While I'm sure there are plenty of existing planners that offer all that, I eventually gave up searching for an existing solution and designed one myself as an exercise.
It's been a while since I made anything zine-like, so this was a fun excuse to boot Scribus back up and put a little thing together. I made sure to design it so that I can print one on demand and use it for any month of any year (assuming the modern calendar doesn't change in size anytime soon). Leaving the year, month, and date fields blank lets me print as many pages as I might need for a month, fill out the fields by hand, and be good to go.
After using one for most of February, I found it to be fairly helpful. As with any new routine, you only get as much out of it as you put into it. When I was regularly blocking out my days, it proved (for the most part) incredibly helpful in keeping me on track and accomplishing my goals. When I didn't take the time to fill it out, though, my productivity and focus definitely dropped. Still, I think it was worthwhile enough for me to print one off for March. I even tried to saddle stitch it, since the staples in my February planner didn't quite hold out. It's an ugly first attempt, but if it works out better I'll be doing that every month going forward.
If you're interested in printing your own, you can find the cover and insert PDFs below. I print the cover single-sided on card stock, then print all but the outermost insert page double-sided so each day has room for blocked-out time elements on the left and todos/notes on the right.
After a lengthy hiatus in terms of site updates, I'm back with a site update. I stopped work on Deluxe some time ago and got a bit burned out with day job stuff. I also wasn't the happiest with how the static site generator had been working. So, to hopefully improve things here on the site and make my life easier, I did a bit of learning and 'Flask'-ified my site. I picked up a bunch of new skills working with Docker, Flask, and Nginx to get a nice new backend running my site, which should let me effectively drag-and-drop new posts.
(If you're seeing this post, that means it's working!)
Keeping this one short for now, just to tell my website I'm alive. Planning on having some news up on my current VR project (a puzzler prototype with promise) in the near future.
BGSjam XVIII Post-mortem
Thursday, June 09, 2022
Last weekend I took a break from working on Deluxe (which was more of a break from playing Riven, since day job stuff has me avoiding my office in the evenings/weekends) to hang out at Buffalo Game Space for BGSjam XVIII. My intent was to be around to assist with any projects that needed code or Blender help, but once it was clear that wasn't much needed, I made a little game.
Inspiration
The theme of the jam was "Passageways". There was some really great brainstorming at the start, and a lot of excellent ideas got tossed around. I couldn't stay late as I had double-booked myself, but the idea of a hallway kept coming back to me. Eventually this morphed into the great base-raiding sequences in Contra, and I thought it'd be fun to take a stab at remaking that in Godot.
The Result
While it's unpolished, I did end up succeeding at remaking that general concept into a game by the end of the weekend. Passageway (very original title) is a small arcade shooter in which you progress down hallway segments by destroying a big red orb at the end of the hall, all while killing or evading enemies.
The Process
The project began as all good jam games do - with a player controller. Quickly cobbling together some basic input mappings and attaching them to a KinematicBody got me most of the way there, but almost immediately afterward I decided the smart thing to do was recreate the Creature class/scene I'm (roughly) using for Deluxe. A root KinematicBody with a base collider, a Hitbox area, and a custom Health node with some signals for when it's damaged/killed gave me the base scene for the player, the enemies, and even the Hallway Switch. Inheriting and adding a new script that contains input parsing drives the player character, while some very simple movement logic drives the enemies that move.
The biggest issue I ran into, and I'm still not sure why it was happening, was that the inherited scripts did not fire off the parent functions when the child function was called by a signal connected on the base scene. I had to use the .parent_func() approach to get them to fire, which I'm fairly certain I don't have to do in the Deluxe project.
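For anyone hitting the same wall, here's a minimal sketch of that workaround in Godot 3.x GDScript. The file names and the _on_Health_died handler are hypothetical stand-ins, not the actual jam code:

```gdscript
# creature.gd - base script on the root KinematicBody
extends KinematicBody

func _on_Health_died():
    # Shared cleanup every creature should run when its Health hits zero.
    queue_free()
```

```gdscript
# player.gd - inherits the base script. The base scene connects the Health
# node's signal to _on_Health_died, but only this override was firing.
extends "res://creature.gd"

func _on_Health_died():
    ._on_Health_died()  # Godot 3 syntax to explicitly call the parent version
    print("player down")
```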
Once the creatures were in I started adding some basic weapons, which in turn shoot bullets that are also creatures. Super easy.
Next came the hallways themselves, and this was the first "new" thing I did. The game uses a base Hallway scene that has a Position3D indicating where the next hallway should be connected, and a script with a signal alerting a Hallway Manager when it's been "completed". This was meant to be fired off in various ways (kill all the enemies in a room, defeat a boss, etc.), but for the jam it ended up only being "Switch" objects - Creature scenes that, when destroyed, tell the Hallway that it's done.
When the Hallway determines that it's finished, it alerts the Hallway Manager, which in turn selects the next Hallway at random from a pool of Hallway scenes, instances it, and attaches it to the current Hallway. Then some more events fire off, triggering a door-opening animation on the current Hallway, marking it "completed", and promoting the "next" Hallway to current. I also manage animating the transition between Hallways here, and enable/disable the creature logic so the player can't move during the transition and the enemies can't move or shoot. This was a stylistic choice more than anything, as Contra does basically the same thing.
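To sketch that signal chain in GDScript (the node paths, method names, and the completed signal here are my guesses at a minimal version, not the shipped jam code):

```gdscript
# hallway.gd - attached to each Hallway scene
extends Spatial

signal completed(hallway)

func _on_Switch_died():
    # The Switch is a Creature; destroying it marks this Hallway done.
    emit_signal("completed", self)
```

```gdscript
# hallway_manager.gd - picks and attaches the next Hallway
extends Spatial

export(Array, PackedScene) var hallway_pool = []

func _on_Hallway_completed(hallway):
    # Pick a random Hallway scene from the pool and instance it.
    var next = hallway_pool[randi() % hallway_pool.size()].instance()
    add_child(next)
    # Snap it to the finished Hallway's connection Position3D.
    next.global_transform.origin = \
        hallway.get_node("NextHallwayPos").global_transform.origin
    next.connect("completed", self, "_on_Hallway_completed")
```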
With these elements in place, I blocked out five-ish Hallways and threw them into the Hallway Manager. The end result is the game you can download now on itch.
Wrapup
I'm pretty happy with how this turned out. It was the first game jam I've actually participated in in a while, and it was a lot of fun. Not only that, but I think this little project could turn into a much larger one at some point. I'd like to make it co-op multiplayer and functional on the BGS arcade cabinet, maybe add some aiming similar to Sin & Punishment, and obviously give it some real art and music/sfx. So we'll see.
PS
As mentioned at the start, I've fallen a bit behind on Deluxe progress. This whole month I'm dealing with a big release for my day job, and I've got a bunch of non-game-dev fun things on my calendar because hell yes it's Summer and I wanna get outside. BUT, I do suspect that in the next week or so I'll start working on it again and will have another update to share following that next sprint.
Deluxe: Sprint 3
Friday, May 20, 2022
Another two weeks, another sprint report. This time around I learned a whole lot about interacting with the Physics server via GDScript, and got some very cool AI stuff implemented. I also managed a little bit of cleanup, making the creature scenes clearer.
Consolidate Hitboxes and Hurtboxes
My initial creature scene used two colliders for damage - one for dealing damage to other colliders, and one for taking damage. The more I worked with it, the more redundant it seemed, so I decided to consolidate them into a single collider.
This was so easy to do. Thank you, Godot, for scene inheritance. All I had to do was update the physics layers checked by the Enemy/Player Hitboxes and add an Impact_Damage exported variable to replace the Hurtbox Damage export. I removed the Hurtboxes from the baseline creatures, deleted the baseline Hurtbox scenes/script, and everything just worked.
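For reference, the consolidated collider can be as small as this (a sketch; the Impact_Damage export matches the post, but the script body and method names are my guesses):

```gdscript
# hitbox.gd - one Area that both deals and receives damage
extends Area

export(int) var Impact_Damage = 1  # replaces the old Hurtbox's Damage export

func _on_Hitbox_area_entered(other):
    # The overlapping physics layers decide who can actually get hurt here.
    if other.has_method("take_damage"):
        other.take_damage(Impact_Damage)
```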
Teleportation
Teleportation in VR is a solved problem in most engines, and Godot is no exception. However, I like trying to implement my own version of things occasionally, when existing implementations are a bit heavy and it's not a terribly complex thing to build yourself. So I spun up a TeleportationManager that handles calculating when and where a player can teleport around the scene. In doing so I learned about a lot of features that I'll be utilizing in the future.
When the user wants to teleport, the manager runs a quick and dirty simulation of throwing an object out from the controller. The manager has parameters for the distance between each step of the throw calculation and a gravity value. On each step, starting at the controller's position and using the -Basis.Z of the controller's transform, I set a start and end point that make up one segment of the arc simulation. Using the PhysicsDirectSpaceState, I call intersect_ray() and check for any collisions along this segment. If there aren't any, the next start point becomes the current end point, and the new end point is calculated from the normalized vector between the previous start and end positions, gravity multiplied by the current iteration step, and the step length defined by the manager. If there is a collision, I use the normal to verify it's a flat surface the user can teleport to, and then set the potential teleport position to the intersection point.
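Sketched out, the stepped arc looks something like this (a simplified version; the parameter names and exact gravity scaling are my own, not the real TeleportationManager internals):

```gdscript
# teleportation_manager.gd (sketch) - walk an arc out from the controller
# and return the first teleportable hit position, or null if none.
extends Spatial

export(float) var step_length = 0.25
export(float) var gravity = 2.0
export(int) var max_steps = 40

func find_teleport_point(controller):
    var space_state = get_world().direct_space_state
    var start = controller.global_transform.origin
    var end = start - controller.global_transform.basis.z * step_length
    for i in range(max_steps):
        var hit = space_state.intersect_ray(start, end)
        if hit:
            # Only allow roughly flat, upward-facing surfaces.
            return hit.position if hit.normal.dot(Vector3.UP) > 0.9 else null
        # Next segment: continue along the current direction, bent downward
        # by gravity scaled with the iteration step.
        var dir = (end - start).normalized()
        start = end
        end = start + (dir - Vector3.UP * gravity * i * 0.01).normalized() * step_length
    return null
```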
Using the PhysicsDirectSpaceState is so much cleaner to me than using a RayCast node. It's similar to the way raycasts work programmatically in Unity, and much more flexible than the RayCast node IMO. There are also options for checking intersections using shapes, which I imagine I'll be able to use for things like spherecasting.
Rendering the arc utilizes the MultiMeshInstance node. I'd never tried it before, and it was shockingly easy to spin up. I set the MultiMesh up with a mesh in the editor, then programmatically set the number of instances and their positions based on the arc points cached during the throw simulation described above. I initially meant to use the cached data to dynamically create an arc mesh, but I kinda like the billboarded sprite look and might keep it.
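Feeding the cached points into the MultiMesh is only a few lines (a sketch, assuming the arc points were stored in an array during the throw simulation and the ArcMultiMesh node was configured in the editor):

```gdscript
# Inside the TeleportationManager (sketch); ArcMultiMesh is a
# MultiMeshInstance child whose mesh and 3D transform format were
# already set up in the editor.
func draw_arc(points):
    var mm = $ArcMultiMesh.multimesh
    mm.instance_count = points.size()
    for i in range(points.size()):
        # Place one instance at each cached arc point.
        mm.set_instance_transform(i, Transform(Basis(), points[i]))
```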
Graybox first level
Kinda wild that I've already hit a point where I need to start prototyping play spaces. The primary goal here was to have something a player could teleport around and enemy AI could fly about, searching for the player ship or another target of interest. CSGs were a huge help, letting me quickly cobble together some basic geometry that's slightly more interesting than the primitive meshes while still being easier and faster to hammer out than building a small level in Blender.
Navigating Flying NPC prototype
Most of this sprint was focused on figuring out how to get the enemy AI to actually work. The initial implementation worked on a theoretical level, but was way too computationally expensive (and also overkill, as I later worked out). The followup was simpler, extensible, and what currently exists in the project.
At first I thought it'd make sense for the enemies to have a good general awareness of their immediate surrounding geometry. Not just the player, but the level as well. I started digging into raycasting as a way for the AI to "look" in various directions and navigate towards a given target. Using the PhysicsDirectSpaceState (accessible via any Spatial node), I set one raycast to check whether any objects were blocking the way to the AI's target, followed by a pair of nested loops that would scan the surrounding space with calls to intersect_ray(). If there wasn't anything between the AI and the target, it set its next destination to the target's position. Otherwise, the looped raycasts would search a limited distance around the AI, and the point nearest the target with no collisions would become the AI's next position. Once the AI was near the position it selected, the process would repeat.
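That scan looked roughly like this (reconstructed from the idea described above, not the actual code; the angles, step sizes, and function name are arbitrary):

```gdscript
# Sketch of the raycast-heavy approach: sample points around the AI and
# keep the reachable one closest to the target.
func pick_next_position(space_state, my_pos, target_pos, radius):
    # Straight shot first: if nothing is in the way, just go to the target.
    if not space_state.intersect_ray(my_pos, target_pos):
        return target_pos
    var best = my_pos
    var best_dist = INF
    for yaw in range(0, 360, 30):
        for pitch in range(-60, 61, 30):
            var dir = Vector3.FORWARD \
                .rotated(Vector3.RIGHT, deg2rad(pitch)) \
                .rotated(Vector3.UP, deg2rad(yaw))
            var point = my_pos + dir * radius
            # Dozens of these per tick is what tanked the framerate.
            if not space_state.intersect_ray(my_pos, point):
                var d = point.distance_to(target_pos)
                if d < best_dist:
                    best_dist = d
                    best = point
    return best
```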
This worked in principle, but the high number of raycasts crushed the framerate (I think, more on that later). What probably would've passed as fine on desktop for a prototype was nauseating in VR, clocking in around 45 FPS on average.
So I started thinking about what I actually wanted this AI to do. After some whiteboarding, notebook doodling, and playing a bit of Descent, I came up with a state machine approach. Each enemy creature got an AIBrain node, which contains (among other things) the state the brain is currently in. Each AI tick (currently physics ticks, because this stuff is fast), an AIService evaluates what the brain knows and decides whether to change state, and which state to change to. The brains have a bunch of knobs for twiddling, like vision distance, field of view (FOV), boredom, pain tolerance, and more, that should allow for some distinct behaviors. Additionally, it's up to the creature itself to determine just how it should move or act based on the state of the brain and the data it contains. Thanks to class inheritance and signals, I can create new creatures that all move, look, and shoot differently with (hopefully) minimal fuss.
I captured some video of the initial prototype in action, with a number of ships sharing the same brain parameters. They all begin in the "IDLE" state, have a FOV of 90 degrees, can see 1 meter, and get bored after a couple seconds. If the player ship comes within their view distance, inside their field of view, AND isn't blocked by some other geometry, the enemy enters "CHASING" mode, moving very slowly while turning towards where it saw the player so it's eventually facing dead-on. If it loses sight of the player ship, whether because the player moves out of its FOV or view distance or slips behind a big blocking object, the enemy enters "SEARCHING" mode. In this instance, that means moving in the direction it last saw the player ship. While "SEARCHING", its 'boredom' level increases every second. If the boredom level hits or passes the brain's 'attention span' parameter, the enemy goes back to the "IDLE" state. However, if it sees the player before it gets too bored, it goes back to "CHASING".
There's additional logic for moving into weapons range to enter an "ATTACKING" state, and an "EVADING" state for moving out of the way of player projectiles or other threats, but that's not fully wired up yet and will tie into Sprint 4.
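A stripped-down version of that state evaluation might look like this (my sketch; the real AIBrain/AIService split has much more going on, and the variable names are guesses):

```gdscript
# ai_brain.gd (sketch) - holds the state and the tunable knobs; an
# AIService would call evaluate() every physics tick.
extends Node

enum State { IDLE, CHASING, SEARCHING, ATTACKING, EVADING }

export(float) var view_distance = 1.0
export(float) var attention_span = 2.0

var state = State.IDLE
var boredom = 0.0
var last_seen_position = Vector3.ZERO

func evaluate(can_see_player, player_position, delta):
    if can_see_player:
        state = State.CHASING
        last_seen_position = player_position
        boredom = 0.0
    elif state == State.CHASING:
        # Lost sight: head for the last known position.
        state = State.SEARCHING
    elif state == State.SEARCHING:
        boredom += delta
        if boredom >= attention_span:
            state = State.IDLE
```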
What's Next
Enemy AI attack and evasion
While the states are there, more testing is needed for attacking. Evasion also needs the logic actually implemented on the enemy, so it can do something more than just recognize it's being attacked.
Grip Swap
A holdover from the last sprint; it just got pushed as I felt the AI work was a higher priority.
Snap Turn
Another VR standard, the ability to snap turn the player in increments is super handy. I've already found myself missing it while testing AI, so this'll be a relatively high priority this sprint.
Pickups/Player Weapon Changing
Items that the player can collide with to increase health or upgrade their weapon are a big one. This will probably get split into a few different tickets, one being a rough draft of how the weapon upgrade system will work.
Wrapup
As always, things take longer than expected. Still, progress on this has been solid IMO. Thanks for following along so far. Be sure to follow me over at mastodon.gamedev.place and diode.zone for bite-sized content updates and videos as they get posted!