Ludum Dare 49: 48 Hours, One Game and a Bunch of Physics Calculations

[linkstandalone]

Last weekend I took part in Ludum Dare 49. The theme was “Unstable” and I developed a platformer involving a bunch of unstable platforms. It’s called Watch Your Step and can be played in your browser here. You can also find the source code here. I haven’t done much work on my adventure game prototype over the last couple of weeks; instead, I’m writing this post about the game, how it came to be, and lessons I might apply to the development of other games.

Tools I used for this project:

I also used some of the editor scripts I’ve written for the adventure game prototype which proved useful for the sprite animations.

Planning

Ludum Dare is a biannual game jam that has taken place every April and October since 2002. Participants can choose to enter either the Jam (3 days) or the Compo (2 days). The Compo is a solo event and is taken a bit more seriously: all assets are meant to be created by the developer within the 48 hour time period, though you can exclude asset types from being judged. This was the fourth Ludum Dare in a row I have participated in and the fifth overall. For my first event I entered the Jam but found that working three days in a row was a bit too much. Obviously I could have taken it more slowly, but I ended up feeling pressed for time anyway. The Compo also fits nicely into a weekend, starting in the UK at 2am on the Saturday in April and 11pm on the Friday in October.

For the previous events, I would spend an hour planning and then try to squeeze out as much gameplay as possible, spending maybe three hours on art and an hour or two on sound, but I wasn’t particularly strict about it. This left my games with a lot of half-finished mechanics, as well as worse-than-“programmer” art and sound. This time, I planned out how I would work during the event to the hour, including when I would eat and sleep.

Screenshot of my schedule for the weekend. This did not change much from the original plan; the main change is the addition of time for creating levels.

As you can see from the spreadsheet above, most of the time is still spent on game development. However, there are also six hours just for art, three for music and sound, and two for any last minute “polish”. When I decided I would develop a platformer, I replaced two of the gameplay hours with hours to work on levels, as I usually ended up rushing to create levels on the Sunday evening of the event. As you can see from the finished result, this led to a much more cohesive experience compared to my previous jams.

Initial Ideas

The theme, “Unstable”, was announced at 11pm UK time on Friday 1st October. Before going to bed, I planned what type of game I would develop. I was initially interested in developing a platformer, as I would end up using some of the lessons learned to help with development of The Pilgrimage. However, I also considered interpreting the theme in a more abstract way: mental stability. The idea was that you would be a freelance writer (originally an office worker à la Office Space) trying to get work done while balancing their physical and mental health. It would be a point and click game, mostly taking place at a desk, where you would click to perform actions and manage resources such as stress, energy (tiredness) and money. You would keep playing day by day until you gave up. I thought it was an interesting idea that could be developed into a playable game on Saturday, with more mechanics and art added on Sunday. I went to bed on Friday close to midnight, happy that I had a good idea for the event.

The next day, I completely changed my mind and went back to the platformer. My reasoning was that it was a straightforward idea that would quickly yield something playable, compared to the more abstract resource management game. Also, I suspected that making office scenes look good quickly would be more challenging for me than creating sprites for a platformer. I somewhat regret the decision, as I feel like it would have been even more fun to work on something with a less common premise and mechanics. However, I think I’ll be more adventurous next jam.

The idea behind the platformer would be simple: a 2D side-scroller where the player tries to move a character from left to right, avoiding enemies and trying not to fall. The more unusual mechanic would be balancing on tightropes which would involve both moving the character along and using the arrow keys to keep balance.

Art Changes

The original art style I came up with in the two hours on Saturday ended up resembling Super Mario Bros. It even had a flag pole at the end of the level. On Sunday, I decided to change the style to sci-fi instead. This also added to the story, where the character is someone who needs to escape a collapsing city and avoid robots. Instead of a flag at the end of each level, the player would need to reach a portal. It then made sense to put a portal at the beginning of subsequent levels to make them feel more connected. I had also hoped to add a background to the game: a cityscape that would move more slowly relative to the camera as the player moved across the level, for that extra bit of immersion. Unfortunately I did not have the time in the end, so had to settle for the default blue background.

Screenshot of the original art style, vaguely resembling Super Mario.
The final art style I had for the game.

Sounds and Music

For most of the sounds, I used rFXGen again. For the death and victory sounds, I quickly created them with a MIDI instrument in Ableton using a couple of notes. A descending tritone always works well for a quick “uh-oh” error sound. For getting them playing at the right time, I didn’t go for anything particularly sophisticated: classes where actions happen that result in a sound contain a reference to the AudioSource and the AudioClip. In a more serious project (like, eventually, The Pilgrimage), I would use a dedicated audio manager that events can be sent to, which would also prevent too many sounds being played close to each other. Since you’re allowed some base code beforehand, I might end up using that in future projects once it is written.
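To illustrate the audio manager idea (a hypothetical Python sketch, not the Unity implementation), events would request sounds from a central manager, which can drop requests that arrive too soon after the same clip rather than letting every caller hit an AudioSource directly:

```python
import time

class AudioManager:
    """Hypothetical sketch of an event-driven audio manager that drops
    requests for a clip that was played too recently."""

    def __init__(self, min_interval=0.1, clock=time.monotonic):
        self.min_interval = min_interval
        self.clock = clock  # injectable clock, useful for testing
        self.last_played = {}

    def request(self, clip_name):
        now = self.clock()
        last = self.last_played.get(clip_name)
        if last is not None and now - last < self.min_interval:
            return False  # too soon after the same clip: drop the request
        self.last_played[clip_name] = now
        return True  # the real version would play the clip here
```

The interval and the per-clip bookkeeping are arbitrary choices for the sketch; a real manager might also cap the total number of simultaneous sounds.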

For the music, I was originally going for something in minor and a bit faster paced. The mood I was trying to set was kind of frantic, where the player was trying to escape as quickly as possible. I spent the first of the two hours trying to get something working with an arpeggiated Cmin7 chord, which ended up sounding pretty cool in isolation. It still sounded quite empty, though, and I just couldn’t write a lead that would work. I was trying to write a melody in C minor but kept floating around the Eb, so I decided to write the track in Eb major instead, and it ended up working quite well. It also produced a shift of mood to something a little more light-hearted, which worked better with the cartoonish movement of the main character. I wanted something that was not so short as to be repetitive but also not too long, and settled on a 12 bar line that would repeat throughout the track. When adding chords, I ended up settling on a 12 bar blues, which also influenced how I shaped the melody. I added a few extra layers including drums, and it ended up sounding quite good within the two hours I had allotted myself. I then stuck it on an AudioSource, had it looping with a small break, and that was it.

Equations of motion are annoying sometimes

As well as relying on the inbuilt physics engine, including components such as the PlatformEffector2D, I also needed to do some physics calculations for other aspects of the game: the floating platforms and the projectiles shot by enemies. I ended up spending a bit more time on them than I had hoped.

Floating Platforms

The floating platform that was in the submitted game. It handles correcting linear movement quite nicely but is quite slow to correct rotation.

As well as having collapsible platforms that would shake for a bit before falling, I also wanted platforms that the player could jump on which would sink when the player walks on them and then eventually return to their original position. I ended up doing this using equations of motion I learned in secondary school. In particular:

v² = u² + 2as

where v is the final velocity, u is the initial velocity, a is the acceleration and s is the displacement.

It took a while to figure out how to use this equation to get a result I liked. To summarise: the platform accelerates whenever it is not in its original position or not stationary (or both). It assumes it is accelerating in the direction opposite to its velocity (i.e. braking) and uses the equation above to find the velocity it would have at the target position. If v² is negative, it can be reasoned that the platform would never reach the target position under that constant acceleration, so it should accelerate towards the target; otherwise it would still be moving at the target, so it keeps braking. I applied the same formula to the angular rotation and it worked well enough, as can be seen from the video. I also restricted the maximum rotation in each direction, in this case to 45 degrees either way, mainly because it is a nice round number that would still give trouble to the player if they weren’t careful.
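The per-axis decision logic can be sketched in Python (a hypothetical reconstruction, with an arbitrary acceleration value; the real version applies this to a Unity Rigidbody2D each physics step):

```python
import math

def platform_acceleration(pos, vel, target, accel=5.0):
    """Decide this frame's acceleration for a floating platform.

    Uses v^2 = u^2 + 2as to ask: if the platform braked from here,
    would it still be moving when it reached its rest position?
    """
    s = target - pos  # signed displacement to the rest position
    if s == 0 and vel == 0:
        return 0.0  # already settled
    if vel == 0:
        return math.copysign(accel, s)  # displaced but stationary: push back
    # Speed squared at the target if we brake (accelerate against velocity).
    v_sq = vel * vel - 2 * math.copysign(accel, vel) * s
    if v_sq > 0:
        return -math.copysign(accel, vel)  # would overshoot: keep braking
    return math.copysign(accel, s)  # would stop short: push toward target
```

The same check, run separately for the angular velocity against the rest rotation, gives the rotation correction.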

Looking retrospectively at some of the other Physics 2D components that are available, the TargetJoint2D component would have been a suitable substitute, and it seems to correct rotation much better. It would be added in the same way I added the HingeJoint2D component for the rotating platform. At least I will know to use it next time I require it. I appreciated how much work the Unity devs have put into not just the physics system but also the components built on top of it to make game developers’ lives easier.

An example of the floating platform with a TargetJoint2D attached with a max force of 70 (the mass of platform is 4), a damp ratio of 1.0 and a frequency of 5. It’s still restricted to rotate a maximum of 45 degrees.

Determining Trajectories for Projectiles

A crude diagram showing the variables needed for a projectile calculation. Both the initial velocity (u) and the final velocity (v) are unknown but we only care about the initial velocity.

I also had robots that would shoot projectiles. I didn’t have time to get the robots themselves to move but I did get them firing projectiles at the player from a stationary position. In order to hit their target, I would have to figure out the initial velocity that the projectile would need to have. After wrestling with the equations of motion for about an hour, I realised that there was a range of possible values for the initial velocity as the final velocity did not matter. I tried using inequalities but didn’t get very far and realised the practical solution was to sample a few initial velocities and work out if they would hit the target. I did this by applying the following equation:

s = ut + ½at²

where t is time. I calculated t from the x component quite easily, as the horizontal acceleration is 0. I then used that value in the equation for the y component to calculate the initial y velocity, since I knew the displacement of the target (the player) from the firing position. If the magnitude of the resulting initial velocity was close to the magnitude I wanted, I would use it for the projectile; if not, I would pick another initial x velocity. For each projectile, 1000 samples are tried.
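The sampling approach can be sketched in Python (a hypothetical reconstruction; the function name, tolerance and the assumption of Unity’s default gravity of -9.81 are mine):

```python
import math

GRAVITY = -9.81  # assuming Unity's default gravity

def find_launch_velocity(sx, sy, target_speed=7.5, samples=1000, tol=0.1):
    """Sample initial x velocities, solving s = ut + 1/2at^2 for each.

    sx, sy: displacement from the firing position to the target
    (sx must be non-zero). Returns (ux, uy) whose magnitude is within
    tol of target_speed, or None if the target is out of range.
    """
    for i in range(1, samples + 1):
        # Try horizontal speeds from just above 0 up to the target speed.
        ux = math.copysign(target_speed * i / samples, sx)
        t = sx / ux  # no horizontal acceleration, so t = s / u
        # Solve the y component of s = ut + 1/2at^2 for u.
        uy = (sy - 0.5 * GRAVITY * t * t) / t
        if abs(math.hypot(ux, uy) - target_speed) < tol:
            return ux, uy
    return None
```

Any velocity it returns lands the projectile exactly on the target displacement; the sampling only decides which of the many valid trajectories has roughly the desired launch speed.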

I had settled on a target initial velocity magnitude of 7.5. I tried 10 but found it a bit too fast, as it was nearly impossible to dodge out of the way. People who ended up playing the game thought 7.5 was still too fast; with a bit more time to playtest I might have found this out before the submission deadline.

What I got out of it

Over the course of the 48 hours, I learnt a few things that could be useful for other game projects and one that is a lot more universal.

Deadlines are useful

As well as game development and my day job, I also enjoy making music, though a lot of the time I find myself overwhelmed by choice and pressured to make a perfect track. In this event I allotted myself only two hours and was able to produce a track that was acceptable. Deadlines can make it a lot easier to cut scope and not worry about making something that is “perfect”. They can also lead to taking shortcuts that turn out to be beneficial. Though I probably won’t give myself deadlines like “make a track in two hours”, I might try using deadlines over a longer period of time, such as making 10 tracks in a week, or the approximate deadlines I have for the phases of The Pilgrimage and for my devlog posts.

Being prepared before the event is important

In previous jams, I was never prepared for the event; I usually forgot to even have an empty Unity project ready. This time I wanted to take the event a bit more seriously, as I was working on it alongside my main project. First of all, I planned my schedule for the 48 hours, so it was easy to see sections I might want to move around, or cut out and replace with different activities. I also had editor scripts prepared from my other projects that would quickly create animations from sprites. This was very helpful, as I could change sprites on the fly and they would work perfectly in the game. It really paid off: I wasn’t spending time messing around with tools and how they worked, and instead could just develop the game and create its assets.

Unity Physics Components

As mentioned before, Unity not only has rigidbodies and colliders as part of the physics system but also effectors and other components that I was not really aware of until this jam. In particular, I ended up using the HingeJoint2D and PlatformEffector2D components. The platform effector is nice, as you can constrain collisions to an arc of a particular angle, which makes it easy to let characters jump onto platforms from underneath. The hinge joint was used for platforms that rotate around a point; they do not rotate by themselves but are moved by the player. The video below shows both components in use on one platform.

A rotating platform in the game with the player interacting with it. Notice that the player can jump onto it from below. This is due to the PlatformEffector2D component.

Though the main game I’m working on isn’t a platformer, it will have physics at least concerning collisions and now I am more aware of the extent of the components that Unity already provides that may be useful.

I did enjoy this Ludum Dare event (as I usually do) and found that planning and having a clear idea from the beginning really helped with developing a full (though simple) game. Next time in April, I want to try a more original and interesting idea like the initial resource management game, even at the cost of other elements of the game.

Sat, 09 Oct 2021 18:15:10 +0100

The Pilgrimage Devlog #2: Basic Animations, Editor Scripts and Sound

[linkstandalone]

The last time, I talked about the vision for The Pilgrimage and broke its development down into three phases. Currently I’m at the beginning of the initial adventure game prototype that will cover most of the main game loop and game mechanics. In the last few weeks I have looked at four things:

  1. Character being able to travel on the z-axis
  2. Idle and walking animations for the player and the guards
  3. Editor scripting to improve sprite processing
  4. Basic sounds and control

Moving on the Z-Axis

Last time, the player could move along the x-axis by walking left and right along the floor, and on the y-axis by using doors to move between floors. However, despite the game being 2D, I was interested in letting the player and NPCs move along the z-axis at certain points, so there would be multiple xy-planes to move in. It makes the world feel a little less flat. The example I wanted to use in the prototype is a cell that the player will start off in, though at the moment the player starts outside of the cell and can unlock it and move into it.

The basic design for travelling on the z-axis is as follows:

In order to make sure that objects that move along the z-axis appear in front of or behind the right objects, I am using sorting layers. The reason I can’t use z values to determine sorting order is that the player must have priority over every object at a particular z value, and the priority for rendering in 2D is sorting layer > order in layer > z value. At the moment there are two sorting layers: Default and Back. When there are more than two layers, it will probably be easier to label them according to their z value, e.g. Layer 0, Layer 1, Layer 2, so it will be easy to switch between them.
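As an illustration of that priority order (a hypothetical sketch, not Unity code), the back-to-front draw order can be thought of as a sort over three keys, with z only breaking ties within the same layer and order:

```python
def draw_order(sprites):
    """Back-to-front draw order for 2D sprites.

    sprites: list of (name, sorting_layer, order_in_layer, z).
    Sorting layer beats order in layer, which beats z; within a tie,
    larger z (further from the camera) draws first.
    """
    return [name for name, layer, order, z in
            sorted(sprites, key=lambda s: (s[1], s[2], -s[3]))]
```

This is why putting the player on a higher sorting layer draws them on top regardless of their z value.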

Physics2D does not work for 2.5D

Initially I was using Collider2D and Rigidbody2D in Unity for characters and walls. However, this led to a couple of problems:

Therefore, I found that I needed to switch from 2D physics to Unity’s regular 3D colliders that are very thin. The z-axis gateways, however, have to be wider on the z-axis so that characters will collide for both the foreground and background z values.

For picking up items, 2D colliders still work well with mouse clicks so I will be keeping them for now.

Basic Animations

I did not want to go into too much detail for the animations, though once I started working on them I wanted to make them at least passable. I wanted to at least communicate “this one is the player, who is a prisoner, and these are the guards.” I think in the end the guards’ animations came across quite well, while the player’s did not so much. This is for two main reasons. The first is that the player’s walk cycle was done with eight total frames rather than the twelve in the guards’ walk cycle. The second is that I used Krita for the player’s animations but Libresprite for the guards’. For the moment, though, they’re good enough for developing this prototype.

I found that a 48x48 sprite size was large enough to show detail without being overwhelmed by the amount of detail needed at larger sizes. The characters ended up being about 20x48 pixels when including arms. In terms of the style, particularly of the heads, although this might not be the final design, I liked the style that Pixel Chest is using for their characters in Sons of Valhalla. Although they don’t use real life human proportions, at about 5 to 5.5 heads tall they’re not “chibi” or “cartoony” either. The main stylistic element I borrowed from Sons of Valhalla was the eyes. They’re just one or two pixels wide but the effect works very well; I found that trying to make the eyes any bigger made the heads look too cartoony. In a way, they remind me of the way the eyes are done for characters in Darkest Dungeon, where they appear permanently in shadow.

A campfire scene from Darkest Dungeon. I love how their eyes are always shown in shadow yet the characters still manage to be so expressive. This will be more of a challenge with pixel art, relying more on animation.

The image above shows a few tests of heads. Most of them follow the style that I’m currently going for. The second column shows the different styles that I have attempted. The slightly narrower heads with single pixel eyes in the top right will probably work for any child characters in the game. Female characters are distinguished mainly by their smaller foreheads and generally narrower heads.

For the guards’ idle animation, I liked playing around with the shade of the pixels when he breathes in and the little bounce his torso gives when he breathes out.

From Krita to Libresprite

I had been familiar with Krita before and knew it was possible to create animations with it; however, I found creating the animations and switching between frames quite awkward. There was also no way to preview the animations, and Krita is quite awkward for creating single frame views of pixel art. Therefore, after creating the player animations, I tried creating the guard animations in Libresprite, the free fork of Aseprite. As well as working natively with pixel art, it has a separate preview window that lets you watch animations as you edit. You can also easily include multiple animations in the same file using tags and export them into separate sprite sheets. At the moment, I’ve found having a tag for the base character, and layers for the head and torso, each leg and each arm, to work quite well.

Screenshot of Libresprite and how I am using it. The reference layer is displayed at half opacity and used as a reference image when creating animations

Animation Controller and Override Controller

Now I needed the characters’ animations to run inside Unity. Fortunately, Unity provides an Animation Controller that has a state machine to support animations. I found that I needed three different parameters to switch between animations:

  1. Walking left
  2. Walking right
  3. Idle

Originally I had three states corresponding to each of these animations, though I quickly found I needed four: the idle state had to be split into idle facing left and idle facing right. That way, if a character is walking to the left and then stops moving, the idle animation facing left plays, and the idle animation facing right plays when stopping after walking to the right. However, only a single idle parameter is needed. The animation controller with the states is shown below.

The current animation controller for characters
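The transition logic of those four states can be sketched as follows (a hypothetical Python rendering of the controller’s behaviour, with made-up state names):

```python
def next_state(current, walking_left, walking_right):
    """Four animation states driven by two walk flags and one idle trigger."""
    if walking_left:
        return "WalkLeft"
    if walking_right:
        return "WalkRight"
    # A single idle trigger is enough: which idle plays depends on the
    # direction the character was last walking in.
    if current == "WalkLeft":
        return "IdleLeft"
    if current == "WalkRight":
        return "IdleRight"
    return current  # already idle: stay in the same facing
```

The key point is that the facing direction is carried by which state you were in, not by an extra parameter.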

If multiple characters have the same state machines for controlling animations, like the player and the guards currently have, Unity provides an Animation Override Controller that allows you to override the animations being used for a particular Animation Controller. This reduces quite a lot of duplication.

Editor Scripting for Automated Sprite Processing

While working on the animations and editing them, I found importing sprite animations into Unity to be very awkward. You need to make sure the sprite mode is set to multiple, then slice the sprites, then drag the spritesheet into the scene. This generates extra objects, including an Animation Controller and a Material, as well as the animation itself. If you want to update an animation and add or remove frames, you have to repeat this process. Therefore I decided to write a few editor scripts to automate it.

First I created an editor script that would allow me to create an animation with a much tidier interface. It exists as a menu item for any sprites that are set to multiple mode. This creates a window where I can set the size of each sprite (in the case of the characters this is 48x48 pixels), the frame rate and whether the animation will loop or not.

The spritesheet is sliced properly and then an animation clip is created with the same name as the spritesheet in title case and placed in the Animations directory. Sprites are sliced using the InternalSpriteUtility. The code to slice the spritesheet is shown below:

  Rect[] rects = InternalSpriteUtility.GenerateGridSpriteRectangles(texture, Vector2.zero, frameDimensions, Vector2.zero, false);

  List<SpriteMetaData> metas = new List<SpriteMetaData>();
  int rectNum = 0;

  foreach (Rect rect in rects)
  {
    SpriteMetaData meta = new SpriteMetaData();
    meta.rect = rect;
    meta.name = filenameNoExtension + "_" + rectNum++;
    metas.Add(meta);
  }

  // Only save if the number of sprites changes to avoid infinite reloads.
  if (metas.Count != importer.spritesheet.Length)
  {
    importer.spritesheet = metas.ToArray();
    EditorUtility.SetDirty(importer);
    importer.SaveAndReimport();
    AssetDatabase.Refresh();
  }

This triggers a reimport of the spritesheet with sprites and so I added an OnPostprocessSprites callback that will update the animation clip if one currently exists:

  private void OnPostprocessSprites(Texture2D texture, Sprite[] sprites)
  {
    var animationPath = SpriteAnimationGenerator.GetAnimationPath(assetPath);

    // Recreate animation if it exists
    if (File.Exists(animationPath))
    {
      var importer = (TextureImporter)TextureImporter.GetAtPath(assetPath);
      var animationClip = AssetDatabase.LoadAssetAtPath<AnimationClip>(animationPath);

      // Delay needed so texture can be loaded
      EditorApplication.delayCall += () => { SpriteAnimationGenerator.UpdateAnimationClip(animationClip, assetPath, animationPath); };
    }
  }

The EditorApplication.delayCall is important here: if UpdateAnimationClip is called without it, the sprites are not yet initialised at that point in time, so the animation clip would be empty. Running it as part of delayCall means it runs after all the inspectors update, by which time the sprites are initialised. The main part of UpdateAnimationClip looks like the following:

  public static void UpdateAnimationClip(AnimationClip animationClip, string assetPath, string animationPath)
  {
    var texture = AssetDatabase.LoadAssetAtPath<Texture2D>(assetPath);
    var importer = (TextureImporter)TextureImporter.GetAtPath(assetPath);

    EditorCurveBinding spriteBinding = new EditorCurveBinding();
    spriteBinding.type = typeof(SpriteRenderer);
    spriteBinding.path = "";
    spriteBinding.propertyName = "m_Sprite";

    var sprites = AssetDatabase.LoadAllAssetRepresentationsAtPath(assetPath);

    var keyFrames = importer.spritesheet.Select((s, index) =>
    {
      var objectReferenceKeyFrame = new ObjectReferenceKeyframe();
      objectReferenceKeyFrame.time = index / animationClip.frameRate;
      objectReferenceKeyFrame.value = sprites[index];
      return objectReferenceKeyFrame;
    }).ToArray();

    AnimationUtility.SetObjectReferenceCurve(animationClip, spriteBinding, keyFrames);
    var curveBinding = AnimationUtility.GetObjectReferenceCurveBindings(animationClip)[0];
    var curve = AnimationUtility.GetObjectReferenceCurve(animationClip, curveBinding);

    // Update Animation Controllers and Animation Override Controllers that use the animation

    AssetDatabase.SaveAssets();
    AssetDatabase.Refresh();
  }

Finally, to support automatically updating the animation when the spritesheet is edited, I added an OnPostprocessTexture callback that re-slices the spritesheet if the animation clip currently exists.

  void OnPostprocessTexture(Texture2D texture)
  {
    var animationPath = SpriteAnimationGenerator.GetAnimationPath(assetPath);

    // Recreate animation if it exists
    if (File.Exists(animationPath))
    {
      AssetDatabase.SaveAssets();
      var importer = (TextureImporter)TextureImporter.GetAtPath(assetPath);
      var animationClip = AssetDatabase.LoadAssetAtPath<AnimationClip>(animationPath);

      // Should all be the same size so just get first one
      var firstSpriteRect = importer.spritesheet[0];

      // Delay needed so texture can be loaded
      EditorApplication.delayCall += () => { SpriteAnimationGenerator.SliceSpritesheet(assetPath, new Vector2(firstSpriteRect.rect.width, firstSpriteRect.rect.height)); };
    }
  }

It was quite fun to delve into editor scripting in Unity for the first time. I had been aware of it when making games in Unity before, but thought it would only be useful for larger projects. However, something like this animation creator and automatic animation updater is useful even for small projects that use pixel animations.

Sound

For sound, I wanted to have some basic sounds to represent object interaction. I thought at first about actually recording sounds, though I just wanted something rough to test the Unity engine rather than getting the sounds exactly right. Therefore I used rFXGen to generate sounds for five main events:

  1. Player picks up an item
  2. Player unlocks the door
  3. Player fails to unlock the door
  4. Player tries to use a locked door
  5. Player uses a door

Screenshot of rFXGen. Though it has a simple interface, you can produce a wide range of arcadey sounds.

It made sense to have events 3 and 4 use the same sound. Since these are player-centric, I made them 2D and stereo so the player’s position does not affect their volume. I ended up creating an AudioManager class wrapping a single AudioSource. At the moment it’s really just a thin wrapper, and it’s difficult to have sounds overlap. When more sounds are added, including ambient ones, the logic to manage them will probably become more involved to avoid sounds clashing.

In terms of sound, I think I’ll be next experimenting with 3D sounds for player and guard footsteps, so that the guards get louder as they get closer to the player.

Custom Cursor Icons

As you may have noticed from the recordings of the game, I’ve also added custom cursor icons. I created a CursorManager class that then makes calls to Cursor.SetCursor just to have a central location to change the cursor icon.

Next Post

Next time, I hope to look at pathfinding for the guards as well as a sneak system so our character can avoid being caught by the guards and escape from the dungeon. Then the player will actually have a game over state.

Tue, 14 Sep 2021 19:56:28 +0100

The Pilgrimage Devlog #1: A Vision and a Start

[linkstandalone]

I’ve had the idea for an adventure game for the past couple of years, one which I think a lot of people might enjoy. Unfortunately I haven’t found a game like it yet, so I’ve decided to develop it myself. It will be called The Pilgrimage.

The Grand Vision

A few years ago, I read the first two Hyperion novels (Hyperion and The Fall of Hyperion) and enjoyed them a lot (particularly the first one). Although they are science fiction, the main structure of the first novel is inspired by The Canterbury Tales by Geoffrey Chaucer. In both, there is a framing device of a pilgrimage, and each of the pilgrims tells their story. I had thought about a game where, instead of playing a single character, you play each of these characters as they tell their stories, experiencing each in a flashback with an unreliable narrator, similar to Call of Juarez: Gunslinger. Some of the characters may have met each other before, some without knowing it. Some may have experienced the same event but from different perspectives. Each night, one character tells their tale around the campfire. Player choices would influence what happens in other tales as well as the frame story, allowing replayability. The story concludes when they reach their destination.

Though the details aren’t settled yet, I have a clear idea of the main vision:

Collage of current inspirations for the game, including The Banner Saga, Kingdom and The Lion’s Song

To incorporate all of these character’s stories into the game, there is an overarching loop for the game, where each day is one iteration:

  1. Characters of the pilgrimage talk to each other, with one of them telling the story of why they’re on this pilgrimage
  2. Player plays through that story as that character - This is the main content in the game
  3. Pilgrims continue their journey the next day until the evening, passing by towns with pilgrims joining along the way
  4. Repeat until they reach their final destination

Step 3 is inspired by The Banner Saga, where the clan moves from location to location. I want to incorporate this into the game as it provides a nice sense of progression and a bit of respite for the player. I think it will also make the journey feel more like a whole, rather than a series of minigames.

To get this game done will involve a lot of work, so I will need a solid plan.

The Plan

Obviously this is a large undertaking, especially for a solo dev. In terms of programming, I don’t think I will run into too much trouble, as most of the mechanics are straightforward. In terms of the look and feel of the game, the most complicated part will be getting the lighting right, and Unity takes care of that for the most part.

I actually tried working on The Pilgrimage back in February in Godot, though I only got as far as a basic dialogue system and a couple of characters on the screen before I decided to work on other things. The idea was to have a self-contained story for one of the characters (a Paladin), to have something that’s playable as a demo. It would also introduce the overarching Canterbury Tales/Hyperion framing device by having the paladin talk to the other characters around the campfire.

I think the main reason I moved on to other things was that the scale of the demo was too large to tackle. Up until this point, the largest project I’ve done in terms of scope and time spent was a platformer with a few levels. Not only would I be writing the game code, but I would also be working on sprites, animations and sound. This would all be without having much of an idea of what the final product would look like, or whether it would be worth the effort.

This time around I have a new plan: create a prototype that has most of the character-level gameplay, with art and sound only where needed to test the technical parts of the game, e.g. animations, triggering sound and lighting. That way, I’ll have a playable prototype to iterate on. While the framing device will probably be the main selling point of the game, players will spend most of their time playing the individual stories of the characters, so those have to be as enjoyable as possible. At the moment, I expect to complete this prototype in about three months, certainly by December.

Diagram showing the phases of the game, including what they will cover

After I’ve gotten the prototype to a point where it’s enjoyable and have had a few people playtest and enjoy it, I can then proceed with creating the demo I originally had planned, which will take place over a day of the pilgrimage from sunrise to sunset. I think I will stick with the paladin’s story for now, as I think it is interesting enough in its own right and involves all of the mechanics. Most of the programming should be done by the demo stage. I will need to add the travelling and camping parts, figure out how player choices will be saved for other parts of the game and tie it all together. The bulk of the work, however, will be on everything else.

This will take up most of the time spent on the project, especially since I’ll happily admit I’m mediocre in those areas compared to programming, so I will be learning a lot on the job. I expect this phase of the project to take six to nine months, but I won’t be surprised if I have to revise this estimate.

Finally, once I have a demo done that people can play, I can then spend the rest of the time working on all the other stories and having player choice influence the main story. I expect this to take up to three years, as I plan to have around seven to eight separate stories, though I may end up being more efficient in producing content due to the processes learnt while working on the demo.

Progress so far

Since last week, I’ve been working on the adventure game prototype. The main premise is that the player is imprisoned in a dungeon and has to escape without being caught. This involves interacting with objects (e.g. keys, doors), talking to prisoners or guards, and sneaking around, so it feels like the perfect example for a small proof of concept covering the main character gameplay. As you can see from the video, I have the dialogue system in place using Yarn Spinner, as well as Unity 2D lights and hand-drawn normal maps for the stones in the wall. The player can also pick up items (in this case keys) and use them to unlock doors.

Video showing early gameplay from the prototype, including dialogue and using items
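
For illustration, the pick-up-and-unlock logic can be reduced to something like the following. This is a hypothetical Python sketch of the logic only; the actual prototype is written in C# within Unity, and none of these names come from the real code.

```python
class Player:
    """Minimal player with an inventory of picked-up items."""
    def __init__(self):
        self.inventory = set()

    def pick_up(self, item):
        self.inventory.add(item)

class Door:
    """A locked door that opens only with its matching key."""
    def __init__(self, required_key):
        self.required_key = required_key
        self.locked = True

    def try_open(self, player):
        # Unlock the door if the player is carrying the right key
        if self.locked and self.required_key in player.inventory:
            self.locked = False
        return not self.locked
```

In the real thing, the interaction is of course driven by the engine (colliders, input events and so on), but the underlying state change is just this: the door checks the player’s inventory for its key.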

Over the next couple of weeks, I want to get a few things done with the prototype:

After that I will be looking into pathfinding (so guards can go through doors) and a basic stealth system.

At the moment I plan to write these devlogs about once a month, though I’ll probably write the next one when the items above are finished, detailing anything interesting I find during development.

Wed, 18 Aug 2021 22:01:49 +0100