Holiday Laziness & Employment

Happy New Year! Had I written a post when I was supposed to, a “Merry Christmas” would have been in order, but it is what it is.

I’ve got some great news. I’ve been hired! Woo-hoo! The position is an entry-level software engineer (programmer). I’m incredibly excited to get started so that I can continue to expand my knowledge of programming. The paycheck doesn’t hurt either :). In the future, I would like to write a post on my experience with the interview process and highlight some of the things I feel led to my success. Hopefully, it will help some of y’all still looking, or those of you seeking new opportunities outside of your current employer.

“But what’s been going on with the game?”, you ask? I knew you were eager to find out ;). Well, progress has been pretty slow over the past few weeks. This is partially due to the holiday season, and partially (mostly) due to my addiction to Dragon Age: Inquisition (so good *drool*).

The Good

Both Nate (my teammate) and I decided to get together for a day and work on the project collaboratively. He decided to break ground on modeling our characters (we’re planning on one male and one female), and I continued to hack away at the movement code. It cannot be stressed enough how effective this strategy is. We were able to cover more ground in one five-hour session than we ever had in a week of working independently. I attribute the success to motivating one another, because we could both see the results of the other person’s work. It also kept each of us honest. If one of us hit a wall and needed a break, the break was spent constructively scoping out what the other had accomplished, rather than skimming through Facebook for the next half hour.

I am most excited about the thing I have no involvement in: beginning to model our characters! Nate broke ground on the female model first.

female-2 female-1

Through some manner of wizardry, Nate imports these reference images into Blender and transforms them into this:

Torso of Female model.

As stated, this is just the beginning of the modeling process, which is why there is little to no definition in the model. I am still very excited to see this part of the game getting started, and cannot wait to update you all further on its progress. The code update is fairly minimal, and may even come off as boring to some. Nevertheless, the additions, which revolve around refining acceleration and deceleration, were greatly needed.

post1

The two highlighted “if” statements, if you remember, were once a single statement. This, we found, caused the player avatar to freeze in place if Speed exceeded MaxSpeed, which happened often. It would remain frozen as long as input continued (the player kept swiping down), and resumed as normal once input stopped and the deceleration method executed. Thus, the offending portion was moved to its own “if” statement, which, if true, sets Speed to MaxSpeed. One problem solved.

Another portion of the first statement worth mentioning is where we set the Acceleration variable equal to a set of values. This was done to make the acceleration look more gradual, and therefore lifelike. A similar set of logic exists within the “Decelerate” method as well, but it may prove unnecessary in the end. The foundational logic is there, and it looks pretty good. Some more tweaking of the numbers wouldn’t hurt, though.
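
To make that concrete, here is a rough sketch of the refactored logic. This is an approximation for illustration, not our exact script; the names Speed, MaxSpeed, and Acceleration come from the real code, and everything else is simplified.

```csharp
using UnityEngine;

// Hypothetical reconstruction of the refactor described above.
public class PlayerMovement : MonoBehaviour
{
    public float Speed = 0f;
    public float MaxSpeed = 10f;
    public float Acceleration = 0.5f;

    void updateSpeed()
    {
        // Accelerate while the player keeps swiping downward.
        if (Input.touchCount > 0 &&
            Input.GetTouch(0).phase == TouchPhase.Moved &&
            Input.GetTouch(0).deltaPosition.y < 0f &&
            Speed < MaxSpeed)
        {
            Speed += Acceleration * Time.deltaTime;
        }

        // Formerly part of the statement above. Splitting it out and
        // clamping prevents the avatar from freezing in place when
        // Speed overshoots MaxSpeed mid-swipe.
        if (Speed > MaxSpeed)
        {
            Speed = MaxSpeed;
        }
    }
}
```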

The Bad

Yes, it was good that we sat down and broke some serious ground as a team. However, I think that is the only time we worked on the project between then and now. Matters are only likely to get worse, with me newly hired and him still occupied by his job, family, and school work. We may have to hold more scheduled pow wows in the future in order to see this thing through.

The rest of “The Bad” involves running into more coding problems, of course. I had some extra time after finishing up the acceleration and deceleration bits, and decided to break ground on getting our avatar to move right and left of the path, depending on where the player’s fingers were on the screen. It did not go well, but it wasn’t a total disaster either.

Governance for moving left and right. Housed in the updateSpeed() method.

It should be stated that the logic depicted above actually works as intended: it will move the avatar depending on where the fingers are located. However, when playtesting, the avatar jitters in the direction of the finger, and then resets along the path. Thinking that something within the MoveCharacter() method was interfering, I temporarily disabled the whole method. Sure enough, the player then moved toward wherever the finger was located.

Okay, it is a good thing that the movement logic works, but we have two additional problems to deal with now.

1.) The avatar was free to move anywhere it wanted. We only wanted the avatar to be able to move a set distance to the left and right of the path.

2.) The camera, since it is a child of the avatar, follows the avatar along that axis. We only want it to follow the avatar’s forward movement. This means we will have to remove it as a child of the avatar and write an additional script specifically for the camera’s movement with regard to the avatar.
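
For the curious, here is the rough shape those two fixes might take. Neither is written yet, so treat this as a hypothetical sketch; it assumes forward travel is along the x-axis (as in our movement code) and side-to-side movement is along z.

```csharp
using UnityEngine;

// Problem 1: keep the avatar within a set distance of the path.
public class LateralClamp : MonoBehaviour
{
    public float pathZ = 0f;      // z position of the path's center (assumed)
    public float maxOffset = 2f;  // how far the avatar may stray

    void LateUpdate()
    {
        Vector3 pos = transform.position;
        pos.z = Mathf.Clamp(pos.z, pathZ - maxOffset, pathZ + maxOffset);
        transform.position = pos;
    }
}

// Problem 2: a camera that is no longer a child of the avatar and only
// tracks the avatar's forward movement.
public class FollowCamera : MonoBehaviour
{
    public Transform player;  // assign the avatar in the inspector

    private Vector3 offset;

    void Start()
    {
        offset = transform.position - player.position;
    }

    void LateUpdate()
    {
        Vector3 pos = transform.position;
        pos.x = player.position.x + offset.x;  // follow forward motion only
        transform.position = pos;
    }
}
```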

What’s Next

For me, the big focus will be resolving the two issues mentioned in “The Bad”. Nate will continue to chip away at the female model in his free time. Collectively, we need to come up with a course of action. Setting a time and day of the week where we can both sit down and hack at this seems like the best way to go for the foreseeable future.

– Andrew Medhurst

C# – I’m Getting Better At This

What I’ve Been Up To

My educational background indicates that I am technically a game designer, but through this project, you would have thought I went to school for programming. I’ve become so proficient at leveraging C# that I feel confident in applying to a few entry-level developer positions. This doesn’t mean I’ll get them, but I feel like I could do them.

That doesn’t mean I’ve stopped applying to Game Designer positions, because trust me, I have been…A LOT! As a matter of fact, I’ve been applying to anything closely related to video games: UI/UX Designer, Game Tester, .NET Developer; nothing is off the table. When you are poor, you don’t really get to be picky.

On another note, I’m working on giving my portfolio a new home! This is what I have so far.

Homepage

Pretty impressive, huh? Well, it isn’t much to look at right now, but there is certainly more to come. In the end, the idea is for the page to be a portfolio piece in and of itself. If you’d like to see my humble homepage in the flesh, feel free to hop on over to andrewmedhurst.github.io.

The Game

My update for the game is going to be pretty short this week. I was able to get the avatar to run along our path. It also accelerates when input is identified, and decelerates when none is present. This basic functionality is as far as it goes right now. In the coming weeks, the feature will be altered in an effort to make it feel more natural and intuitive.

If you look at the inspector section on the right-hand side, you’ll see the global “Speed” variable gradually increases by the value of “Acceleration” as the user swipes downward on the mobile device. It then decreases by “Deceleration” when input is no longer received.

For now, we have a very primitive solution for getting the main camera to follow the avatar as it moves along the path. It is attached as a “child” object to the player avatar, and will follow it wherever it goes.

Child

Main Camera object is housed under the Player (avatar), causing it to follow the Player.

The Code

Now for how it works:

MovementCode

Now with comments! Hopefully, this will make the code easier to digest.

As usual, all of this code resides in the updateSpeed() function (I really need to change that name), which is then called in the Update() function.

Let me just say that TouchPhase is my savior. Without it, this would probably be much more complicated and take much longer to figure out. My goal is to have the comments in the image describe what is going on. In a nutshell, when finger 1 moves in a downward direction (swiping), Speed is updated at a rate of (Speed times Acceleration) per second. When no fingers are detected, the code controlling the rate of Deceleration is executed.

Now, located underneath all of this is the formula that actually pushes the avatar forward. It invokes the transform component and tells it to Translate (update the position of) the avatar along the X-axis by the current Speed value over Time.deltaTime (seconds), with Speed itself driven by our first finger’s deltaPosition. The zeroes represent no change to the Y or Z axes.
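
Put together, the logic looks roughly like this. It is a sketch based on the description above, not a copy of our actual script, and the initial values are placeholders.

```csharp
using UnityEngine;

public class RunnerSpeed : MonoBehaviour
{
    public float Speed = 0f;
    public float Acceleration = 0.01f;  // scales the swipe into speed
    public float Deceleration = 2f;

    void Update()
    {
        updateSpeed();
    }

    void updateSpeed()
    {
        if (Input.touchCount > 0)
        {
            Touch finger = Input.GetTouch(0);
            // Finger 1 moving downward is a swipe; feed its change in
            // position into Speed.
            if (finger.phase == TouchPhase.Moved && finger.deltaPosition.y < 0f)
            {
                Speed += -finger.deltaPosition.y * Acceleration;
            }
        }
        else if (Speed > 0f)
        {
            // No fingers detected: run the deceleration logic.
            Speed = Mathf.Max(0f, Speed - Deceleration * Time.deltaTime);
        }

        // The formula that actually pushes the avatar forward. The zeroes
        // mean no change on the Y or Z axes.
        transform.Translate(Speed * Time.deltaTime, 0f, 0f);
    }
}
```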

Coming Up…

As stated previously, this week will revolve around getting movement just right. This will include setting some sort of delay for when no input is detected, so that the avatar will not lose speed in between swipes. Only after no input has been detected for a second or two will the deceleration code execute. Both the Acceleration and Deceleration variables will likely be multiplied by another very small float value, in an effort to give both features a more gradual look and feel. Right now they seem rather rigid and sudden. MaxSpeed will also be revised, as the avatar already moves pretty fast at a fraction of that value.
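
One possible way to implement that delay, sketched out ahead of time (hypothetical; graceTime is a placeholder name, and none of this is in the project yet):

```csharp
using UnityEngine;

public class SwipeDecay : MonoBehaviour
{
    public float Speed;
    public float Deceleration = 0.5f;
    public float graceTime = 1.5f;  // seconds to coast between swipes

    private float lastInputTime;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            lastInputTime = Time.time;  // remember the most recent touch
        }
        else if (Time.time - lastInputTime > graceTime && Speed > 0f)
        {
            // Only bleed off speed once the grace period has passed, so
            // the avatar doesn't slow down between individual swipes.
            Speed = Mathf.Max(0f, Speed - Deceleration * Time.deltaTime);
        }
    }
}
```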

In the event that all of that gets wrapped up in a timely manner, I’ll begin exploring how to make the avatar strafe left and right, depending on how far away finger 1 is relative to the avatar’s location. This feature will be used to dodge obstacles, once those are implemented.

As always, thanks for your continued interest in this project, and thanks for reading!

– Andrew Medhurst

Cube-atar has ups!

First of all, I want to take a moment to celebrate that I am publishing another blog post only 7 days after my previous one. This may not seem like a big deal to most, but I like to celebrate small victories.

With that out of the way, I’m also pleased to say that the milestone for this week was achieved. Our avatar can jump! Hence, the title of this post. The mechanic works just as described in my previous post. When two fingers are pressed to the screen at the same time, the cube jumps and then lands on the ground.

In this post, I’ll elaborate on this mechanic, as well as delve into the fundamentals of our avatar’s forward movement.

Update()

Just as a reminder, this is a function native to Unity’s API, and the code within it runs every frame. I have used it to call two other functions, updateSpeed() and MoveCharacter().

Update

The lines of code within these two functions could technically reside directly in Update(). We decided to separate them into individual functions, then call those from Update(), to keep things looking clean and organized. The final line in Update() moves the avatar forward along its x-axis when the variable “Speed” is greater than 0. This will likely move into the updateSpeed() function in the future, as it looks a bit out of place in its current location.
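
In rough strokes, the structure looks like this (a simplified sketch, not the actual script):

```csharp
using UnityEngine;

public class PlayerController : MonoBehaviour
{
    public float Speed;

    void Update()
    {
        updateSpeed();    // reads touch input and adjusts Speed
        MoveCharacter();  // keeps the avatar on its path

        // The out-of-place final line: nudge the avatar forward along
        // its x-axis whenever it has speed.
        if (Speed > 0f)
        {
            transform.Translate(Speed * Time.deltaTime, 0f, 0f);
        }
    }

    void updateSpeed() { /* covered below */ }
    void MoveCharacter() { /* iTween-based; not shown */ }
}
```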

updateSpeed() – Jumping

This is the hub for all of our touch functionality as of right now. The name of the function doesn’t accurately reflect that, but that is because its original purpose was solely to update the Speed variable. Obviously, things change.

JumpCode

Lines 60 through 65 give the avatar the conditions that need to be met in order to jump. Before I break down the code, we are going to need some background on Input.GetTouch(0) and TouchPhase.Began in order for this to make sense.

Input.GetTouch(integer): When a finger is pressed to the screen, it is stored in an array, and that touch is assigned a specific ID. When a finger is removed, it is deleted from the array. We can recall a touch with GetTouch(integer). Since arrays are zero-indexed, GetTouch(0) is the first touch detected.

TouchPhase.Began is the phase a touch is in on the first frame it is detected. It applies for that single frame alone and will not continue, regardless of how long the finger remains pressed to the screen.

Now, look at the associated “if” statement. In English, it says: “If 2 fingers are detected against the screen, AND the Touch Phase Began for finger 1 while finger 2 is also touching the screen, execute the jump.”

This should indeed make the cube jump. However, there is one small problem…

There appears to be nothing in place to determine what constitutes a single jump. This means that the player can continuously tap the screen with two fingers and essentially fly through the entire game world, which is most certainly not the mechanic we’re aiming for.

Luckily, the issue can be resolved with the placement of a simple boolean called “canJump”. Its default value is set to true.

canJump

canJump, which must be true in order to jump, is set to false when a jump is executed. It is then set back to true when the avatar lands.

The floor-checking code, which resides in the MoveCharacter()* function, essentially asks if the character’s Y position (up and down) is on par with the floor’s Y position. If it is, the jump is considered to be finished, and the player can execute a jump once again. Now the jump looks more like something we would want.

* Unfortunately, that is all that will be shown of MoveCharacter(). Much of the code consists of example code from an asset called iTween, which is not free. I have to respect the work of others.
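
That said, the jump gate itself can be sketched without showing any iTween code. Here is a rough, hypothetical version of the conditions described above; floorY is a stand-in value, and the actual jump happens elsewhere.

```csharp
using UnityEngine;

public class JumpGate : MonoBehaviour
{
    public float floorY = 0f;     // the floor's Y position (assumed)
    private bool canJump = true;  // defaults to true, as described

    void Update()
    {
        // Two fingers down, and the second touch just began: jump, but
        // only if a jump isn't already in progress.
        if (canJump &&
            Input.touchCount == 2 &&
            Input.GetTouch(1).phase == TouchPhase.Began)
        {
            canJump = false;
            // The jump itself is handled by the iTween-based
            // MoveCharacter() code, which isn't shown here.
        }

        // The floor check: once the avatar's Y position is back on par
        // with the floor's, the jump is finished and jumping is allowed
        // once again.
        if (!canJump && transform.position.y <= floorY)
        {
            canJump = true;
        }
    }
}
```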

I know you can’t see it, but you are just going to have to trust that I’m actually tapping a screen to get it to jump ;).

updateSpeed() – Movement

The next portion of updateSpeed() is intended to update the global Speed variable by the change in position of a finger swiped in a downward direction. Technically, this feature is broken right now; iTween’s pathing feature is somehow interfering with its execution.

MoveCode

Even still, the logic remains the same. A local variable called nbtouches is declared on line 70 to keep track of how many fingers are currently pressed to the screen. In all honesty, we could likely do without the for loop that follows, but it is there as a fail-safe. On line 75, if we have more than zero touches but fewer than two, we call GetTouch(i).

Remember GetTouch? We are basically calling for the first finger touching the screen once again, although this time we are not listening for a second finger. Instead, we are listening for a change in the deltaPosition (location) of that finger over the course of Time.deltaTime (seconds). That formula updates Speed, and thus feeds the transform.Translate formula we have running in Update(). Pretty cool, huh?
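
Reconstructed from that description, the snippet looks roughly like this (an approximation, not the code in the screenshot):

```csharp
using UnityEngine;

public class SwipeSpeed : MonoBehaviour
{
    public float Speed = 0f;

    void updateSpeed()
    {
        int nbtouches = Input.touchCount;    // fingers currently on screen
        for (int i = 0; i < nbtouches; i++)  // the fail-safe loop
        {
            if (nbtouches > 0 && nbtouches < 2)  // exactly one finger
            {
                Touch touch = Input.GetTouch(i);
                if (touch.phase == TouchPhase.Moved &&
                    touch.deltaPosition.y < 0f)
                {
                    // The finger's change in position over the frame
                    // feeds Speed, which in turn feeds the
                    // transform.Translate call running in Update().
                    Speed += -touch.deltaPosition.y * Time.deltaTime;
                }
            }
        }
    }
}
```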

Up Next…

The biggest focus is going to be getting the forward movement and the iTween path to work in concert correctly. In my experience, this is harder than it sounds: taking an external asset and its code and trying to marry it up with your native work is surprisingly mind-numbing. Getting the avatar to accelerate and decelerate in a believable, organic-looking manner should take up a good chunk of time as well.

Additionally, we may encounter an issue with jumping while moving forward. We just want to ensure that the movement velocity carries over to jumping properly.

Thanks for tuning in and for your continued interest!

– Andrew Medhurst

The Hiatus

Obviously, my original intention of providing weekly updates on my current project didn’t hold up. It has been, oh, 4 months since my initial post announcing my bold undertaking? Calling it a hiatus would be an understatement. I wish a legitimate excuse existed, but there is none. Sometimes life just runs off on you.

My absence, however, should not be interpreted as abandonment of our small team’s cause. We have been at work. Most of our progress involves sifting through Unity’s API in an attempt to piece together our game’s mechanics via C# scripts. We have also been combing through Unity’s Asset Store in an attempt to find preexisting tools that we could use. You know, working smarter, not harder, and all that.

It goes without saying that an undertaking of this kind rarely runs smoothly, and our case is no different. In this post, I will go through (hopefully keeping it as brief as possible) the steps we’ve taken, the steps we’ve retraced, and some of the challenges we are immediately facing.

The First Prototype

For whatever reason, despite mobile devices being our platform of choice for the game’s final release, I pushed us in the direction of creating our first prototype using the keyboard as the input method. This did help us get something working on the screen, but it ended up being a waste of time, as the API commands for touch input are vastly different from those for simple key presses on a keyboard. Thankfully, we switched input methodologies prior to completing the prototype, which means the prototype is still in the works.

Despite all of this work now being technically archived, I want to go through it and explain how some of the scripts will act as a sort of template for the work that now needs to be done for touch input.

Let’s start from ground zero. This is our player avatar:

avatar

Pretty sophisticated stuff, right? Okay, maybe not, but it gets the job done for the time being.

The floor started out with a simple white texture, which we stuck with for a while. Once we started implementing movement, however, it was difficult to determine if the avatar was behaving the way we wanted it to. Thus, we elected to use a checkered texture so it was easier to see if the avatar was even moving at all.

Now that we had our avatar, what exactly did we want it to do? Well, we knew we wanted it to move forward. Since our game is of the “endless runner” type, we weren’t concerned with being able to move backwards. The simplest way to achieve this was to attach a Rigidbody component to the avatar. Then, a script is created and attached to the avatar.

Unitystuff

Now some code is needed within the script to actually get the cube moving.

code1

This basic script starts out with declaring a public variable called speed. It is public so that it may be altered within the editor at run time. That way, if the avatar is moving too quickly or too slowly, this float value can be edited without having to revisit the script itself.

The rest of the code resides in Unity’s native “FixedUpdate()” function. The code within this function runs every fixed framerate frame, and is used in favor of the “Update()” function when dealing with Rigidbodies. Most objects in a game will need to be continuously refreshed and updated during the game; thus, most will find their way into one of these two functions.

First up is an if statement saying, “if either the W key or E key is pressed, do something”. That something is applying force in the forward direction (Z axis) of a Vector3. This force is applied at a rate of our speed variable over the course of Time.deltaTime (the change in time, in seconds). The logic behind assigning two keys was an attempt to represent a sort of running movement with two fingers as best we could on a keyboard. We later added booleans that would enable one of the keys after the other was pressed, and disable a key once it was pressed (i.e. E pressed = E disabled, W enabled).

Another if statement states that if the space key is pressed down, a separate function called “Jump()” should be executed. Jump() applies a force in the Vector3.up (Y axis) direction using another public float called jumpSpeed.
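
For illustration, here is an approximation of that first prototype script. The original is archived, so this is a reconstruction: speed and jumpSpeed are the names from the post, and the key-alternation booleans we added later are folded in.

```csharp
using UnityEngine;

public class CubeMover : MonoBehaviour
{
    public float speed = 10f;      // tunable in the editor at run time
    public float jumpSpeed = 5f;
    private Rigidbody rb;
    private bool wEnabled = true;  // alternate W and E to mimic running

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Push the cube forward when the currently enabled key is
        // pressed, then swap keys (W pressed = W disabled, E enabled).
        if ((wEnabled && Input.GetKeyDown(KeyCode.W)) ||
            (!wEnabled && Input.GetKeyDown(KeyCode.E)))
        {
            rb.AddForce(Vector3.forward * speed * Time.deltaTime);
            wEnabled = !wEnabled;
        }

        if (Input.GetKeyDown(KeyCode.Space))
        {
            Jump();
        }
    }

    void Jump()
    {
        rb.AddForce(Vector3.up * jumpSpeed);
    }
}
```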

iTween and the PlayerController

That is about as far as we got with using a Rigidbody, because we then came across an asset called iTween. The examples provided with this asset showed a way to move the player avatar without the help of a Rigidbody. We found this favorable, as it made it easier to customize further additions to the code. Rigidbody components come with a host of preexisting characteristics (gravity, drag, etc.) that can interfere with desired behavior in the long run.

The key feature we were interested in from iTween was its ability to attach a game object to a path. Once this is done, the object will move along the path while rotating to follow it. This simulates the avatar facing “forward” as it meets turns in the path; the player themselves will not control any rotation. I would show you exactly how this is done, but while iTween is free to download, the associated examples are not. As I do not want to infringe on anyone’s intellectual property, I will refrain from displaying and going through the code (plus, it is kind of long).

If you want to take a closer look at iTween, please feel free to check out the webpage. It has a lot of functionality, and comes in handy for a host of different applications.

http://www.itween.pixelplacement.com/index.php

Acceleration & Deceleration

We had the basics down: the player avatar moves forward along a path, but not very well. When running in the real world, an individual gradually gains speed and eventually reaches a top speed. Conversely, their speed gradually decreases to a stop if they ever stop running. To make the game more lifelike, we wanted to mimic this behavior.

Believe it or not, the bit of code below does just that.

code2

Whoa! What’s going on here!?

For this particular feature we are concerned primarily with the MaxSpeed, Acceleration, and Deceleration public float variables. Speed is also important, but I’ve already explained what that guy is all about. These variables are all assigned an initial value, but are kept public so that they can be changed in the editor if needed.

The function labeled DetectKeys(), which is ultimately called in Update(), sets the rules for our acceleration and deceleration. The highlighted region says that (deep breath) if the W key is pressed AND Speed is greater than negative MaxSpeed (essentially a safeguard) AND Speed is less than MaxSpeed AND the boolean moveForwardW is true, accelerate the avatar at a rate of Speed plus Acceleration over the span of Time.deltaTime. If these conditions are not met, decelerate the avatar at a rate of Speed plus Deceleration over the course of Time.deltaTime until Speed is equal to 0. Got all that? The same logic is repeated for the E key.
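
Rendered as code, the highlighted region looks roughly like this. It is a hypothetical reconstruction, and only the W-key half is shown. Note that the post phrases deceleration as “Speed plus Deceleration”, which implies a negative Deceleration value; I subtract a positive one here for clarity.

```csharp
using UnityEngine;

public class DetectKeysSketch : MonoBehaviour
{
    public float Speed = 0f;
    public float MaxSpeed = 10f;
    public float Acceleration = 2f;
    public float Deceleration = 2f;
    public bool moveForwardW = true;

    void Update()
    {
        DetectKeys();
    }

    void DetectKeys()
    {
        if (Input.GetKey(KeyCode.W)
            && Speed > -MaxSpeed   // essentially a safeguard
            && Speed < MaxSpeed
            && moveForwardW)
        {
            Speed += Acceleration * Time.deltaTime;  // accelerate
        }
        else
        {
            // Conditions not met: bleed Speed back down to zero.
            Speed = Mathf.Max(0f, Speed - Deceleration * Time.deltaTime);
        }
        // ...the same logic is repeated for the E key.
    }
}
```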

The logic works, but not efficiently. Key presses had to be timed perfectly in order for the momentum to carry over to the next key press. If timed improperly, the avatar would decelerate to a stop, and the process would need to be repeated until another mistake in timing was made. This was clearly not the behavior we wanted; you would need to be a veteran of the game to perceive these controls as intuitive. It was at this point that we elected to abandon prototyping the game with a keyboard, and made the move to coding for mobile touch input.

Up Next…

We are in the midst of transcribing our keyboard code over to touch. As of right now, Unity recognizes how many touch inputs there are at a time, and we are capable of moving the avatar forward by swiping in a downward direction.

Over the course of the next week, we will be working on getting the avatar to jump when two fingers are tapped against the screen simultaneously. I promise to give a timely update from now on!

Thank you for taking the time to read my rantings, and for your continued interest in our little project.

Respectfully,

Andrew Medhurst

The Beginning of a New Chapter: Becoming An Indie Game Designer

Okay! I have officially graduated from Full Sail University! Now what? Well, seeing as how most of the job boards are looking for game designers with at least 2 years of experience, and seeing as I am still a novice at the whole networking game, I decided to become an “Indie Developer”. Woo-hoo!

So, what does that mean for me, exactly? Well, I can’t do everything on my own, so I asked a very good friend of mine if he’d like to go on this journey with me. Thankfully, he said yes, because I am no good at art and he is awesome at it. This makes him our sanctioned artist.

Like everything in life, there are pros and cons to working in a small (super small in this case) team.

Pros

  • We have complete creative freedom over what we want to do and create, what tools we want to use, and the pace that we take.
  • Not much gets lost in translation, and there isn’t a heavy emphasis on documentation/record keeping, as there are only two parties involved.

Cons

  • We aren’t getting paid for this.
  • Self-motivation sometimes really sucks.
  • I have experience with UDK. However, we elected to use Unity3D, which involves a learning process.
  • A host of other organizational software and tools are needed. Finding the right one is half the battle.

Nut-shell Concept

The concept of our game was inspired by my buddy’s wife. She recommended creating a simple app where the user swipes their index and middle fingers on the screen of a mobile device to run on a treadmill. We took that and thought it would make an interesting platform/obstacle type of game. Now, granted, there are a host of “endless running” games out there on the iTunes and Google Play app stores, but I never came across one with the mechanic we were looking to adopt. Hence, “Running Game” (working title) was born.

Our Tools (so far)

Considering I had such a rough experience with UDK while I was a student, I wanted to work with Unity3D, which is highly regarded in the small-time indie realm of game development. We spent the first couple of months running through the video tutorials on the Unity website. During this time, I fell in love with the C# programming language, which meant one more thing needed to be learned. If you are interested in Unity and never got around to testing it out, I highly recommend it. You can find the tutorials here: http://unity3d.com/learn/tutorials/modules.

Since my partner and I will be working on this project at different times and in different locations, we knew we needed some kind of version control. Git caught our interest, mostly because it is free, and because many others within the community use it. We utilize http://bitbucket.org/ to host our project’s repository. And finally, considering a GUI is always preferable to typing in a command prompt, we use SourceTree (https://www.atlassian.com/software/sourcetree/overview) to push and pull our data to and from said repository. We are sorting our way through documentation, tutorials, and other known best practices for the mentioned tools, as Git can be pretty particular about what can be pushed and when. Do it wrong, and it throws a hissy fit.

Coming Up…

We finally broke ground on our first prototype this week! So far, we have a cube that moves… Hey, it is better than nothing, okay! I hope to update this site with our progress at the end of every week. I will be posting problems and solutions for everything I come across during this project, so stay tuned!

Final Project video walkthrough

I did it! I finally graduated from Full Sail University’s Game Design program. The last four months of the degree revolved around a team of 4 other students and myself creating our very own game from concept to realization. I learned a great deal about UnrealScript, communication, cooperation, utilizing Adobe Flash to create menus, and so much more. It really was an amazing and trying experience.

I decided to condense the game into two video walkthroughs, simply because there is so much going on behind the scenes. It would take ages to go over everything! If you have questions regarding a specific feature, please let me know, and I will do my best to shed some light on the topic of interest. Enjoy!

Part 1

Part 2

Fun With UDK: Another Project

Like most other projects, this one also spanned three weeks. The team comprised five individuals, including myself, but this time the game had to feature a minimum of three separate levels or locations.

Thankfully, I was able to locate all of the critical assets for this project, so all textures and meshes should be visible.

Picture1

The School Maze, where the majority of the game takes place.

The original concept consisted of a nerdy schoolboy, named Little Russ, trying to get through the day without getting pulverized by a bully. This bully went by the name “Big Boy Bruce”. Primarily, the game consists of the player engaging in dialogue with Big Boy Bruce, and then selecting a response to the situation with the numerical keys. The chosen response will affect certain stats, such as health and courage.

Response options listed on screen for dialogue scenarios. Breaks cardinal rule of obstructing the player’s vision of the main action area, but the team couldn’t figure out how to reposition the text.

When the player entered a dynamic trigger volume (used to assign checkpoints in the previous project), all movement would be disabled and an AI representing Bruce would approach. We had to use default skeletal meshes, as no one on the team knew how to create custom meshes in other software.

Picture5

Notice how selecting “2” has changed the stats. I don’t know why we included an “out of 10” display on the HUD. A singular number that changes would have been sufficient.

After a response was selected, the related stats would update, and the AI would move out of frame and be destroyed. At this point, movement capability was restored to the player so that they could progress.

Kismet was utilized heavily to govern what executed and when. Kismet is a visual scripting language native to UDK. It really helps people like me who don’t grasp traditional coding very well. Each dialogue sequence needs a new set of AI to represent Bruce, with subsequent guidance on how the dialogue pans out. Therefore, there are 8 sequences similar to the one in the image below.

I’m going to cover Kismet quickly with images, as the previous post covered visual scripting in great detail. Once again, this was my domain of the project, and tested my abilities greatly.

Picture6

One Kismet sequence for governing the spawning and movement of Bruce. Tells AI when to approach player and begin dialogue, when to allow the player to select a response, and what to do after a selection is made.

The sequence begins with spawning an enemy actor. This sequence spawns one once the level is loaded, but all others spawn once a specific trigger is touched.

Picture8

When the player actor touches a certain trigger volume, the “Set Physics” node disables the player’s movement capability, tells the enemy AI to move to a specific location, and then commences dialogue on-screen.

Picture9

When touched, this trigger enables the options for selecting a response to the dialogue.

The spiderweb mess above consists of toggle nodes that enable specifically delegated keys. When one of these keys is pressed, all of the options are toggled back off so that they don’t function again. An announcement is made to confirm the selection. The announcement node is then linked to custom-made nodes that alter the player’s stats on the HUD. These are not native to UDK, and had to be made in Microsoft Visual Studio utilizing UnrealScript. This was not done by me, but by another member who is more savvy at scripting than I am. The enemy actor is then set to move to a set destination. After reaching the destination, the actor is destroyed and movement capability is restored to the player.

Operations of the Dodgeball Mini-Game

So, affecting the outcome of a game through dialogue choices is fun and all, but it doesn’t make for a very fun game all on its own. Therefore, we came up with a few mini-games to offer up some variety. One of these is “Dodgeball”.

Picture2

Dodgeball Mini-game Room. Positioned so that you can see the camera angle and Interpolating Actors that will be animated for the player to dodge.

To create the balls, we had to use the spherical brush, add the volume to the level, and then convert it to an Interpolating Actor. Those apple-looking icons on the ground act as relays to designate where the player will appear when teleported to the room. There is a series of Dynamic Trigger Volume boxes surrounding the red outline. These are used to tell the game when the player has moved out of bounds, at which point the game should stop.

Picture10

When a specified trigger is touched, the player is teleported to the dodgeball room. The camera is reassigned to the one in the previous image, a brief announcement is displayed, and the dodgeballs begin their animation sequence. The red lines mark the boundaries, and the game ends if the player goes past them.

If a player is struck by a ball, 1 unit of health is lost. After all animations have completed, the player is instructed to press the spacebar to return to the main area of the game (school maze), and the game continues from where it was interrupted.

Picture11

All Kismet used to execute Dodgeball

The Kismet sequence for guiding this mini-game is as follows:

Picture12

A touched trigger teleports the player to the dodgeball room. Announcements play and a series of matinees execute. Afterwards, a trigger is teleported into the room.

Picture13

The teleported trigger allows the player to teleport back to the school when the spacebar is pressed.

Picture14

If the player touches one of the cyan Dynamic Trigger Volumes, player movement is disabled, an announcement plays, and the trigger that will send the player back to the school is teleported into the room for use.

Picture15

If the player is touched by one of the InterpActors (balls), remove 1 unit of health from the HUD.

The Food Fight Mini-Game

The food fight utilized the same ball mesh from the dodgeball mini-game. As stated previously, no one had the skills to create a “food” mesh, so we worked with what we had.

When the player teleports to the cafeteria, the camera shifts to a 2-dimensional side-scrolling perspective. Rotation keys are disabled and collision volumes are placed so that the player cannot deviate from a linear path. The player must make their way from one side of the room to the other. If they are struck by the food, 1 unit of health is deducted from the HUD.

Picture3

Food Fight Mini-game Room

Picture16

Starting location of the player when teleported to the cafeteria. Teleportation initiates when a trigger is touched by the player.

Picture17

The game in action! I had already hit some “food” at this point, hence why my health is low.

When the player reaches the end, there is a trigger that will teleport the player back to the school. I decided not to show the Kismet for this one, because it is really unorganized and I don’t think it is properly linked together. Plus, I’ll have more visual scripting to show off soon.

Like right now!

End-Game Conditions

So, what’s the deal with the Courage and Health meters on the HUD? When Health runs out (equals zero), the game ends and shuts down. If the Courage meter is above 5 when the player reaches the exit of the school maze, they have successfully made it through the school day and win the game. If it is below 5, Little Russ gets beat up by the bully and the game is lost.

Picture18

Kismet sequence to visualize the process of fulfilling and executing end-game conditions.

What could have been done better

  • We neglected to implement a tutorial for moving, selecting dialogue responses, and playing the mini-games.
  • If I had more time, I would have switched dodgeball around so that the player is the one trying to hit AI targets, rather than the one dodging the balls.
  • Custom meshes to represent food, and better balancing of the projectiles in the food fight. Some of them come in too quickly and feel unavoidable.
  • More time playtesting the game. Not everything works as intended, and this hurt us in the final deliverable.

Fun with UDK: Visual Scripting with Kismet

So, I’ve showcased what a designer can do with some flash cards, a gridded mat, and a whole bunch of documentation. Now it is time to show off a digital level that was designed and developed using the Unreal Development Kit (UDK). It is free to download here, in case you want to tinker with it a bit: http://www.unrealengine.com/udk/downloads/

This project, like the other, was conducted with other team members. I believe we had 4 weeks to create the game, but it may have been only 3. A bit of time has elapsed since then, and my memory is somewhat foggy regarding the details. I elected not to include the documentation for this project, as all of what is typically required was covered in the post for the board game “Elixir”. This should cut down on the length for you readers. Don’t say I never did anything for ya! 😉

UDKPost1

1st half of level

Now, you may be wondering to yourself, “Are those blue and gray checkers supposed to be a part of the level?” No, they are not. Sadly, the computer that houses some of the textures and other assets for the level is currently in the hospital. I will update this post with additional details about the level when I get a hold of those assets, but before I move on to the visual scripting bit, I will briefly describe how the level should look if everything were there.

When starting the game, the player would begin in the upper left-hand side of the image above, in the area shrouded behind all of the trees. That large patch of checkers in the center is actually supposed to have a running river going through it. Rocks were placed in the river so that the player could jump across it safely. If the player fell into the water, they would take damage to their health bar over time. The team’s texture guy did a fantastic job of making the running water and the mist coming off of it. The numerous grassy platforms should contain various rocks and other foliage. These rocks would also be used as platforms, and as cover to hide from malicious non-playable characters (NPCs).

UDKPost2

2nd half of level

To reach the second section of the level, the player would cross the bridge that connects the two halves. Once again, this is intended to be decorated with various rocks and plant life. The player must navigate through a catacomb of more rocks, defeating enemies on the way.

The large pagoda structure looked fantastic with all of its textures and lights. This is where the final boss was located. We tried our best to make it visible throughout the course of the level, not only to convey the structure’s significance, but to maintain the player’s orientation while playing. If there is ever any confusion as to where they need to go, they need only look for the large pagoda.

UDKPost3

Cave Entrance: Originally had a locked door that could only be opened by a key dropped by the “final boss”. Breaching the cave means the level is complete.

Once the player defeats the boss, a key static mesh appears where the boss fell. The player need only walk over the key to “pick it up”. Having this item is required to open the cave door (which would have been there, fully animated, in the final version). Entering this cave and acquiring the final item signifies the end of the level.

Kismet Visual Scripting

Kismet1

All Kismet sequences governing events in the level

Kismet was my baby on this project, and I loved doing it. This is primarily because I am terrible at traditional scripting with code, and because, despite being a novice, I picked up visual scripting very quickly. This Kismet is very unorganized; most of the time, each type of sequence would be categorized (AI, respawning/checkpoints, objective items, weapons, etc.).

Kismet2

The blue named rectangles are examples of categorizing the Kismet. The front page should be covered with these, rather than having sequences running around here and there. At least, that is my practice these days.

Kismet3

Double-clicking on one of these categories opens up all the Kismet sequences housed within.

But, considering that this was my first major project with Kismet, you’ll have to forgive my sloppiness.

Let’s start by exploring the first thing I learned how to make work! We used and abused triggers, because they are one of the easiest and most accurate ways to trigger all sorts of events.

Kismet5

Ya know, these things.

When the player enters the radius, which we can set manually, an event will occur. The image below, in everyday English, says that when the player touches a certain radius, a series of announcements will appear on the screen for the player to read. Utilizing this method, we included story narrative, instructions, and taunts from the enemy. It’s primitive, but it works.

Kismet4

Triggers and the associated announcements. A delay of two seconds means the player will have enough time to read the message before the next one appears.

Triggers can also be used to activate the animation of Interpolating Actors. These actors look a lot like static meshes, but are converted so that they can be animated. The “Matinee” node in Kismet is used to record this movement, which then plays back in the game. When the player leaves the trigger volume, the animation is set to reverse, so that the doors close.

By default, a trigger is set to activate only once. If we had left this setting as is, and the player accidentally left the trigger volume, the doors would close and never open again, effectively locking the player out of the pagoda. Thus, in the settings of the trigger node, “Max trigger count” needed to be set to 0, which makes the trigger activate infinitely.

Kismet6

Triggers to govern the animation of the doors when the player touched the trigger

When the Matinee node is double-clicked, UnrealMatinee opens in a new window. How long the animation lasts can be altered by clicking and dragging the triangles at the bottom of the timeline. Keys, the triangles within the timeline, serve as breakpoints. In this instance, the doors remain closed at the first key; by the time playback reaches the second key, the doors are fully open. Animation is as simple as that!

Kismet7

UnrealMatinee governing the animation of one pagoda door

Setting Player Checkpoints

Next up is my pride and joy: setting up checkpoints that the player would respawn at in the event that the player avatar died.

kismet11

The process starts with placing a few of these Dynamic Trigger Volumes (the big cyan colored rectangle) in the level.

A touch node for each of the trigger volumes is linked to a toggle node that activates a nearby spawn location. That toggle node is then linked to another toggle node that turns the other spawn beacons off, so as to not respawn the player in an undesirable location. Each time the player enters a new Dynamic Trigger Volume, a new spawn location is activated, while the others are turned off once again. Tah-dah! Checkpoints done.

Note: The “Player Spawned” sequence doesn’t govern respawning. It is there solely to check if the player avatar has the intended weapon at the time it is spawned. If it does not, it assigns a weapon through the “Give Inventory” node. The default spawn location (the first one placed in the map) is automatically set to “on”. As such, the player will spawn there if they die until that spawn region is toggled off through other means.

Kismet12

All Dynamic Trigger Volumes for the level, linked to a toggle node turning on the proper spawn location while turning the others off.
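
If it helps to see the idea outside of a node graph, the toggle chain boils down to something like this in plain C# (illustrative only; the project itself uses Kismet nodes, not code):

```csharp
// Entering trigger volume i turns spawn point i on and all others off.
public class CheckpointManager
{
    private readonly bool[] spawnActive;

    public CheckpointManager(int spawnCount)
    {
        spawnActive = new bool[spawnCount];
        spawnActive[0] = true;  // the default spawn starts "on"
    }

    // Called when the player touches Dynamic Trigger Volume 'index'.
    public void OnPlayerTouchedVolume(int index)
    {
        for (int i = 0; i < spawnActive.Length; i++)
        {
            spawnActive[i] = (i == index);  // one on, the rest off
        }
    }

    // Where the player respawns on death.
    public int ActiveSpawn()
    {
        for (int i = 0; i < spawnActive.Length; i++)
        {
            if (spawnActive[i]) return i;
        }
        return 0;  // fall back to the default spawn
    }
}
```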

Artificial Intelligence

I still hold a grudge against scripting the artificial intelligence for this level. It kept me up many a night trying to iron out pathing and getting the AI to properly engage the player in combat. In the end, the scripting looked something like the hot mess depicted in the image below.

Kismet13

And that isn’t even all of it! Some of the sequences couldn’t fit on the screen.

Obviously, I cannot walk through each and every instance of spawning, moving, and telling the AI when to engage the player. To do so would take ages, and would likely bore you, the reader. With that said, I will take one instance and walk through it; most others follow the same school of thought anyhow.

Kismet15

One AI sequence for spawning, moving and governing who to shoot at and when. The gray colored comment boxes keep the order in which AI is spawned organized.

When the player touches a particular trigger, each actor factory attached to that trigger spawns 1 AI skeletal mesh at a predetermined path node (it doesn’t have to be a path node, but that is what I used). After that, the “finished” output links to two separate nodes: Get Distance and Move to Actor.

Kismet16

Portion governing whether the player avatar is within a specified distance. If so, look at the player avatar and start firing; if not, stop firing.

First, the Get Distance node compares two actors to one another; in this case, the AI actor and the player. The Compare Float node then checks that distance against a range of 0 to 1000 units. If the player is within 1000 units of the AI, the AI will begin tracing (looking at) the player. If the player is visible, the AI starts firing; if not, it stops firing. Then, a condition is checked to see if the AI is still alive after the loop. If so, the whole process repeats.
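
For anyone more comfortable reading code than node graphs, that loop boils down to something like this (illustrative C# only; IEnemy and its members are hypothetical stand-ins for the Kismet nodes):

```csharp
// Stand-in for the spawned AI actor.
public interface IEnemy
{
    bool IsAlive { get; }
    float DistanceTo(object target);  // the "Get Distance" node
    bool CanSee(object target);       // the trace step
    void StartFiring(object target);
    void StopFiring();
}

public static class EnemyBrain
{
    // One pass of the combat loop; Kismet re-runs this while the AI lives.
    public static void Tick(IEnemy enemy, object player)
    {
        if (!enemy.IsAlive) return;

        // "Compare Float": is the player within 0 to 1000 units?
        if (enemy.DistanceTo(player) <= 1000f && enemy.CanSee(player))
        {
            enemy.StartFiring(player);
        }
        else
        {
            enemy.StopFiring();
        }
    }
}
```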

Kismet17

Move to Actor sequence. Tells AI actor to move to one path node selected at random, delay, then move to another.

By utilizing the “Move to Actor” node, one actor can be instructed to move to another, then another, and so on. The “target” (the actor we want to move) is assigned to the AI, while the destination includes all of the path nodes we want to allow the AI to move to. Getting away with one “Move to Actor” and one “Delay” is doable; I repeated the process twice to ensure it didn’t bug out.

A delay is necessary, not only to ensure that the process doesn’t confuse itself with too many commands at once, but to make the AI movement look more natural. It looks pretty odd to the player if the enemy is just running back and forth from one point to the next all the time.

Final Boss & The Key

Much of the boss’s Kismet sequence looks similar to that of the other AI enemies. However, some additions were made, on account of a key needing to appear to fall from the enemy when it was defeated.

BossKismet

Sequence for spawning, moving, and telling the boss when to attack. Teleports and attaches the key mesh to the boss, and keeps it hidden from the player.

I’ve focused in on the key portions of difference. They consist of binding the key to the boss and repeatedly checking for the fulfillment of the Remote Event “Death”.

BossKismet1

Sequence to move the key and bind it to the boss

BossKismet2

Remote Event that checks if the boss has died.

When the boss dies, the remote event “Death” initiates and executes the sequence depicted below. The Key becomes visible, an announcement is displayed, and the key’s matinee animation plays back.

Kismet10

Remote event “Death” sequence

After the player walks over the key, a Boolean sets the variable “Key” to true. This is critical for getting the cave door to function.

Kismet8

Trigger for picking up the key. Turns “Key” variable to true.

When the player approaches the cave door trigger with “Key” set to true, an announcement plays and the matinee for opening the door plays. If false, a separate announcement plays and the door remains closed.

Kismet9

If the key variable is true when the player touches the trigger in front of the cave, the cave doors will open. If false, the player receives a message that the door is locked.

And that is the functioning of the game in a nutshell! Once again, I’m deeply sorry that I didn’t have pictures to show it functioning in action, but if you’d like to try the final product for yourself, click on the following link to access the build: https://www.dropbox.com/s/a8j49p40imkh66i/TeamFire_Wk4_Gold_0629.udk

Fun Facts

Originally, we intended the combat to be melee based, rather than using guns. Getting a collision volume to bind to the melee weapon and work against enemies was incredibly difficult, and binding one to enemies was out of the question. Though it broke the theme of the game, we had to use guns in the end, as having combat to some degree was essential to the game.

The pagoda was pieced together with volume brushes completely within UDK. No external software was utilized.

Getting the AI to stop moving and start firing when they saw the player was extremely difficult to sort out; the process took over a week to iron out.

Creating “Elixir”: Designing & Developing A Board Game

When deciding which project to discuss first, I determined that going chronologically would be best. With that said, I am starting with an analog board game project that was developed alongside four other aspiring designers. I ask that you humor me, as this is my first post as a “blogger”. As I am a visual learner, I love pictures and use them gratuitously, but I will attempt to be brief and to the point.

The game project as a whole spanned four weeks. Three of those weeks were used for the conceptualization and creation of a complete game, to include mechanics, a board, player pieces, and any other elements the game might require. The final week was used to playtest another team’s concoction and offer them feedback.

Week 1: The Team Charter

Once team members were introduced to each other via email, Skype, or other methods of communication, we felt it important to iron out the details of how our team would function. The vessel for this task: the Team Charter.

Charter

The image above is the first major chunk of the charter. Mission and Values primarily exist as “feel good” fillers for potential investors; though we aren’t trying to get any funding for the project, it is important to practice professional writing skills. Strong communication guidelines are much more crucial for every team member to establish and understand, as communicating ideas and being present to vote on the developing creative direction of the game is the lifeline of such projects.

Contingency procedures for when a member cannot make it are also important, not simply because it is polite to tell fellow team members that you can’t make it, but so the team can give you notes on what developed during your absence. Oh yeah, and I wrote that portion. 😀

BioCharter

In the spirit of effective communication, all team members were required to deliver an exhaustive list of usernames, email addresses, and phone numbers where the individual could be reached, should the need arise. A biography of each individual was also published, but wasn’t really significant for the functioning of the team. For the sake of the other members’ privacy, I am only showing what I produced, as shown above.

CharterRoles

As anyone can attest, regardless of career path, establishing every team member’s “lane in the road” is essential for a fluid team experience. Fluidity is achieved by assigning each member a job title with specific tasks, in an effort to reduce duplication of effort and mitigate potential conflicts based on a “not my job” argument.

Thus, I give you the “meat and potatoes” of the team charter: Team Roles & Responsibilities. The title of each role is followed by a description of its responsibilities. The table below that lists the primary (P) individual, along with any supporting or alternate (A) individuals. Everyone needed to be instrumental in deciding the core mechanics of the game, as well as playtesting the prototype. This is why either a “P” or an “A” is listed in every cell of those rows.

CharterConduct

The final three sections of the charter mainly govern how team members will treat one another (logically summed up as “with kindness and respect”), and how design decisions will be made by the group. If correction needed to be made for a team member who wasn’t pulling their weight, this document would be referenced to remedy the situation.

Week 2: Concept & Documentation

After the first few group meetings, it seemed like everyone was on board with a post-apocalyptic theme for our game. That decision was probably the only easy one that we made, because next came: THE BOARD!

JeremyOwensQuickBoardMock-Up

First Mock-up of Elixir Game Board

If you find yourself asking, “What the heck is that?”, don’t worry; we were all pretty much in the same predicament. This was our first draft (more like something to get started with) of our game’s play space (the board). In summary, we knew that all player pieces should start in the same location, ya know, for fairness’ sake. This location is indicated by the long line extending from the right. There would then be multiple paths for the player to take, and the rings indicated the zones where different enemy types (goblins, wombats, evil clowns) would be encountered. These were the humble beginnings of our combat system, as we knew we wanted to integrate one somehow. In the center would be the “elixir”; retrieving this item was the main goal of the game. Whichever player reached the center first acquired the elixir. They then needed to bring it back to the original starting location (the long line to the right). Whoever did this won the game.

2nd Draft of Elixir Game Board

Our second draft made the board look more official. Enemy zones were named, we had an actual “Start”, and the elixir’s location was on the board. But, much like the board prior to this one, it was easy to spot which paths players would logically take, and which ones would never be touched. At a quick glance, one would infer that the dominant strategy would be simply to move along the top horizontal path, down the angled path, and into the upper elixir path, then reverse the order once the elixir was picked up to get back to the start and win.

This was the first time the team had to face a harsh reality. Even though we all loved the way the board looked, we knew that this is exactly what would happen when people played the game, and it isn’t what we wanted. If we had numerous branching paths, we wanted players to have a reason and a desire to explore them.

Picture2

3rd Draft of Elixir Game Board

I know the image says 2nd Draft. Ignore that, please. I regard all drawings and attempts at creating a game board as a draft.

Aside from that, the board looks much different now! Dominant strategies seem less obvious with this pathing. We actually tried to make the easier paths more congested with encounter spaces, so as to give these areas a good risk/reward factor. Individual blocks indicate one movement space and are color-coded for specific events, and the board looks official with a title and “Start” and “Key” spaces. The rules for each color code are listed above the board.

All team members were tasked with recreating the draft, grabbing a 6-sided die and testing out if the spaces and paths seemed balanced. This playtesting was informal, and was just to get a feel for the board.

More explanation will be given for the rules when I discuss the rules document later on, but for now, each color coded space means the following:

  • Red = Encounter: The player fights against non-player opposition when their movement for the turn ends on this space.
  • Green = Reward: Player draws a Reward card from the reward deck when landing on this space.
  • Blue = Equipment: Player draws an equipment card from the equipment deck when landing on this space.
  • White = Null: No event takes place when the player lands on a white space.

Picture3

Sample of Game Card Names and Effects

Now, I don’t want to bore you with an exhaustive inventory of all the cards the team had created by the end of week 2, so the image above is just a snapshot of what we had (183 cards in total by this point). I include it because it should be understood how critical thorough documentation is for keeping every member of the team on the same page. This is doubly important because the team was working remotely; not seeing the other members in person means that important aspects of the game can get lost in translation if there is no solid documentation keeping everything in order.

Picture4

Sample of Playable Characters List

All members of the team had played their fair share of role-playing games, and one of our favorite features was choosing how our avatar would look and feel. This level of personalization was so dear to us that we wanted to integrate it into our board game. The image above lists a few of the individual playable characters, their background story with regard to the story of the game, and any special traits they may have. By the end of the week, we had outlined 15 different selectable characters.

Week 3: Construction and Playtesting

Week 2 required continuous, thorough documentation. Was week 3 any different? Absolutely not! This week required carefully plotting out the rules and regulations for the game, then playtesting and tweaking those rules in an attempt to achieve the most balance possible.

Turn Flowchart

Turn Phase Flowchart: Describes all events and conditions that can occur during a single turn phase for players.

Encounter Flowchart

Encounter Phase Flowchart: Rules that govern combat in the game

Duel Flowchart

Duel Flowchart: Rules for governing Player vs. Player (PvP) combat phase

Flowcharts are an excellent method for conceptualizing rules. As I am a visual learner, they work better for me to understand what is going on. Notice we added a duel system. This serves two purposes: adding even more competition between players, and giving players a method to steal the elixir from one another.

These flowcharts were the foundation of our rules. Numerous attributes changed over the course of the week, but eventually a finalized “rule book” was published. It is much too large to cram into this post, but if you’d like to glance over it, click on the following link: https://www.dropbox.com/s/q1hq6gcp0lb89bg/Ludus_Prototype_0113.pdf

Picture5

The final game board in all its glory

All team members were required to physically create their own version of the board and playtest the game. Yes, this requires having actual friends and family, which is sometimes hard to come by with everyone’s busy schedules.

Numerous changes were made to various types of cards, but the most significant change was removing the individual stats of the different characters we had made. We leveled the playing field by giving everyone the same starting stats; having so many different kinds created huge imbalances between the characters. The “Key” area also replaced the elixir: the player now has to grab the key and return to the start in order to open a chest that houses the elixir. The same basic rules applied, just in a different order.

The GDD

The Game Design Document (GDD) is the game project’s holy book. It houses all changes to rules, game elements, procedures, and so on (basically everything I’ve shown you so far). I saved it for last because it is so massive, and undergoes so many changes throughout the span of a project, that inserting it into the post over and over again would take hours for both myself and the reader. If you’d like to take a look at the finalized version, feel free to do so here: https://www.dropbox.com/s/2kzcspycbj0uxuo/LUDUS_GDDRevised_0113.docx

Feedback

It is very difficult to remember what feedback we got from other teams, as we did not record it for safekeeping and it was a long time ago. From what I remember, though, the game ran very long for most players, and some of the rules were vague or unclear.

Lessons Learned

Documentation is everything, seriously. When working with a team that you do not see in person, with people who live in different time zones and have other tasks to attend to, having meeting minutes and thoroughly documented rules keeps everything running smoothly.

There is also a lot to be said about playtesting, and that is: do it all the time. It is never too soon to test your game. The team did moderately well for cramming it all into one week, but the game could have been much more balanced if we had started sooner and tested more vigorously.

Welcome!

Welcome to the home of Andrew Medhurst’s Game Design portfolio blog!

So, what can you as the viewer expect to see in this blog? Well, stuff involving games! Being a novice game designer, I will post pictures of, and elaborate on, some projects that I have worked on. These range from analog board games to digital games created in the Unreal Development Kit (UDK). All works were created and crafted within a collaborative team of like-minded designers.

In the future, I’d like to expand blog entries to include, but not be limited to:

  • My personal definition of some game design principles (i.e. game mechanics, the theory of “fun”, the MDA process, etc.)
  • Game ideas. Though these are a dime a dozen, I won’t post an idea unless I really want to hold onto it, and it has been thoroughly thought through with reinforcing documentation.
  • Deconstructions of games I’ve played. These consist of a pro/con comparison of the game, its mechanics, and its usage of the platform it is on.

If you’d like to take a glance over my past professional experience, please feel free to visit my LinkedIn profile at the following link:

http://www.linkedin.com/profile/view?id=179796521&trk=nav_responsive_tab_profile

If you’d like to view and download my professional history, feel free to access my resumé here:

Click to access AndrewMedhurst_Resume.pdf

You can get ahold of me via email at amedhurst@gmail.com, or send me a message on Twitter @amedhurst76.

That’s all for now. Once again, welcome! Feel free to browse around, and enjoy your stay.

Respectfully,

Andrew Medhurst