
Developer Journal : Prototype Gameplay

Developer Journal

I’ll be going over the gameplay of my prototype, and how each level in my Unreal game is designed to teach players its mechanics.

Good level design is crucial for the prototype, as it introduces the mechanics as straightforwardly as possible through player experience.

“Level design incorporates player capabilities, game mechanics, obstacles, and discoverable elements that create a positive user experience.”

MasterClass, 2021

When making my prototype, I decided the best way to incorporate my mechanics would be to let the player discover them at their own pace once first introduced.

Many of the early levels have a fairly linear path which makes them easy to navigate, so the mechanics laid out are more obvious. In the first level the player starts directly in front of a locked door in a single room, and must find a way to the key.

The red key is reached via a sliding block in a one-way gap that can only be pushed in one direction; the player is lured to the left side by the glowing key and discovers the dark area.

After getting the key, the player should head towards the red door to get the blue key which opens the locked gate at the entrance – the idea this should convey is that each coloured key opens its matching door (Figure 1).

Figure 1

After being taught the mechanics of the sliding blocks and keys, the next level further focuses on the sliding blocks and also introduces a new mechanic.

The player starts off in a one-way gap in front of a sliding block; as this is their only path, they have no choice but to push it. They then learn that blocks can also be pushed downwards – players may also be lured to the exit and make another bridge with this new knowledge in mind (Figure 2).

Figure 2

However, once they try to form the second bridge, the block pushed off the platform reveals a pressure plate that requires a block to be pushed onto it for the door at the exit to stay open.

This means the player has failed the level: as it is impossible to get the block back up, their only option is to self-detonate, as hinted under the level name (Figure 3).

Figure 3

On their retry, the player can instead form the bridge on the lower path of the level to access more blocks. There are enough blocks to form both bridges and still leave one to weigh down the pressure plate (Figure 4).

Figure 4

The next level combines most of the mechanics learned so far into a much larger level that further tests the player’s knowledge of what they’ve learned.

This level requires the player to collect keys, slide blocks and to make use of the pressure plates to unlock their assigned doors (Figure 5).

Figure 5

The fourth level takes place in complete darkness, and requires the player to make paths through it using the sliding blocks, much like the second level but much harder.

The main challenge here is to explore the level’s layout whilst making use of the player’s passive ability: the light (Figure 6).

Figure 6

The final level of the prototype introduces the first enemy, who starts off in a square room. This is a basic representation of the enemy’s path and direction, as they move straight forward and turn 90 degrees left once they hit a wall.

If the player rushes, they may come into contact with the enemy and initiate a chase sequence. Even if the player makes it to the door and unlocks it, there will still be a delay before it fully opens, so there’s a high chance of the enemy catching up to the player by then.

The level teaches the player to stay out of the enemy’s sight, and to be patient (Figure 7).

Figure 7

The game has also gone through playtesting, and much of the praise has gone towards how the levels are structured – despite a couple of bugs, as seen in Figure 8.

Figure 8

I’m glad to know that the levels I’ve designed were a success at teaching the game’s core mechanics in an engaging and fun manner.

To improve upon this, I’d polish the current levels to iron out some bugs and also add in some new ones, though due to time constraints I can’t add more mechanics or further explore the current ones as I would have liked to.

Next time, I may try to get some more playtesters, making it more likely that oversights such as bugs are uncovered.

Bibliography

MasterClass. (2021) How to Become a Video Game Level Designer. [online] Available at: https://www.masterclass.com/articles/how-to-become-a-video-game-level-designer [Accessed 19 May 2021].


Developer Journal : Unreal & Blender

Developer Journal

As I’ve mentioned creating 3D Models in some other blog posts, I’ll be going over my process in using Blender to create models and importing them over to Unreal.

For my project, I chose to use Blender as it is a free and open-source program, intended mostly to benefit independent artists and small teams.

“We build a free and open source complete 3D creation pipeline for artists and small teams, by publicly managed projects on blender.org.”

blender.org, 2021

Because of this, I consider learning and using Blender a good opportunity to show that we can go the extra mile for our indie dev project and add more detail to it.

For my game, I want a low-poly and pixelated style. To avoid any inconsistencies when I texture the model, I use the grid-snap function to make sure the model’s shape stays accurate to each square of the grid (See Figure 1).

Figure 1

Once the model is done, the next step is setting up a UV map for unwrapping and texturing it. A UV map is a flat representation of the model and is primarily used to wrap textures.

“A UV map is the flat representation of the surface of a 3D model used to easily wrap textures. The process of creating a UV map is called UV unwrapping.”

Thomas Denham, 2021

As you can see, the UV in Figure 2 is unwrapped into 2D shapes on the left representing each face of the model. These 2D shapes can be drawn on to form the textures for the model as shown on the right.

Figure 2

Once a model is done, I export it as an FBX and set the smoothing to “Face” – this is a type of smoothing Unreal supports when importing models and prevents errors from occurring (See Figure 3).

Figure 3

Finally, the model and texture both get imported into Unreal as assets. To get the model to display the texture, I simply open up the asset and assign it a new material with the texture applied (Figure 4).

Figure 4

To make the assets stand out much better during gameplay, I edit the materials to have an Emissive Colour attached, wired through a multiplier. They also have their Roughness set to 0 so as to give them a very glossy look to further capture the attention of the player (Figure 5).

Figure 5

In-game, this makes the objects stand out much better in the dark with a subtle glow. I’ve done this to help with the readability of important objects the player will be interacting with (See Figure 6).

Figure 6

I’ve also made plenty of models to add detail to my project; this goes for most of the items and environmental objects you can see.

Figure 7 shows a key I modelled to go with the doors; this is a collectable picked up by the player.

Figure 7

Figure 8 shows the sliding blocks; these are given bevelled edges to help distinguish them from the walls.

Figure 8

Figure 9 shows a cell wall; this is essentially just for detail and serves no gameplay purpose.

Figure 9

Figure 10 shows a character model I created for the player; however, due to time constraints I have not managed to import it into the game and animate it.

Figure 10

Overall, I feel as if Blender and Unreal are great tools to work with.

I think it is important to learn different skills with different programs to improve my versatility, as in indie development working individually or in a small team may require some extra legwork for things such as detail – but that is no problem if you have learned how to perform those skills independently.

I think in the future I could improve upon these skills by learning to animate within Blender and importing those animations into Unreal.

Bibliography

Blender Foundation. (2021) About — blender.org. [online] blender.org. Available at: https://www.blender.org/about/#:~:text=Blender%20is%20the%20free%20and,video%20editing%20and%20game%20creation. [Accessed 18 May 2021].

Denham, T. (2021) What is UV Mapping & Unwrapping?. [online] Concept Art Empire. Available at: https://conceptartempire.com/uv-mapping-unwrapping/#:~:text=A%20UV%20map%20is%20the,used%20in%20the%203D%20space. [Accessed 18 May 2021].


Developer Journal : Enemy AI

Developer Journal

As of the 10th of May, I’ve made some more progress on my prototype. This time I’ve made an enemy AI for the game, using Blackboards and Behaviour Trees.

Figure 1 shows the enemy AI in-game with a patrol behaviour; this state is initiated at the start of the level and is reactivated when the enemy loses sight of the player.

Using line traces, the AI is able to find any wall it can collide with and sets a position to move to. Once it reaches it, the AI will simply turn 90 degrees left and repeat.

Figure 1

Figure 2 shows another behaviour: the AI chasing the player. This initiates once the player enters the AI’s sight radius. Once they’ve reached the player, they’ll explode, resulting in a game over screen.

Figure 2

To start creating the AI, I needed to create a special blueprint known as an AI Controller. This works similarly to the Player Controller, but is more focused on receiving inputs from the environment and game world.

Figure 3

“The job of the AIController is to observe the world around it and make decisions and react accordingly without explicit input from a human player.”

unrealengine.com, 2021

This gives us the base for our AI. However, we can make the AI a lot more intelligent with the use of a Behaviour Tree.

By creating an Advanced Asset, we get our Behaviour Tree. With this, we can give our AI Controller some logic to follow.

Figure 4

“While the Behavior Tree asset is used to execute branches containing logic, to determine which branches should be executed, the Behavior Tree relies on another asset called a Blackboard which serves as the ‘brain’ for a Behavior Tree.”

unrealengine.com, 2021

Finally, we’ll need a Blackboard. This’ll contain certain variables we need to refer to in our Behaviour Tree.

Figure 5

“Anything we want our AI to know about will have a Blackboard Key that we can reference.”

unrealengine.com, 2021

Now that we have all we need, I go into the AI_Enemy character I created and set its AI Controller Class to the one I’ve just created (See Figure 6).

Figure 6

Inside the AI Controller, the behaviour tree is run once the game starts (See Figure 7).

It also has an event linked to AIPerception, which gives our enemy sight, cast to our player.

If the player is successfully sensed, a boolean is set in our Blackboard to trigger the chase sequence – I’ll explain this later.

Figure 7

Next up is the Behaviour Tree itself; as you can see, there are two different branches from the selector: Patrol and Chase.

If the boolean used to detect the player mentioned earlier was false, then the left part of the tree will play. If the boolean was true then the right part of the tree will activate instead.

Each branch leads to a sequence that plays certain tasks, starting from the left side – this is where the AI is given actions to perform in a specific, numbered order (See Figure 8).

Figure 8
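As a rough illustration of this branching outside of Blueprint, here’s a minimal Python sketch of the selector’s logic – the function names are placeholders of my own, not anything from Unreal:

```python
def tick_selector(see_player, patrol, chase):
    # Selector logic: run the Chase branch while the SeePlayer
    # Blackboard key is true, otherwise fall back to Patrol.
    return chase() if see_player else patrol()
```

Each branch would itself be a sequence of tasks, but the selector only decides which one runs.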

Inside both sequences, you can see that they both take the SeePlayer boolean from our Blackboard (See Figure 9).

Figure 9

Our first task in the left sequence has the enemy find a wall. This is done using a LineTraceByChannel, which starts at the enemy’s position and extends 1000 units ahead of them.

Once the line hits a wall, the vector of the hit location is used as the enemy’s next location to move to (See Figure 10).

Figure 10
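The wall-finding step can be sketched in Python as a simple grid march – a stand-in for the single swept trace Unreal actually performs, with made-up coordinates:

```python
def find_wall(start, direction, walls, max_dist=1000):
    # Step forward one unit at a time until something solid blocks
    # the path, mimicking LineTraceByChannel's first blocking hit.
    x, y = start
    dx, dy = direction
    for step in range(1, max_dist + 1):
        pos = (x + dx * step, y + dy * step)
        if pos in walls:
            return pos  # hit location becomes the next move target
    return None  # nothing within range
```

The returned hit position would then be stored as the TargetLocation for the next task.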

After the execution is finished, the next task in the sequence is played, which is a Move To. This is assigned to a vector variable from the Blackboard known as TargetLocation, where the line trace’s hit location is stored (See Figure 11).

Figure 11

After the enemy has moved to its location, the next and final task is to rotate 90 degrees to the left. Using a SetActorRotation, I am able to change the rotation of the enemy with a value.

The value takes the actor’s current rotation on the Z axis, divides it by 90, rounds it to the nearest integer, and multiplies it by 90 again – this snaps the rotation to a perfect increment so inaccurate rotations don’t accumulate.

90 is then simply subtracted from the enemy’s Z axis, turning them a perfect 90 degrees left once we apply the new rotation (See Figure 12).

Figure 12
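The snapping maths works out like this small Python sketch (the yaw values are hypothetical):

```python
def snap_turn_left(current_yaw):
    # Round the yaw to the nearest multiple of 90 so small drift
    # never accumulates, then subtract 90 for a quarter-turn left.
    snapped = round(current_yaw / 90) * 90
    return snapped - 90
```

For example, a slightly drifted yaw of 91.7 degrees snaps back to 90 before the turn is applied.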

Once this task is finished, the whole sequence is repeated from the first task again until the SeePlayer boolean activates.

In the next sequence, for when the SeePlayer boolean is activated, the chase is initiated. The first task of this sequence simply gets the player character’s location using our AI’s perception from earlier (See Figure 13).

Figure 13

Next up in the chase sequence is a task that makes the AI move to this location (See Figure 14).

Figure 14

This sequence will keep going until the enemy loses sight of our player, thus making the SeePlayer boolean false again.

As this was my first time trying to make AI within Unreal, I feel as if the task was a success and I was able to comprehend how to use the Blackboard and Behaviour Trees.

Although the AI is quite simple, it may still need some improving, such as returning to its original location after chasing the player – this may have to be left for any extra polish I can fit into the game.

Bibliography

Docs.unrealengine.com. (2021) AIController. [online] Available at: https://docs.unrealengine.com/en-US/InteractiveExperiences/Framework/Controller/AIController/index.html [Accessed 16 May 2021].

Docs.unrealengine.com. (2021) Behavior Tree Quick Start Guide. [online] Available at: https://docs.unrealengine.com/en-US/InteractiveExperiences/ArtificialIntelligence/BehaviorTrees/BehaviorTreeQuickStart/index.html [Accessed 16 May 2021].

Docs.unrealengine.com. (2021) Behavior Trees. [online] Available at: https://docs.unrealengine.com/en-US/InteractiveExperiences/ArtificialIntelligence/BehaviorTrees/index.html [Accessed 16 May 2021].


Developer Journal : Unreal Keys and Doors

Developer Journal

As of the 3rd of April, I’ve learned how to use Structs within Unreal to create a proper colour-coded key system with doors.

I’ll be using structs as a more accessible way to edit multiple types of data for an actor in my project.

“A struct is a collection of different types of data that are related and held together for easy access.”

unrealengine.com, 2021

This is important as the collectable keys have more than one variable – a single key contains an image, text and a colour.

In the game, I’ve currently got a key system where colour-coded keys open doors with matching colours; I’ve also made models for each of these actors, as seen in Figure 1.

Figure 1

Each key also has a unique material and light assigned to it from the struct.

Not only that, but I’ve also got a widget list that keeps track of all the current keys the player has collected throughout the level on the top right of the screen in Figure 2.

Figure 2

So how was I able to do this? I created a structure blueprint for the data I wanted to store within the keys and doors; this contained variables for a 2D Texture, a text variable and a linear colour (Figure 3).

Figure 3

For the keys and doors, a variable of this data structure is stored in both of them.

This is used to detect whether the key the player is using on the door matches the exact Key Data variable the door contains. Figure 4 shows one actor’s Key Data struct variable containing the values entered earlier.

Figure 4

The keys have to be collected by the player, so the player has an array of the Key Data type (Figure 5).

Figure 5

Back to the key actor, there is a component overlap that casts to the Player and calls a custom event for storing the key in the array. Here, we get the key variable from our current key that is being collected to insert into the player’s inventory (Figure 6).

Figure 6

Using a custom event, we can now use the New Key to add it to the player’s inventory array. Also included is a cast to our game mode blueprint – this will come in later once we get to the widgets (Figure 7).

Figure 7

Now for our door actor, we have a begin overlap event that casts to the player’s inventory and checks to see if it contains a matching key – this is a structure whose variables completely match our collected key’s.

If the player does contain a matching key, then the door will be opened (Figure 8).

Figure 8
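To illustrate the matching outside of Blueprint, here’s a rough Python sketch using a dataclass to stand in for the struct – the field values are made up for the example:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KeyData:
    # Mirrors the struct's three fields: an image, text and a colour.
    image: str
    text: str
    colour: str

def door_opens(inventory, door_key):
    # The door opens only if the inventory holds a key whose
    # fields all match the door's own KeyData exactly.
    return door_key in inventory
```

The dataclass equality check plays the role of the Blueprint struct comparison: every field must match.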

To also help differentiate keys and doors from each other, an event BeginPlay checks a key or door’s Key Data variable and changes its material based on whether the text matches a Literal Text.

For example, if the struct contains the text “Red Key”, then it exactly equals the Literal Text “Red Key”; if this comparison is true, the material of the key or door is set to red.

If it’s false, the check moves on to another section where it is compared to a different text; if the text matches none of them, the door is given a neutral colour (Figure 9).

Figure 9
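The chain of comparisons can be sketched in Python like this – the “Blue Key” branch is my own example of the other sections the check moves on to:

```python
def material_for(key_text):
    # Compare the struct's text against each Literal Text in turn,
    # falling through to a neutral material when nothing matches.
    if key_text == "Red Key":
        return "Red"
    elif key_text == "Blue Key":
        return "Blue"
    else:
        return "Neutral"
```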

Keys also set the colour of their light to the colour variable in the struct; Figure 10 shows what the doors and keys look like in-game.

Figure 10

As for the Widgets, I want the keys to display on the player’s interface so that they know what current key they’re holding – this can be useful to help the player keep track of all the keys they’ve stored.

I created an object blueprint that would store variables from Key Data to then be used later to display on the Widget (Figure 11).

Figure 11

I made a ListView widget where these collectables are stored, using a custom function called Add Key.

There you can see an input for New Data, which adds the received data to the list (Figure 12).

Figure 12

Before we start adding to our list, however, we need a separate widget which defines how each individual line displays in the list.

In Figure 13, I’ve bound the text, colour and image of an object to this particular line entry.

Figure 13

To make this compatible with the ListView widget, I implemented a UserObjectListEntry interface.

This retrieves the object supplied by the Add Item method earlier, which is then cast to KeyInfoData and stored as a variable (Figure 14).

Figure 14

Now all we do is set our list’s entry widget class to the widget we’ve just created, and it will display the entries on the list like so (Figure 15).

Figure 15

In our game mode blueprint, we create the list when the game starts; we also promote the return value of the widget into a variable to be referenced later (Figure 16).

Figure 16

Using a custom function within our game mode blueprint, we can add an entry widget to the list by taking the Key Data and using it as the function’s input.

Also included is a Key Array within the game mode that collected keys are added to (Figure 17).

Figure 17

As shown earlier, the cast to the game mode from picked-up keys enables them to be displayed in the list.

Since the entry widgets are bound to the data struct’s variables, they display the correct images and colours for each picked-up key (See Figure 18).

Figure 18

This was one of the more challenging tasks of working on my game, but I am glad I managed to learn and figure out how to get Data Structures working within Unreal.

This was important to the game as it makes things more readable by displaying the data of the player’s collectables on the user interface; although not completely necessary, I still feel it can go a long way toward making the game more accessible.

However, I think some parts could be improved – for example, changing an object’s material could use a simpler, more efficient function rather than a long chain of branches.

Bibliography

Docs.unrealengine.com. (2021) Struct Variables in Blueprints. [online] Available at: https://docs.unrealengine.com/en-US/ProgrammingAndScripting/Blueprints/UserGuide/Variables/Structs/index.html [Accessed 9 May 2021].


Developer Journal : Widgets and Transitions

Developer Journal

As of the 26th of April, I’ve made more progress on my game by adding elements to the User Interface known as Widgets as well as some Level Transitions.

For my game, I’ve considered adding multiple levels as well as a simple UI to help transition players to the next level by giving them a button to press once the current level has been beaten.

The video in Figure 1 below shows the results of my implementation of these features working within the game.

Figure 1

The widget used at the end of each level simply congratulates the player once they’ve beaten the current level and gives them the option to continue onto the next.

In the designer section of the widget, I kept the UI relatively minimal looking with a short message and a working button.

Figure 2

Not only does Unreal allow us to design a UI with the widget system, but we can also assign something to happen when the UI is interacted with. Using the button, we can make it so that a click from the player triggers a certain function within the game.

“The Widget Blueprint uses Blueprint Visual Scripting to design the layout as well as script functionality for UI elements such as what happens when a button is clicked or if a value changes.”

unrealengine.com, 2021

When the button is clicked on our widget, we want to activate a function that easily transitions the player to the next level or restarts the level if they’ve failed.

Figure 3 shows the blueprint for when the button on the UI is clicked; it uses a branch to determine a different result depending on whether the player is dead or not.

The game simply uses Names to tell it which level to open. If the player has failed, it gets the current level’s name and reopens it; if they’ve succeeded, it fetches a Name variable from the exit containing the next level’s name.

Figure 3
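The branch boils down to a small piece of logic, sketched here in Python with made-up level names:

```python
def level_to_open(current_level, exit_level, player_dead):
    # Reopen the current level on failure; otherwise open the level
    # named by the exit the player touched.
    return current_level if player_dead else exit_level
```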

The exits within the game each have an editable instance which allows me to set their value to the name of the level they transition the player to once touched. Figure 4 shows the exit passing its name value to the widget to use later on.

Figure 4

I feel I’ve gotten a good idea of how inserting widgets into Unreal works, as this task was a success; however, the current UI I’ve made for the project looks very rough and may need better visuals – this won’t be a top priority, though, as I need to get a playable build of the game out first.

Bibliography

Docs.unrealengine.com. (2021) Creating and Displaying UI. [online] Available at: https://docs.unrealengine.com/en-US/InteractiveExperiences/UMG/HowTo/CreatingWidgets/index.html [Accessed 4 May 2021].


Developer Journal : Box Traces

Developer Journal

As of the 19th of April, I’ve used this week to further refine my top down Unreal prototype.

Recently, I decided to create sliding blocks for my game. These blocks move in a given direction when the player collides with a certain face; the block then moves accurately along a grid using BoxTraceByChannel.

BoxTraceByChannel is a collision function that uses traces for detection; the trace can detect any objects blocking the way and thus provide a result.

“Sweeps a box along the given line and returns the first blocking hit encountered. This trace finds the objects that RESPONDS to the given TraceChannel”

unrealengine.com, 2021

As an example of how I used this: for the sliding block, traces check each square in front of the block before moving it. If there is nothing blocking the trace, the block is permitted to move – this is shown by the red colour seen in Figure 1.

Figure 1

If the traces detect any solid objects such as walls or items, the trace displays as green in front of the cube and the cube is not allowed to move, as seen in Figure 2.

Figure 2

Figure 3 shows an example of how it should work in game: the sliding block slides along the floor and stops if there is a wall in front of it. There is a trail of red traces alongside some green ones against a couple of walls – this is the collision at work.

Figure 3

To make this work, I gave the sliding block two collision boxes, shown in Figure 4. As the player has a small pointed bit on their front, they can activate one of these by facing towards the cube.

Figure 4

If the player overlaps one of these box collisions, it casts to the player to see if they are pushing the block, and then gets their X or Y location to see which side the player is colliding from – this is what makes the box move in different directions (Figure 5).

Figure 5

Once collided, a box trace spawns in the direction the block will slide, to check whether any collisions are in the way. This is done by spawning a trace 50 units ahead of the object and snapping it to a grid location with a custom function to make it more accurate (Figure 6).

Figure 6

If the traces don’t collide with anything, then a Move Component function is used to simply move the sliding block ahead. (Figure 7)

Figure 7
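On a grid, the check-then-move step amounts to something like this Python sketch (positions are hypothetical grid coordinates, not Unreal units):

```python
def try_push(block, direction, solids):
    # Trace the grid square ahead of the block; move only when clear,
    # otherwise the block stays where it is.
    target = (block[0] + direction[0], block[1] + direction[1])
    return target if target not in solids else block
```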

This was a fairly hard thing to do, but with some help and research I was able to get it working.

From this I managed to learn how to use traces for collisions, which can prove to be useful if you’re looking to make objects that move accurately without having to depend on physics.

Figure 8

I also briefly learnt how to use custom functions, such as the one that helps with object snapping shown in Figure 8; this calculates a grid size and offset to snap an object to.
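The snapping calculation can be sketched in Python – the grid size and offset values below are just examples, not the ones used in my project:

```python
def snap_to_grid(value, grid_size, offset=0.0):
    # Snap a world coordinate to the nearest grid line,
    # taking an optional grid offset into account.
    return round((value - offset) / grid_size) * grid_size + offset
```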

Bibliography

Docs.unrealengine.com. (2021) BoxTraceByChannel. [online] Available at: https://docs.unrealengine.com/en-US/BlueprintAPI/Collision/BoxTraceByChannel/index.html [Accessed 30 April 2021].


Developer Journal : Unreal Game

Developer Journal

As of the 12th of April, I used this week to make a start on my top down Unreal prototype.

So far, I’ve worked on actors and characters using the blueprint system, as well as trying out making my own textures and materials for the game. These are all in one level so far, which serves as the testing ground for any future features I might be able to add (See Figure 1).

Figure 1, Self-Collection

First off, I will describe how I managed to create a playable character. For this I needed to use a PlayerInput object to be able to set some keyboard inputs the player could use to control their character.

“The PlayerInput Object is responsible for converting input from the player into data that Actors (like PlayerControllers or Pawns) can understand and make use of.”

docs.unrealengine.com, 2021

In order to create a playable character, it was important for me to set some Input Mappings for the player. This would allow me to set certain mappings within the game with keys connected to them (See Figure 2).

Figure 2, Self-Collection

There are multiple mappings you can assign keys to: Action Mappings and Axis Mappings. As my game doesn’t have any necessary actions yet, I’ve just used Axis Mappings – note that I have also set some axis scales to negative values.

Action Mappings are discrete button or key presses bound to event-driven behaviour; the idea is that one would usually be assigned to a single action requiring a single press of a button.

“The end effect is that pressing (and/or releasing) a key, mouse button, or keypad button directly triggers some game behavior.”

docs.unrealengine.com, 2021

An example of how an Action Mapping might be used is for actions as simple as jumping, crouching or attacking – events in games that typically only require a single press.

Axis Mappings are bound to more continuous game behaviours; the advantage is that they allow for smooth movement as opposed to more discrete game events.

“The inputs mapped in AxisMappings are continuously polled, even if they are just reporting that their input value is currently zero.”

docs.unrealengine.com, 2021

Some good examples for Axis Mappings would be general movement keys, like “WASD” controls. As these mappings are continuous, it would be preferable to hold these keys down for movement instead of constantly needing to input them.

Using blueprints, I connected the mappings to add movement to the player. The world direction values are then scaled by the axis scales I’ve connected to them (See Figure 3).

Figure 3, Self-Collection

For instance, a regular W input will move the character forwards by X+1.0, but an S input will move them by X-1.0 as it has a negative scale.
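The mappings behave roughly like a lookup from key to axis and scale – sketched in Python below with hypothetical axis names rather than Unreal’s actual input API:

```python
# Hypothetical axis mappings: key -> (axis name, scale),
# mirroring the project settings rather than any real Unreal call.
AXIS_MAPPINGS = {
    "W": ("MoveForward", 1.0),
    "S": ("MoveForward", -1.0),
    "D": ("MoveRight", 1.0),
    "A": ("MoveRight", -1.0),
}

def axis_value(key):
    # A held key contributes its scale to the mapped axis each frame;
    # unmapped keys contribute nothing.
    return AXIS_MAPPINGS.get(key, (None, 0.0))
```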

I’ll now briefly mention how I managed to import some custom textures I’ve made myself; importing textures is as easy as dragging and dropping a file into a folder.

Figure 4, Self-Collection

As I’ll be using pixelated textures, I need to set their texture group to “2D Pixels” within the Level of Detail tab in order for them to display properly.

Materials work similarly to how actors and characters are made, as they also require blueprinting (See Figure 5).

Figure 5, Self-Collection

As I don’t want the material of this texture to be too shiny, I’ve set the Roughness input to 1 – this controls how rough or smooth a material’s surface is.

“A Roughness of 0 (smooth) results in a mirror reflection and roughness of 1 (rough) results in a diffuse (or matte) surface.”

docs.unrealengine.com, 2021

As my texture is pixelated, I wouldn’t want it to look too reflective or shiny. However, if I were using a different texture that was meant to be reflective and smooth I could increase the values if I wanted – this may depend on what I can add into the game.

Bibliography

Docs.unrealengine.com. (2021) Input. [online] Available at: https://docs.unrealengine.com/en-US/InteractiveExperiences/Input/index.html [Accessed 19 April 2021].

Docs.unrealengine.com. (2021) Material Inputs. [online] Available at: https://docs.unrealengine.com/en-US/RenderingAndGraphics/Materials/MaterialInputs/index.html [Accessed 19 April 2021].


Developer Journal : Variables and Functions

Developer Journal

As of Monday the 15th, I’ve been learning more about using the Unreal Engine.

During this week we covered how to use variables and functions in the blueprint system; both are crucial to cover, as they are mainly used to store important information from the game and to make certain actions possible.

Variables give certain objects or actors within the game world properties, using values such as floats, integers, booleans and even strings.

“Variables are properties that hold a value or reference an Object or Actor in the world.”

unrealengine.com, 2021

Figure 1 shows some examples of what variables can be.

Float and Integer variables may keep track of things such as time, as they hold numeric values, whilst Booleans can be assigned to simpler on/off states such as toggling a player’s light – Strings may be used to store text such as the player’s name.

Variable Nodes
Figure 1 (UnrealEngine.com)

Essentially, variables act as statistics that help record what is happening within your game. As we are currently using the blueprint system, these variables will mostly live within the node graphs we use to structure the game.
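As a rough analogy, the same variable types can be sketched in plain C++ – the struct and field names below are invented for illustration, not taken from Unreal:

```cpp
#include <string>

// Plain C++ analogues of the Blueprint variable types above.
struct PlayerState {
    float ElapsedTime = 0.0f;    // Float: tracks a continuous value like time
    int   Score       = 0;       // Integer: whole-number counter
    bool  bLightOn    = false;   // Boolean: simple on/off toggle
    std::string PlayerName = ""; // String: stores text such as a name
};
```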

This brings us to Functions, which are created and stored on a single blueprint and can then be called from a different graph.

“Functions are node graphs belonging to a particular Blueprint that can be executed, or called, from another graph within the Blueprint.”

unrealengine.com, 2021

An example of how a function may be used is shown in Figure 2, which presents a “Take Damage” function inside a character’s blueprint. Once this function is called, it takes away some of the player’s health.

Figure 2 (UnrealEngine.com)

The reason this may have been done is that there may be multiple ways to take damage within the game; instead of recreating the same structure for each one, the logic is assigned to a single function which can simply be called and still serve the same purpose.
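The reuse idea behind Figure 2 can be sketched in plain C++ (the names here are invented for illustration): one TakeDamage function holds the health logic, and every damage source calls it.

```cpp
// One place where the damage logic lives, reused by every damage source.
struct Player {
    float Health = 100.0f;

    // Health is clamped so it never drops below zero.
    void TakeDamage(float Amount) {
        Health -= Amount;
        if (Health < 0.0f) Health = 0.0f;
    }
};

// Different damage sources call the same function instead of
// duplicating the health logic in each one.
void OnEnemyHit(Player& P)   { P.TakeDamage(10.0f); }
void OnFallDamage(Player& P) { P.TakeDamage(25.0f); }
```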

Unreal uses Classes, which can contain both the variables and functions we’ve covered. Classes can be used as Parents, which allows the properties within them to be inherited by other blueprints.

“Selecting a Parent Class allows you to inherit properties from the Parent to use in the Blueprint you are creating.”

unrealengine.com, 2021

Nearly anything in Unreal can be used as a parent class – for example, a parent Actor blueprint can contain properties for ammo and health, which a child blueprint can inherit.

As the child is separate from the parent blueprint, we can add specific functionality that applies only to the child – such as a unique weapon available only to that actor – whilst still inheriting the functionality shared from the parent blueprint.

A more general example of how a parent system is used can be seen here on Figure 3.

Figure 3 (Grewell, C)

The parent class is a tree, and its children are different sorts of trees that have their own specific properties unique to them. The text in bold displays the new properties of each child object, whilst also inheriting the previous properties from their parent.
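The same parent/child idea can be sketched in plain C++, with invented properties standing in for the bold text in Figure 3:

```cpp
#include <string>

// Parent class: shared properties every tree inherits.
struct Tree {
    std::string BarkType = "rough";
    int HeightMetres = 10;
};

// Children inherit BarkType and HeightMetres, then add their own
// unique properties -- the "bold text" in Figure 3.
struct AppleTree : Tree {
    int ApplesPerSeason = 200;
};

struct PineTree : Tree {
    bool bEvergreen = true;
};
```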

I personally think that learning classes, variables and functions is a great way to get your head around how a good chunk of the blueprint system works in Unreal – it also helps with programming in general.

Although I’m already familiar with using these in other engines and programs, such as Visual Studio, it’s a good idea to keep revising them and to see how they may be used differently.

Bibliography

Docs.unrealengine.com. 2021. Blueprint Variables. [online] Available at: https://docs.unrealengine.com/en-US/ProgrammingAndScripting/Blueprints/UserGuide/Variables/index.html [Accessed 21 March 2021].

Docs.unrealengine.com. 2021. Functions. [online] Available at: https://docs.unrealengine.com/en-US/ProgrammingAndScripting/Blueprints/UserGuide/Functions/index.html [Accessed 21 March 2021].

Docs.unrealengine.com. 2021. Blueprint Class. [online] Available at: https://docs.unrealengine.com/en-US/ProgrammingAndScripting/Blueprints/UserGuide/Types/ClassBlueprint/index.html [Accessed 21 March 2021].

Grewell, C., 2018. Classes. [online] Medium. Available at: https://medium.com/applab/classes-d2fb62e266c0 [Accessed 21 March 2021].


Developer Journal : 2D Graphics

Developer Journal

As of Monday the 8th of March, I’ve made a start on researching for our 2000-word essay on a technical discussion within the field of games development – we are given several topics to choose from.

What’s important is that our discussion must be substantiated with evidence throughout, so research is essential for this task.

For my topic I’ve decided to cover graphics in 2D games, so to familiarize myself I’ve delved into researching the two main image types: raster and vector.

Figure 1 (Wikipedia)

Vector graphics (See Figure 1) are known as the most flexible of the image types; they are formed from mathematical formulas, which lets them retain high quality no matter how much their shape and size change.

“This means that the paths and objects that make up a vector image are individually scalable, preserving the quality of the image when scaling it up or down.”

Dawn Kuczwara, 2021

However, this also makes vector graphics poor at creating realistic images, as they are best suited to more minimal styles.

Raster graphics are created by assembling individual pixels, and they are the most common image format on the web – for complex, detailed imagery they can also take up less data than an equivalent vector would.

“Raster images are ideal for realistic images, like photographs”

Dawn Kuczwara, 2021

They’re also capable of portraying realistic images much better than vectors can (See Figure 2).

Figure 2 (olypress.com)

The quality of raster images, however, can degrade once their size is manipulated away from the original proportions.

As Vector graphics are scalable, they’re ideal for features such as text fonts and even icons like logos. Figure 3 shows an example of what a Raster and Vector graphic looks like once it becomes scaled up.

Figure 3 (Amit Saxena)

I think both image types can be vastly useful in many different situations within games; a good example is how user interfaces can use vector graphics so that their quality stays the same regardless of how big a user’s screen is.

However, raster graphics may be better for details that don’t change size as much, such as textures for models or scenery. This can also help with saving memory within the game, so raster graphics may be worth considering when optimization needs to be improved.
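The difference can be sketched in plain C++ (all names invented for illustration): a raster image’s memory cost grows with its pixel dimensions, while scaling a vector path only changes coordinate values, not the amount of data stored.

```cpp
#include <cstddef>
#include <vector>

// A vector shape is just a fixed list of points.
struct Point { float x, y; };

// Memory for an uncompressed RGBA raster image: 4 bytes per pixel,
// so doubling width and height quadruples the data.
std::size_t RasterBytes(int Width, int Height) {
    return static_cast<std::size_t>(Width) * Height * 4;
}

// Scaling a vector path changes no data size -- only the coordinates.
void ScalePath(std::vector<Point>& Path, float Factor) {
    for (Point& P : Path) { P.x *= Factor; P.y *= Factor; }
}
```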

Bibliography

Kuczwara, D., 2021. Raster Images vs Vector Graphics – Curotec. [online] Curotec. Available at: https://www.curotec.com/insights/raster-images-vs-vector-graphics/ [Accessed 16 March 2021].

En.wikipedia.org. 2021. Vector graphics. [online] Available at: https://en.wikipedia.org/wiki/Vector_graphics [Accessed 16 March 2021].

Saxena, A., 2017. What Is Vector Graphics [An Introduction]. [online] Dignitas Digital. Available at: https://www.dignitasdigital.com/blog/what-is-vector-graphics-an-introduction/ [Accessed 16 March 2021].

Olympus Press – Commercial Printing. 2013. Vector & Raster Graphics in Offset Printing – Olympus Press – Commercial Printing. [online] Available at: https://olypress.com/vector-vs-raster-graphics-in-printing/ [Accessed 16 March 2021].


Developer Journal : Unreal Basics

Developer Journal

As of Monday the 1st of March, we began learning Unreal. We covered the basics of the engine and how to use its user interface.

When we first started, we had to set our project settings. The project settings offer many ways to customize your project, such as choosing between the blueprint system and C++ – we can also set the project’s quality and target platform. (See Figure 1)

Figure 1 (docs.unrealengine.com)

We are going to be using blueprints for our first year of Unreal. Blueprints are essentially a visual scripting system: a node-based interface used to create gameplay elements and to define object-oriented classes or objects in the engine.

“This system is extremely flexible and powerful as it provides the ability for designers to use virtually the full range of concepts and tools generally only available to programmers.”

docs.unrealengine.com, 2021

This tool is very helpful to beginners, as it is a simplified form of programming. It is especially useful for small prototype projects, since the blueprint system makes it easy to put these together in a short amount of time.

Another thing we learnt about was using Actors within a scene – an Actor is any object you can place within a level, ranging from a simple shape to characters and even lighting. (See Figure 2)

Figure 2 (docs.unrealengine.com)

Actors can also be modified in a multitude of ways: they are capable of 3D transformations as well as housing their own properties, which makes them highly customizable.

“Actors are a generic Class that support 3D transformations such as translation, rotation, and scale.”

docs.unrealengine.com, 2021

Actors are what we will mostly be placing within our levels; they can be thought of as containers that hold special types of objects known as “components”.

Figure 3 (docs.unrealengine.com)

Many different types of these components can control how Actors work within the engine – this can range from how they’re rendered to how they move.

Figure 3 shows an example of an Actor’s component hierarchy. Several components are attached to this object, enabling a mesh to be displayed as well as effects such as audio to be emitted from it.
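The container idea can be sketched in plain C++ – these class names are illustrative, not Unreal’s actual API: an actor owns a list of components, each controlling one aspect of its behaviour.

```cpp
#include <memory>
#include <string>
#include <vector>

// Base interface every component implements.
struct Component {
    virtual ~Component() = default;
    virtual std::string Describe() const = 0;
};

// Each component controls one aspect of the actor's behaviour.
struct MeshComponent : Component {
    std::string Describe() const override { return "renders a mesh"; }
};

struct AudioComponent : Component {
    std::string Describe() const override { return "emits audio"; }
};

// The actor is a container that owns its components.
struct Actor {
    std::vector<std::unique_ptr<Component>> Components;

    template <typename T>
    void AddComponent() { Components.push_back(std::make_unique<T>()); }
};
```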

I think learning Unreal has been quite interesting so far, and I’m excited to keep using the engine. I’ve used several different game engines before, such as Construct 3 and Phaser with Visual Studio – however, I don’t have as much experience with 3D engines, so this will mostly be new and exciting to me.

Bibliography

Docs.unrealengine.com. 2021. Get Started with UE4. [online] Available at: https://docs.unrealengine.com/en-US/Basics/GettingStarted/index.html [Accessed 11 March 2021].

Docs.unrealengine.com. 2021. Blueprints Visual Scripting. [online] Available at: https://docs.unrealengine.com/en-US/ProgrammingAndScripting/Blueprints/index.html [Accessed 11 March 2021].

Docs.unrealengine.com. 2021. Placing Actors. [online] Available at: https://docs.unrealengine.com/en-US/Basics/Actors/Placement/index.html [Accessed 11 March 2021].