Developer Journal : Unreal Keys and Doors

Developer Journal

As of the 3rd of April, I’ve learned how to use Structs within Unreal to create a proper colour-coded key system with doors.

I’ll be using structs as a more accessible way to edit multiple types of data for an actor in my project.

“A struct is a collection of different types of data that are related and held together for easy access.” (2021)

This is important as the collectable keys have more than one variable – a single key contains an image, text and a colour.

How it works in the game is that certain colour-coded keys open doors with matching colours. I’ve also made models for each of these actors, as seen in Figure 1.

Figure 1

Each key also has a unique material and light assigned to it from the struct.

Not only that, but I’ve also got a widget list in the top right of the screen that keeps track of all the keys the player has collected throughout the level, as seen in Figure 2.

Figure 2

So how was I able to do this? I created a structure blueprint for the data I wanted to store within the keys and doors; it contains variables for a 2D texture, a text value and a linear colour (Figure 3).

Figure 3

Both the keys and the doors store a variable of this data structure.
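To show the idea outside of Blueprints, here’s a rough plain-C++ sketch of the struct from Figure 3. The names are my own illustrative stand-ins, not the actual Blueprint identifiers – a real Unreal struct would use `UTexture2D*`, `FText` and `FLinearColor` rather than these simplified types:

```cpp
#include <string>

// Illustrative stand-in for the linear colour variable.
struct LinearColour {
    float r, g, b, a;
};

// Mirrors the structure blueprint: a 2D texture, a text value and a colour.
// The texture is represented here by an asset-path string for simplicity.
struct KeyData {
    std::string texturePath;  // 2D texture (asset path stand-in)
    std::string text;         // display text, e.g. "Red Key"
    LinearColour colour;      // light / material tint
};

// Both the key actor and the door actor hold one KeyData variable.
struct KeyActor  { KeyData data; };
struct DoorActor { KeyData data; };
```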

This is used to detect whether the key the player uses on a door matches the exact Key Data variable the door contains. Figure 4 shows the Key Data struct variable in one of the actors, containing the values entered earlier.

Figure 4

The keys will have to be collected by the player, so the Player holds an array of the Key Data type (Figure 5).

Figure 5

Back in the key actor, a component overlap casts to the Player and calls a custom event that stores the key in the array. Here we take the Key Data from the key being collected and insert it into the player’s inventory (Figure 6).

Figure 6

Using a custom event, we can now add the New Key to the player’s inventory array. Also included is a cast to our game mode blueprint – this will come in later once we get to the widgets (Figure 7).

Figure 7
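The collect-and-store step above boils down to appending to an array. A minimal C++ sketch of the idea, with hypothetical names in place of the Blueprint ones:

```cpp
#include <string>
#include <vector>

// Simplified key data; the full struct also carries a texture and colour.
struct KeyData {
    std::string text;
};

struct Player {
    // The array of Key Data from Figure 5.
    std::vector<KeyData> inventory;

    // Sketch of the custom event from Figures 6-7: the overlapping key
    // hands its data over, and the player appends it to the inventory.
    void collectKey(const KeyData& newKey) {
        inventory.push_back(newKey);
        // In the real project this is also where the game mode is told
        // about the new key, so the widget list can add an entry.
    }
};
```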

Now for our door actor: a begin-overlap event casts to the player and checks whether their inventory contains a matching key – a structure whose variables completely match the collected key’s.

If the player does contain a matching key, then the door will be opened (Figure 8).

Figure 8
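The door’s check can be sketched in C++ as a simple search over the inventory. This is an illustration of the idea only – comparing just the text field is a simplification of the full struct comparison the Blueprint performs, and the names are hypothetical:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Simplified key data; equality here compares only the text field.
struct KeyData {
    std::string text;
    bool operator==(const KeyData& other) const { return text == other.text; }
};

// Sketch of the door's begin-overlap check (Figure 8): scan the player's
// inventory for a key whose data matches the door's own Key Data.
bool playerHasMatchingKey(const std::vector<KeyData>& inventory,
                          const KeyData& doorKey) {
    return std::find(inventory.begin(), inventory.end(), doorKey)
           != inventory.end();
}
```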

To help differentiate the different keys and doors from one another, an Event BeginPlay checks a key or door’s Key Data variable and changes its material based on whether the text matches a Literal Text.

For example, if the struct contains the text “Red Key”, it exactly equals the Literal Text named “Red Key” as well. If this is true, the material of the key or door is set to red.

If it’s false, the check moves on to another section where it’s compared against a different text; if the text matches none of them, the door is given a neutral colour (Figure 9).

Figure 9
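The branch chain in Figure 9 translates into C++ as a sequence of string comparisons with a neutral fallback. The material names below are illustrative placeholders, not assets from the project:

```cpp
#include <string>

// Sketch of the Event BeginPlay branch chain: compare the struct's text
// against each literal name and pick a material, falling back to neutral.
std::string materialForKeyText(const std::string& text) {
    if (text == "Red Key")   return "M_Red";
    if (text == "Blue Key")  return "M_Blue";
    if (text == "Green Key") return "M_Green";
    return "M_Neutral";  // no match: neutral colour
}
```

A map from text to material would collapse these branches into a single lookup – the kind of simpler, more efficient function mentioned in the closing reflection.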

Keys also set the colour of their light to the colour variable in the struct. Figure 10 shows what the doors and keys look like in-game.

Figure 10

As for the widgets, I want the keys to display on the player’s interface so that they know which keys they’re currently holding – this helps the player keep track of everything they’ve collected.

I created an object blueprint that stores variables from Key Data, to be displayed on the widget later (Figure 11).

Figure 11

I made a ListView widget where these collectables are stored, using a custom function called Add Key.

There you can see an input for New Data; this adds the received data to the list (Figure 12).

Figure 12

Before we start adding to our list, though, we need a separate widget that defines how each individual line displays on the list.

In Figure 13, I’ve bound the text, colour and image of an object to this particular line entry.

Figure 13

To make this compatible with the ListView widget, I implemented a UserObjectListEntry interface.

This retrieves the object supplied by the Add Item method earlier, which is then cast to KeyInfoData and stored as a variable (Figure 14).

Figure 14

Now all we do is bind our list’s entry widget class to the widget we’ve just created, and it displays the entries on the list like so (Figure 15).

Figure 15

In our game mode blueprint, we create the list when the game starts; we also promote the widget’s return value to a variable to be referenced later (Figure 16).

Figure 16

Using a custom function within our game mode blueprint, we can add any entry widget to the list by taking the Key Data and using it as the function’s input.

Also included is a Key Array within the game mode, which the incoming keys are added to (Figure 17).

Figure 17

As set up earlier, the cast to the game mode from picked-up keys enables them to be displayed in the list like this.

As their functions are bound to the data struct’s variables, they display the images and colours correctly for each picked-up key (See Figure 18).

Figure 18

This was one of the more challenging tasks of working on my game, but I am glad I managed to learn and figure out how to get Data Structures working within Unreal.

This was important to the game, as displaying the data of the player’s collectables on the user interface makes it more readable; although not completely necessary, I still feel it can go a long way towards making the game more accessible.

However, I think some parts could be improved – for instance, changing the object’s material could use a simpler and more efficient function rather than a long chain of branches.

Bibliography

(2021) Struct Variables in Blueprints. [online] Available at: [Accessed 9 May 2021].


Developer Journal : Widgets and Transitions

Developer Journal

As of the 26th of April, I’ve made more progress on my game by adding elements to the User Interface known as Widgets as well as some Level Transitions.

For my game, I’ve considered adding multiple levels as well as a simple UI to help transition players to the next level by giving them a button to press once the current level has been beaten.

The video in Figure 1 below shows the results of my implementation of these features working within the game.

Figure 1

The widget used at the end of each level simply congratulates the player once they’ve beaten the current level, and they are given an option to continue onto the next.

In the designer section of the widget, I kept the UI relatively minimal looking with a short message and a working button.

Figure 2

Not only does Unreal allow us to design just a UI with the widget system, but we can also assign something to happen if the UI is interacted with. Using the button, we can make it so that once clicked the Player’s input can trigger a certain function within the game.

“The Widget Blueprint uses Blueprint Visual Scripting to design the layout as well as script functionality for UI elements such as what happens when a button is clicked or if a value changes.” (2021)

When the button is clicked on our widget, we want to activate a function that easily transitions the player to the next level or restarts the level if they’ve failed.

Figure 3 shows the blueprint for when the button on the UI is clicked; it uses a branch to determine a different result depending on whether the player is dead or not.

The game simply uses Names to give it the level name it wants to open. If the player has failed, it gets the current level’s name and reopens it; if they’ve succeeded, it fetches a Name variable from the exit which contains the next level’s name.

Figure 3
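The branch in Figure 3 reduces to a single decision. A hedged C++ sketch of the idea – the function and variable names are mine, not the Blueprint’s, and in Unreal the actual transition would go through `OpenLevel`:

```cpp
#include <string>

// Sketch of the button-click branch: failed runs reopen the current level,
// successful runs open the name the exit supplied.
std::string levelToOpen(bool playerIsDead,
                        const std::string& currentLevel,
                        const std::string& nextLevelFromExit) {
    return playerIsDead ? currentLevel : nextLevelFromExit;
}
```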

The exits within the game have an instance-editable value which allows me to set the name of the level they transition the player to once touched. Figure 4 shows the exit’s function giving its name value to the widget to use later on.

Figure 4

I feel I’ve got a good idea of how inserting widgets into Unreal works, as this task was successful. However, the current UI I’ve made for the project looks very rough and may need better visuals – this won’t be a top priority, though, as I need to get a playable build of the game out first.

Bibliography

(2021) Creating and Displaying UI. [online] Available at: [Accessed 4 May 2021].


Developer Journal : Box Traces

Developer Journal

As of the 19th of April, I’ve used this week to further refine my top-down Unreal prototype.

Recently, I’ve decided to create sliding blocks for my game. These blocks can be pushed in any direction if the player collides with them at a certain face; the block then moves accurately along a grid using BoxTraceByChannel.

BoxTraceByChannel is a collision function that uses traces for its detection; the trace can detect any objects blocking the way and provide a result.

“Sweeps a box along the given line and returns the first blocking hit encountered. This trace finds the objects that RESPONDS to the given TraceChannel” (2021)

For the sliding block, traces are used to check each square in front of the block before moving it. If there is nothing blocking the trace, then the block is permitted to move – this is shown by the red colour seen in Figure 1.

Figure 1

If the traces detect any solid objects, such as walls or items, they display as green in front of the cube and the block is not allowed to move, as seen in Figure 2.

Figure 2

Figure 3 shows an example of how it should work in-game: the sliding block slides along the floor and stops if there is a wall in front of it. There is a trail of red traces, alongside some green ones against a couple of walls – this is the collision at work.

Figure 3

To make this work, I gave the sliding block two collision boxes, shown in Figure 4. As the player has a small pointed bit on their front, they can activate one of these when facing the cube.

Figure 4

If the player overlaps one of these collision boxes, it casts to the player to see if they are pushing the block, then gets their X or Y location to see which side the player is colliding from – this is what makes the box go in different directions (Figure 5).

Figure 5

Once collided, a box trace spawns in front of the direction the block will slide, to check whether there are any collisions in the way. This is done by spawning a trace 50 units ahead of the object and snapping it to a set location with a custom function to make it more accurate (Figure 6).

Figure 6

If the traces don’t collide with anything, a Move Component function is used to simply move the sliding block ahead (Figure 7).

Figure 7
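The trace-then-move logic can be sketched in plain C++. This is only an illustration of the control flow: a set of blocked grid cells stands in for BoxTraceByChannel’s blocking hit, and all the names are hypothetical:

```cpp
#include <set>
#include <utility>

// A grid cell, standing in for a snapped world location.
using Cell = std::pair<int, int>;

// Sketch of the slide check from Figures 6-7: look one cell ahead in the
// push direction (the 50-unit trace); if the "trace" hits nothing, the
// Move Component step slides the block into that cell.
bool tryMoveBlock(Cell& block, int dx, int dy,
                  const std::set<Cell>& blockedCells) {
    Cell target{block.first + dx, block.second + dy};
    if (blockedCells.count(target)) {
        return false;  // trace hit something: block stays put
    }
    block = target;    // trace clear: move the block one cell ahead
    return true;
}
```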

This was a fairly hard thing to do, but with some help and research I was able to get it working.

From this I managed to learn how to use traces for collisions, which can prove to be useful if you’re looking to make objects that move accurately without having to depend on physics.

Figure 8

I also briefly learnt how to use custom functions, such as the object-snapping one shown in Figure 8, which calculates a grid size and offset to snap an object to.
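A grid-snapping helper of that kind usually rounds a coordinate to the nearest grid line. This is a guess at the maths rather than the exact Blueprint, with illustrative names; the grid size would match the 50-unit trace distance mentioned above:

```cpp
#include <cmath>

// Round a world coordinate to the nearest point on a grid of the given
// size, shifted by an offset (for grids not anchored at the origin).
float snapToGrid(float value, float gridSize, float offset) {
    return std::round((value - offset) / gridSize) * gridSize + offset;
}
```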

Bibliography

(2021) BoxTraceByChannel. [online] Available at: [Accessed 30 April 2021].


Developer Journal : Unreal Game

Developer Journal

As of the 12th of April, I used this week to make a start on my top-down Unreal prototype.

So far, I’ve worked on actors and characters using the blueprint system, as well as trying out making my own textures and materials for the game. These are all in one level so far, which will serve as the testing grounds for any future features I might be able to add (See Figure 1).

Figure 1, Self-Collection

First off, I will describe how I managed to create a playable character. For this I needed to use a PlayerInput object to be able to set some keyboard inputs the player could use to control their character.

“The PlayerInput Object is responsible for converting input from the player into data that Actors (like PlayerControllers or Pawns) can understand and make use of.” (2021)

In order to create a playable character, it was important for me to set some Input Mappings for the player. This would allow me to set certain mappings within the game with keys connected to them (See Figure 2).

Figure 2, Self-Collection

There are multiple mappings you can assign keys to, these being Action Mappings and Axis Mappings. As my game doesn’t have any necessary actions yet, I’ve just used Axis Mappings – note that I have also set some axis scales to negative values.

Action Mappings bind a discrete button or key press to an event-driven behaviour; the idea is that a mapping is usually assigned to a single action that requires a single press of a button.

“The end effect is that pressing (and/or releasing) a key, mouse button, or keypad button directly triggers some game behavior.” (2021)

An example of how an Action Mapping would be used is for actions as simple as jumping, crouching or attacking – these are typically events in games that only require a single press.

Axis Mappings are bound to more continuous game behaviours; the advantage is that they allow for smooth transitions in movement, as opposed to more discrete game events.

“The inputs mapped in AxisMappings are continuously polled, even if they are just reporting that their input value is currently zero.” (2021)

Some good examples for Axis Mappings would be general movement keys, like “WASD” controls. As these mappings are continuous, it is preferable to hold the keys down for movement instead of pressing them repeatedly.

Using blueprints, I connected the mappings I’d set to add movement to the player. The world direction values are then scaled by the axis scales I’ve connected to them (See Figure 3).

Figure 3, Self-Collection

For instance, a regular W input will move the character forwards along X by +1.0, but an S input will move them by −1.0 as it has a negative scale.
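The axis-scaled movement from Figure 3 can be sketched in C++. This is an illustration of the polling idea, not actual Unreal API – in the engine this would be `AddMovementInput` driven by the axis value each frame:

```cpp
// Minimal 2D position for the sketch.
struct Vec2 { float x, y; };

// The engine polls the axis every frame and passes the mapped scale
// (+1.0 for W, -1.0 for S) into the movement step.
Vec2 moveForward(Vec2 position, float axisValue, float speed) {
    position.x += axisValue * speed;  // W gives +1.0, S gives -1.0
    return position;
}
```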

I’ll now briefly mention how I managed to import some custom textures I’ve made myself – importing textures is as easy as dragging and dropping a file into a folder.

Figure 4, Self-Collection

As I’ll be using pixelated textures, I need to set their texture group to “2D Pixels” within the Level of Detail tab in order for them to display properly.

Materials work similarly to how actors and characters are made, as they also require blueprinting (See Figure 5).

Figure 5, Self-Collection

As I don’t want the material of this texture to be too shiny, I’ve set the Roughness input to 1 – this controls how rough or smooth a material’s surface is.

“A Roughness of 0 (smooth) results in a mirror reflection and roughness of 1 (rough) results in a diffuse (or matte) surface.” (2021)

As my texture is pixelated, I wouldn’t want it to look too reflective or shiny. However, if I were using a different texture meant to be reflective and smooth, I could lower the roughness value – this may depend on what I can add into the game.

Bibliography

(2021) Input. [online] Available at: [Accessed 19 April 2021].

(2021) Material Inputs. [online] Available at: [Accessed 19 April 2021].


Developer Journal : Variables and Functions

Developer Journal

As of Monday the 15th, I’ve been learning more about using the Unreal Engine.

During this week we covered how to use variables and functions in the blueprint system; both are crucial, as they are mainly used to store important information from the game and to make certain actions possible.

Variables give certain objects or actors within the game world properties using values such as floats, integers, booleans and even strings.

“Variables are properties that hold a value or reference an Object or Actor in the world.” (2021)

Figure 1 shows some examples of what variables can be.

Float and integer variables may keep track of things such as time, as they use numbers as values, whilst booleans can be assigned to simpler on/off functions such as toggling a player’s light – strings may also be used for something such as storing the player’s name.

Figure 1

Essentially, variables can be used as statistics to help record what is happening within your game. As we are currently using the blueprint system, these variables will mostly be present within the node graphs we use to structure most of the game.

This brings us to Functions, which can be created and stored on a single blueprint and then called from a different graph.

“Functions are node graphs belonging to a particular Blueprint that can be executed, or called, from another graph within the Blueprint.” (2021)

An example of how a function may be used is shown in Figure 2, this presents us with a “Take Damage” function inside a character’s blueprint. Once this function is called, it will take away some of the player’s health.

Figure 2

The reason this may have been done is that there may be multiple ways to take damage within the game; instead of having to recreate the structure over again for each one, it is assigned to a single function which can simply be called and still serve the same purpose.
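The idea from Figure 2 can be sketched in C++ as one reusable function on the character. The names, starting health and clamping behaviour are my own illustrative choices, not taken from the lecture example:

```cpp
// Sketch of the "Take Damage" idea: one function, callable from anywhere,
// so each damage source doesn't duplicate the health logic.
struct Character {
    float health = 100.0f;

    void takeDamage(float amount) {
        health -= amount;
        if (health < 0.0f) {
            health = 0.0f;  // clamp so health never goes negative
        }
    }
};
```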

Unreal uses Classes, which can contain both the variables and functions we’ve covered. These can be used as Parents, allowing the properties within them to be inherited by other blueprints.

“Selecting a Parent Class allows you to inherit properties from the Parent to use in the Blueprint you are creating.” (2021)

Anything in Unreal can be used as a parent class; for example, a parent Actor blueprint can contain properties for ammo and health, which a child blueprint can inherit.

As it is separate from the parent blueprint, we can add specific functionality that applies only to the child blueprint – such as a unique weapon only available to that actor – whilst still inheriting the functionality shared from the parent blueprint.

A more general example of how a parent system is used can be seen in Figure 3.

Figure 3 (Grewell, C)

The parent class is a tree, and its children are different sorts of trees that have their own specific properties unique to them. The text in bold displays the new properties of each child object, whilst also inheriting the previous properties from their parent.
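The tree example maps directly onto C++ inheritance. The member names here are illustrative stand-ins for the properties in Figure 3:

```cpp
#include <string>

// Parent class: properties shared by every tree.
struct Tree {
    int height = 10;
    bool hasLeaves = true;
};

// Child class: inherits the parent's properties and adds its own.
struct AppleTree : Tree {
    std::string fruit = "apple";  // new property only the child has
};
```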

I personally think that learning classes, variables and functions is great for getting your head around how a good chunk of the blueprint system works in Unreal – it also helps with programming generally.

Although I’m already familiar with using these in different engines and programs, such as Visual Studio, it is a good idea to keep revising them and seeing how they may be used differently.

Bibliography

(2021) Blueprint Variables. [online] Available at: [Accessed 21 March 2021].

(2021) Functions. [online] Available at: [Accessed 21 March 2021].

(2021) Blueprint Class. [online] Available at: [Accessed 21 March 2021].

Grewell, C., 2018. Classes. [online] Medium. Available at: [Accessed 21 March 2021].


Developer Journal : 2D Graphics

Developer Journal

As of Monday the 8th of March, I’ve made a start on researching for our 2000-word essay on a technical discussion within the field of games development – we are given several topics to choose from.

What’s important is that our discussion must be substantiated with evidence throughout it, so research is considered important for this task.

I’ve decided for my topic that I was going to do graphics in 2D games, so to help familiarize myself I’ve delved into researching different image file types such as Raster and Vector images.

Figure 1 (Wikipedia)

Vector graphics (See Figure 1) are known as the most flexible of the image types; they are formed from mathematical formulas, so they easily retain high quality no matter how much their shape and size changes.

“This means that the paths and objects that make up a vector image are individually scalable, preserving the quality of the image when scaling it up or down.”

Dawn Kuczwara, 2021

However, vector graphics are poor at creating realistic images because of this, as they are best suited to more minimal styles.

Raster graphics are created by assembling pixels together; they’re also the most common image format on the web – and for complex images they can take up much less data than vectors.

“Raster images are ideal for realistic images, like photographs”

Dawn Kuczwara, 2021

They’re also capable of portraying realistic images much better than vectors can (See Figure 2).

Figure 2

The quality of raster images, however, can degrade once their size is manipulated away from the original proportions.

As Vector graphics are scalable, they’re ideal for features such as text fonts and even icons like logos. Figure 3 shows an example of what a Raster and Vector graphic looks like once it becomes scaled up.

Figure 3 (Amit Saxena)

I think that both image types can be vastly useful in many different situations within games; a good example would be how user interfaces can use vector graphics so that their quality stays the same regardless of how big a user’s screen is.

However, raster graphics may be better for details that don’t change size as much, such as textures for models or scenery; they can also help save memory within the game, so raster graphics may be good to consider if optimization needs improving.


Kuczwara, D., 2021. Raster Images vs Vector Graphics – Curotec. [online] Curotec. Available at: [Accessed 16 March 2021].

(2021) Vector graphics. [online] Available at: [Accessed 16 March 2021].

Saxena, A., 2017. What Is Vector Graphics [An Introduction]. [online] Dignitas Digital. Available at: [Accessed 16 March 2021].

Olympus Press – Commercial Printing. 2013. Vector & Raster Graphics in Offset Printing – Olympus Press – Commercial Printing. [online] Available at: [Accessed 16 March 2021].


Developer Journal : Unreal Basics

Developer Journal

As of Monday the 1st of March, we started learning Unreal this week, covering the basics of the engine and how to use its user interface.

When we first started, we had to set our project settings. Within the project settings there are many ways to customize your project, such as whether you want to use the blueprint system or C++ – we also have the option to set our quality and platform (See Figure 1).

Figure 1

We are going to be using blueprints for our first year of using Unreal; this is essentially a visual scripting system. It’s a node-based interface used to create gameplay elements and to define object-oriented classes or objects in the engine.

“This system is extremely flexible and powerful as it provides the ability for designers to use virtually the full range of concepts and tools generally only available to programmers.” (2021)

This tool is very helpful to beginners, as it is a simplified form of programming. It can be especially useful for making small prototype projects, as the blueprint system makes them easy to create within a short amount of time.

Another thing we learnt about was using Actors within a scene – an Actor is any object you can place within a level. This can range from a simple shape to characters and even some lighting (See Figure 2).

Figure 2

Actors can also be modified in a multitude of ways; they are capable of many 3D transformations as well as housing some properties, which means they can be highly customizable.

“Actors are a generic Class that support 3D transformations such as translation, rotation, and scale.” (2021)

These are basically what we will mostly be placing within our level. Actors can be thought of as containers that hold special types of objects known as “components”.

Figure 3

Many different types of these components can control how Actors work within the engine – this can range from how they’re rendered to how they move.

Figure 3 shows an example of what an Actor’s hierarchy of components looks like. As you can see, there are several components attached to this object, which enable a mesh to show as well as some effects, such as audio, to emit from it.

I think that so far, learning Unreal has been quite interesting, and I’m excited about getting to use the engine. I’ve already used several different game engines before, such as Construct 3 or Phaser with Visual Studio – however, I don’t have as much experience with 3D engines, so this will mostly be new and exciting to me.

Bibliography

(2021) Get Started with UE4. [online] Available at: [Accessed 11 March 2021].

(2021) Blueprints Visual Scripting. [online] Available at: [Accessed 11 March 2021].

(2021) Placing Actors. [online] Available at: [Accessed 11 March 2021].


Developer Journal : Digital Wellbeing

Digital Wellbeing

As of Monday the 22nd, I’ve decided to cover a topic on digital wellbeing for my developer journal.

Digital wellbeing is the idea of a state of personal wellbeing that comes from the use of technology. It can simply mean the improvement of a person’s wellbeing through the use of media, but it also includes how the person maintains their relationship with technology.

“Digital wellbeing considers the impact of technologies and digital services on people’s mental, physical and emotional health”

Alicja Shah, 2019

People can use technology to stay in touch with distant family and friends, as well as for personal work – people are more connected than ever in the world today. However, prolonged use of technology has negative side effects too, as it can lead to psychological and physical issues.

Physically, staring at a screen for too long can induce digital eyestrain. This is because of exposure to the blue light of most digital devices; overuse may lead to eyestrain and focusing problems.

“Digital eyestrain refers to blurred vision and other symptoms such as burning, stinging or tearing of the eyes associated with prolonged use of digital devices.”

Alicia Rohan, 2016

When using any digital devices, it is important to consider the 20/20/20 rule. (See Figure 1)

Figure 1

This rule says that to maintain comfortable vision when using digital devices, for every 20 minutes of use you should look away for 20 seconds, focusing on something 20 feet away – this can help reduce eyestrain.

Another issue that could arise on the psychological side is the overuse of and dependence on technology; prolonged use can lead to issues such as isolation and depression.

“Young adults aged 19–32 years found that people with higher social media use were more than three times as likely to feel socially isolated than those who did not use social media as often.”

Timothy J. Legg, 2020

It is also important to consider that we are currently living through the coronavirus pandemic, which means a lot of people are already isolated indoors to stay safe and have to rely on technology to communicate with most of their friends.

Finding ways to reduce social media use can however help reduce feelings of isolation in some people.

Personally, I think I have managed to keep a healthy relationship with the internet, as I already have a lot of experience compared to other people. However, it is sometimes hard to keep up with my health, as I sometimes spend too much time on my computer – this can affect my sleep as well as my workflow.


Marsden, P., 2020. What is Digital Wellbeing? A List of Definitions. [online] Available at: [Accessed 28 February 2021].

Shah, A., 2019. Defining digital wellbeing – Jisc Building Digital Capability Blog. [online] Jisc Building Digital Capability Blog. Available at: [Accessed 28 February 2021].

Legg, T. J., 2020. Negative effects of technology: Psychological, social, and health. [online] Available at: [Accessed 28 February 2021].

Rohan, A., 2016. Debunking digital eyestrain and blue light myths. [online] Available at: [Accessed 28 February 2021].


Developer Diary : Tutorial

Development Activity

As of Monday the 15th, we had an independent study week where we were given the chance to work on anything we had to catch up with during our spare time. For this week, I chose to start and finish my tutorial task.

For this task, we needed to make a Phaser tutorial on anything of our choice. I decided to do a basic tutorial on Matter, from which beginner-level users would be able to learn how to set up their own scenes; it covers most of how to set it up and how the user can create small, simple contraptions using the physics engine.

Writing a coding tutorial can help develop a basic understanding of your subject matter; it helps you build something small and simple while understanding it at a deeper level – an understanding that can aid you in future projects.

“It requires that you understand what you’re doing/building enough to be able to explain it to somebody else”

ryanjyost, 2019

A great way to start would be to pick a topic you want to learn and write about, which could be just about anything you’re not so experienced with.

I feel that writing a tutorial on the basics of Matter from the ground up has helped me understand the simplest way to start on a foundation for a Phaser program, as I’ve managed to explain in detail most of what goes into making the program work at a core level.

Figure 1

After explaining how to put some simple objects with attributes together, throughout the tutorial I’ve also provided links to the Phaser 3 documentation that the reader can check, to entice them to start exploring and experimenting with what they can do in the program.

I’ve also provided some in-game screenshots of what I was able to achieve with this simple level of understanding; within the tutorial I managed to put together a small physics contraption that made use of constraints (See Figure 2).

Figure 2

I think I’ve managed to provide a decent tutorial on setting up a simple scene for anyone without much prior knowledge of Phaser. I’ve also kept what you could possibly do in the guide as open-ended as possible, encouraging experimentation by leaving many links to the Phaser documentation the user could use to expand upon.

Matter, I feel, is a good way to introduce beginner-level programmers to Phaser. It touches upon how these programs are usually set up, which can further a beginner’s understanding of the basics, and there are plenty of attributes you can assign to even basic objects that make it fun to experiment and learn with.

There are also many examples you can try on the Phaser website.

Although the coding in the tutorial is kept as brief and as simple as possible, I feel as if the next time I get the chance to write a tutorial I could do something slightly more complex to help further encourage myself to learn more new things about Phaser.

Bibliography

(2019) How to write a coding tutorial. [online] Available at: [Accessed 23 February 2021].

P., 2021. Phaser – Examples – physics – Matterjs. [online] Available at: [Accessed 23 February 2021].


Developer Journal : Presentation


As of the 8th of February, this week was about preparing and starting our presentations on Entry Level Games Developer jobs.

We were put into our seminar groups for this project, so we had to work as a team to put together a single presentation – each of us had to find and pick our own entry-level job to present. (See Figure 1)

Figure 1

We planned on doing three slides each: the first would be a brief description of the job and company, the second would cover the key skills required to get the job, and the last would give our own thoughts and conclusions about the job. Since the presentation had to be roughly 20 minutes long, we gave ourselves 5 minutes each to speak through our slides.

One important thing we had to consider for our presentation was keeping things simple, so we made sure that most of the text on our slides was large and kept brief.

"You should be able to communicate that key message very briefly. The important thing is to keep your core message focused and brief." (SkillsYouNeed, 2021)

Before, our slides were packed with text in a much smaller font, which proved too hard for people to read during a presentation. Because of this, we increased the font size considerably and cut most of the unimportant text.

We moved most of this information into our speaker notes, which we’d look upon whilst presenting. (See Figure 2)

Figure 2

Another important thing to cover here, which helped with our presentations, was the Dual Coding learning theory: combining images and written information to give people an easy visual and verbal way to process information.

“Dual coding is combining words and visuals such as pictures, diagrams, graphic organizers, and so on. The idea is to provide two different representations of the information, both visual and verbal, to help students understand the information better.”

Megan Sumeracki, 2019

Following these rules, we cut the amount of content we intended to include on each slide and instead presented everything in a simple, concise way.

We can also use images to help our audience get a rapid gist of what we're talking about.

Information is lined up neatly and written briefly to give our audience confidence; most of the detail is saved for when we speak about our slides.

“Visuals are powerful for communicating complex ideas in an efficient way; it takes a great many words to describe the simplest of images”

"Cut the amount of content we intend to include on a slide or resource; chunk the information into headings that stand out" (FutureLearn, 2021)

Using a combination of both of these ideas, we've managed to create a slide that gets to the point quickly and comes across as clearly as possible. Here you can see a brief description of the job and company, as well as some images of the company's logo and its best-known games in the industry. (See Figure 3)

Figure 3

I think I learned quite a bit about how important it is to condense information when making something such as a presentation; with ideas such as the Dual Coding learning theory, there are better and simpler ways to get messages across.

This was fairly important for this lesson, as we were given a 20-minute time limit for our presentations, so we had to focus on getting our message across as clearly as possible rather than outputting so much information that it overwhelmed the audience.

If I were to do another presentation, I would focus more on striking a good balance between text and images; although we shortened most of our text, we didn't provide as many images to show our audience.

Bibliography

SkillsYouNeed. 2021. Top Tips for Effective Presentations | SkillsYouNeed. [online] Available at: [Accessed 11 February 2021].

FutureLearn. 2021. An introduction to Dual Coding Theory. [online] Available at: [Accessed 11 February 2021].

Sumeracki, M., 2019. Dual Coding and Learning Styles — The Learning Scientists. [online] The Learning Scientists. Available at: [Accessed 11 February 2021].