Riot Games Hackathon

What is it?

Riot Games, the studio behind the popular online MOBA (Multiplayer Online Battle Arena) title ‘League of Legends’, opened applications for the first time for their on-campus hackathon. From over 2,500 applications, 53 of us were selected to be flown out to their head office in Santa Monica, Los Angeles. Needless to say, I was a little bit excited upon seeing this email hit my inbox:

Having organised multiple hackathons in the past, as well as being a serial hackathon attendee, I was thrilled not only to participate but also to get a glimpse into how one of the key game studios that sparked my interest in game development approached such an event.

Meeting the Attendees

With less than a month to go, a chatroom was created to give attendees a chance to discuss ideas and get to know each other. This is something we actually did on a smaller scale for our own hackathon a couple of months earlier, though what I really liked about Riot’s approach was having veteran members of the community actively engage new members, inviting them to share their ideas on a public spreadsheet and to write a short bio about themselves.

Not only did this act as an icebreaker when attending the event (imagine a lot of “Oh, you’re GopherDude66!”), but it also instantly immersed everyone in the community.

Whether it was the excitement of visiting Riot Games or the fabulously attentive community around the event, the attrition rate was a staggeringly impressive 0.4%, compared to the average of 50–60% for most hackathons. This is one of the stand-out aspects of the event that I really need to commend Riot Games / Gene Chorba for.

The Great Journey Beyond

Thankfully my ESTA was still valid from the Microsoft conference I attended last year (that’s $15 saved that I can put towards tacos), meaning all I needed to do was book my coach to London and I was set! Riot was very generous in offering full coverage for flights and accommodation, and within ~20 hours of setting off I had arrived in LA, been picked up by a private taxi and taken directly to the hotel.

Within seconds of walking into my room and chatting to my roommate, we were already discussing machine learning and virtual reality; I knew I was in the right place.

Shortly after unpacking and exploring the local area with my roommate, those of us that had arrived swarmed the lobby area – you’d have noticed it was a group of programmers a mile off – and we made our way to the pier for some much-needed beach pictures, delicious food, and a bad attempt at the mannequin challenge.


NDAs, Secrets and Where Not to Find Them

At 8:30 am the following morning we were whisked away by coach to the fabled Riot Games headquarters. The building itself was immense.

Scriptable Object based Dialogue System

Creating a Scriptable Object Based Dialogue System


I recently started work on a new RPG project, which I knew would need a dialogue system. After experimenting with implementations in the form of JSON and an internal database, I settled on creating my own handler for the sake of quick implementation and ‘scriptability’™.

One requirement of this system was to have variable responses, blocked either by prerequisites or due to certain events being triggered.

ScriptableObjects are a wonderful way of storing bits of data that you may not necessarily want to be easily editable in your game. The downside of this method compared to a large database is that every conversation is its own instance: instead of one file that can easily be passed off to a localiser, for example, each object must be selected and changed manually in the editor.

Setting Up our Dialogue Class

For my project, I wanted to know both the NPC’s ID and name; this could easily be adapted to hold an ID for looking up a sprite, in the case of wanting a profile picture during conversations.

I also wanted two additional fields for my responses: pre-requisites and triggers. Pre-reqs allow certain responses to display only if their conditions are met, while triggers send out an event to the object that triggered the dialogue (either the player or an NPC).

The ‘next’ field notes which response to show next, I’d recommend using -1 as your default ‘exit’ value.

using UnityEngine;

public class Dialogue : ScriptableObject {
    public int npcID;
    public string npcName;
    public Message[] messages;
}

[System.Serializable]
public class Message {
    public string text;
    public Response[] responses;
}

[System.Serializable]
public class Response {
    public int next;      // index of the next message; -1 to exit
    public string reply;
    public string prereq;
    public string trigger;
}


Creating Instances

Now for a slightly awkward tidbit: if you want to create an instance of your lovely new class, you can’t simply right click the script and be done; you have to create a brand new class and add a new ‘create’ option to the editor. To do this, just add this either as its own script or inside the dialogue script:

[CreateAssetMenu(fileName = "New Dialogue", menuName = "Dialogue/New Dialogue")]
public class DialogueData : Dialogue { }

Once you’ve done this, right clicking inside your assets folder and selecting ‘create’ will now bring up a new menu!

Beautiful! Now once the instance is created, I’d recommend folders for each of your rooms to help keep track of conversations. In this example, I’ve created a folder for all of the conversations held in the living room.

Once you have your conversations, you can now start writing your dialogue. Note that you are still able to implement your own markup for these pieces of text as if they were done in a DB format.

Reading the Text

There are many implementations and areas of customisation for reading in the text so I’ll just be covering the bare bones here.

Create a class called TextHandler and add a new public function called LoadDialogue, passing in both the dialogue object (so that we can access its data) and the sender object itself (useful if we want to send events back to that object).
Additionally, create a variable at the top of the script where we can cache a reference to the dialogue object for use in other functions.

Create another function called LoadText; this will take the index of our ‘next’ dialogue option. We’ll want to call LoadText(0) inside our LoadDialogue function to start each interaction at its first index.

Here you can access the content of the object by accessing the reference created earlier and using ‘next’ as the index.

Tip: Using a vertical layout group and enabling/disabling the objects will automatically format them to the number of responses you have.
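Putting those steps together, a minimal TextHandler might look something like the sketch below. The UI fields and the button wiring are my own assumptions rather than the original implementation, so adapt them to your scene:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class TextHandler : MonoBehaviour {
    public Text messageText;         // displays the NPC's current line
    public Button[] responseButtons; // children of a vertical layout group

    private Dialogue currentDialogue; // cached reference for other functions
    private GameObject sender;        // whoever triggered the dialogue

    public void LoadDialogue(Dialogue dialogue, GameObject senderObject) {
        currentDialogue = dialogue;
        sender = senderObject;
        LoadText(0); // start each interaction at its first index
    }

    public void LoadText(int next) {
        if (next == -1) {            // -1 is our default 'exit' value
            gameObject.SetActive(false);
            return;
        }

        Message message = currentDialogue.messages[next];
        messageText.text = message.text;

        // Enable one button per response; disabled buttons collapse
        // automatically thanks to the vertical layout group
        for (int i = 0; i < responseButtons.Length; i++) {
            bool used = i < message.responses.Length;
            responseButtons[i].gameObject.SetActive(used);
            if (!used) continue;

            Response response = message.responses[i];
            responseButtons[i].GetComponentInChildren<Text>().text = response.reply;
            responseButtons[i].onClick.RemoveAllListeners();
            responseButtons[i].onClick.AddListener(() => LoadText(response.next));
        }
    }
}
```

The cached `sender` reference is what you would use later when firing a response’s trigger event back at the player or NPC.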

Now that we have our text-complete dialogue objects and TextHandler, we need a way to assign them to each object.

Create a class called Interactable and add references to both the TextHandler and the Dialogue object. In my case, I added a little floating speech bubble to show that the object is interactable.
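As a sketch, the Interactable component can stay tiny (the `Interact` entry point here is my own assumption; hook it up to whatever input handling you use):

```csharp
using UnityEngine;

public class Interactable : MonoBehaviour {
    public TextHandler textHandler; // the scene's shared dialogue UI
    public Dialogue dialogue;       // this object's conversation asset
    public GameObject speechBubble; // optional 'interactable' indicator

    // Call this from your input code when the player interacts
    public void Interact(GameObject player) {
        if (speechBubble != null) speechBubble.SetActive(false);
        textHandler.LoadDialogue(dialogue, player);
    }
}
```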
You’re all done! You can now start writing dialogues in seconds and customising it to your heart’s content <3


Looking for an active #GameDev community? Join our partnered Discord community, GameDevNetwork!

[Hackathon] DocChARt [Winner]

MedHack: DocChARt
University of Sheffield
Organisers: Tzen Szen, Jeremy Chui, Julian Ow, Med Tech Society
Team:  Jannah Shaffie, Wai Ching Lin, Andrew Nguyen, Liam Sorta, Dorian Brown. (Respective order of team image below)


The incredibly diverse team behind DocChARt

I had the pleasure of spending this weekend at the University of Sheffield for the first iteration of HackMed, a hackathon aimed at integrating Medical students (Jannah, Wai Ching and Andrew) with Computer Science students (myself and Dorian).

Before I even sat down, I mentioned some of the capabilities of AR and how such technology is brilliant for instant visual feedback. An issue raised was that of the time wasted searching for reports, patient history and various other forms of diagnostic information on patients as a doctor enters the room. Our solution would replace this by simply sticking a tracking image unique to the patient above their bed; this would trigger multiple windows to appear and display important information such as vitals, allergies, current medication, etc.

Short video I made showcasing the app with some overly positive music

An added bonus of this would be that regardless of where the doctor is based in the hospital, provided they have access to the tracking image, this information can be pulled down.

We made sure to spend some time thinking of the perfect punny name for our AR creation, as this was an obvious priority for us. DocChARt, as suggested by Andrew, was too good not to use.

After spending a solid hour trying to work out why tracking wasn’t working (because I hadn’t enabled a package), I began development on the app and experienced the joys of working with UI elements and positioning them in a real space.

We also created an actual hospital room and fake patient to demonstrate this. I’ll take a moment for this because this is arguably more innovative than the collective works of Tesla.

  • The patient’s tracking image was attached to a folded strip of paper, allowing it to be hot-switched out.
  • Our patient was a whiteboard marker with a sad face drawn on a rolled strip of paper.
  • The bed was a whiteboard rubber with a layer of tissue wrapped around it, then another layer, folded at the top for the ‘cover’.
  • Wai Ching did some magic and added a WINDOW with the head of a transparent plastic spoon.
  • Support beams were placed around the model using dismembered spoons/forks.

All of this was then placed on five plastic cups to give us a bit of extra height to prevent any unnecessary background noise during the image capturing.

Our hospital room with our AR-placed elements around the room. Please take note of my hilarious hospital-themed name for Katniss

Diego thankfully managed to create a great-looking UI so that the app would be something other than solid-coloured backgrounds, as is the case with most of my attempts at ‘design’. He also created an iPhone app with a Touch ID login system so that we wouldn’t need to use my password field system, something that’d be quite clunky in a practical environment.

We totally got 1k downloads, don’t question it.

To make sure the demo was as realistic as possible, I implemented some live vital charts for our left panel. In practice this data would be fed from a server, but I wanted to simulate it without using any costly functions, as this would be running every frame. Because of the costly nature of the HoloLens, the app would also need to run on mobile, hence the need for efficiency.

After some input from both Wai Ching and Andrew, I managed to generate realistic looking patterns for the various kinds of data ECG machines would typically provide.
I scrapped my original system for something less costly, inspired by a tutorial we worked through in my first year at university for repeating backgrounds.

Efficient Cheating 

To do this, I created a flat 2D image whose wave started and ended in the dead centre. I then wrapped this around a quad and, almost like magic, I was able to adjust the material offset to simulate the waves, all without using any expensive functions. The randomiser also has min/max and delay options, allowing graphs such as the blood pressure to move more slowly.
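A rough sketch of that trick, with my own field names (the original implementation isn’t shown in the post): the quad’s material offset is scrolled each frame, and a randomised speed within a min/max range is re-picked after each delay:

```csharp
using UnityEngine;

public class VitalsScroller : MonoBehaviour {
    public float minSpeed = 0.2f; // min/max give each graph its own pace
    public float maxSpeed = 0.5f;
    public float delay = 2f;      // how often a new speed is picked

    private Material material;
    private float speed;
    private float timer;

    void Start() {
        material = GetComponent<Renderer>().material;
        PickSpeed();
    }

    void Update() {
        timer += Time.deltaTime;
        if (timer >= delay) { timer = 0f; PickSpeed(); }

        // Scrolling the texture offset loops seamlessly because the
        // wave image starts and ends at the same point
        Vector2 offset = material.mainTextureOffset;
        offset.x = Mathf.Repeat(offset.x + speed * Time.deltaTime, 1f);
        material.mainTextureOffset = offset;
    }

    void PickSpeed() {
        speed = Random.Range(minSpeed, maxSpeed);
    }
}
```

A lower min/max pair gives you the slower blood-pressure-style graph; no per-frame mesh or texture generation is needed.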

And our project wouldn’t be complete without a website. With a background mostly in medicine, Jannah amazingly put together a beautiful site for us in less than 24 hours:
It was a joy to work with such a talented, diverse team; this has easily been one of my most enjoyable projects to work on to date.

That said, this post wouldn’t be complete without a swag-shot:

[Hackathon] business_cARds

MLH Prime: European Regionals: business_cARds
Bloomberg HQ London
Organisers: Tim Fogarty, Major League Hacking
Team: Liam Sorta, Tom Goodman, Arvydas Kubilius

Photo credit: Harshpal Singh Bhirth

Last weekend I was fortunate enough to attend the final MLH Prime event in the beautiful Bloomberg London office. One thing I do need to give immediate kudos for is having the coolest wristbands of any hackathon I’ve been to; I’m a sucker for fabric bands.

For Prime we wanted to take a more business-focused approach for our hack, incorporating practicality with extensibility. Looking to use AR as a form of disruptive tech, we came up with a few concepts ranging from map assistants to projectable keyboards. However, we settled on the idea of augmented reality business c…ar…ds. One day I will work on an AR project without punning up the title, one day…

The app:
A company has a custom card printed that will be used for every employee. Each card has a small section with a unique ID (our demonstration used names; in practice this would be a randomly generated string of characters) that can be entered into a secret login page.

What I love about this is the idea of these cards being a mini secret portal into a querying database, something about that concept brings out the RPG gamer in me I suppose.

Once an ID has been entered into the top secret command console terminal portal, a user will pop up with a fancy animation around them.

We originally planned to have an online admin panel for business owners to add employees; we’d then send a request to the database we’d be checking (Bloomberg in this case) along with the user ID, and the information would be returned to us as JSON. But as with many hackathons, those 24 hours are tough to manage. I cheated a little and created a user class that holds the data of their card in public fields, tied up with a few empty game objects in Unity.

I request that you don’t look at the GitHub repository above, hindsight is a powerful trait that was not present in the late hours of the event.

Being part of MLH Prime was a blast: we made something relatively polished, saw some incredible hacks and, most importantly of all, got to meet so many new and interesting people (the two free t-shirts were nice too).

Photo credit: Ollie Favell

Oh, and the unlimited free blueberry drinks were great too.

Join me next week where I attend DesperatelyInNeedOfAHaircutHack.

[Hackathon] WizARds [Winner]

Before I get into the project, I’m proud to say that this is the first hackathon I’ve attended without any sleep, so, a round of applause to me. I’m very, very sorry.

Brumhack 6.0: WizARds
University of Birmingham
Organisers: Poppie Simmonds, Dan Jones
Team: Liam Sorta, Nathan Dewell, John Paul Ibanez, Scott Brynnink

Photo credit: Filippo Galli

In the ever-expanding world of Virtual/Augmented Reality technologies, we wanted to create something that would show off how quickly an application could be put together with impressive results.

Any fans of the Yu-Gi-Oh franchise will appreciate how cool it was seeing projected monsters actually fighting, and will remember especially wanting one of their own. Sadly, I never did manage to create a full-scale monster fighting stadium in my backyard.

Our group, being collective fans of card games, had all wanted to experience virtual monster battles too.

We set out to create a relatively small AR wizard fighting game.
Here are a few of the features we managed to include in just the 24 hours:

  • Three ‘types’, Fire, Water, and Earth (think pokémon) that would form a combat triangle, allowing for more strategic gameplay
  • 6 playable characters
  • A basic attack and two special abilities for each character
  • Live health, displayed alongside the character
  • Attacks bound to character rather than as an overlay
  • Active player detection
  • Round management for turn-based fights
  • Live hot switching!! Swap out a character at any time

Through the magic of the AR framework Vuforia, we were able to implement tracking fairly quickly, leaving most of the work down to the actual mechanics and player systems.

For Vuforia to track an image well, there must be a high amount of detail and contrast in the image. For example, compare these two cards:

Fig A: No background, basic model and stats
Fig B: Background to both model and card

Due to the tracking data being based on the grey scale histogram of the image, it’s imperative that there’s sufficient variation in tone/texture.

Another issue I ran into was determining which player was which. Originally, the plan was to allow a battle to take place at any angle, with repositioning at any point. This poses the problem of detecting who owns which character; there would be no way to determine this without verifying each player’s “deck” at the start of every game, adding a delay to every match.

To circumvent this, I assigned ownership of the characters based on world position. This generic assignment meant that there was never an official owner of a model, resulting in the need for a constantly static playing field.

Adding a few world-positioned canvases to the characters was the only other step and bam, we have summonable models!

Round management was pretty simple after this: it ensures there are two players on the field, waits for both of them to select an attack move and fires away. If a player selects a move and then ‘disconnects’, their status is revoked and the turn is cancelled, preventing the other player from getting any free hits in.
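That loop can be sketched roughly as follows; the `Wizard` class and its fields are hypothetical stand-ins for our character components, not the actual project code:

```csharp
using UnityEngine;

// 'Wizard' is a hypothetical character class for this sketch
public class Wizard : MonoBehaviour {
    public bool IsTracked;        // set by the image-tracking callbacks
    public int selectedMove = -1; // -1 means no move chosen yet

    public void ExecuteMove() {
        // play the attack animation / apply damage here
        selectedMove = -1;
    }
}

public class RoundManager : MonoBehaviour {
    public Wizard playerOne;
    public Wizard playerTwo;

    void Update() {
        // Both characters must be on the field (tracked) to progress a round
        if (playerOne == null || !playerOne.IsTracked ||
            playerTwo == null || !playerTwo.IsTracked) {
            // A 'disconnect' revokes status and cancels the turn,
            // so the remaining player gets no free hits
            if (playerOne != null) playerOne.selectedMove = -1;
            if (playerTwo != null) playerTwo.selectedMove = -1;
            return;
        }

        // Wait until both players have selected an attack, then fire away
        if (playerOne.selectedMove != -1 && playerTwo.selectedMove != -1) {
            playerOne.ExecuteMove();
            playerTwo.ExecuteMove();
        }
    }
}
```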

My team and I presenting WizARds post-hack

This project was a bunch of fun to work on and came out looking somewhat polished. We were honoured to win gold sponsor Capgemini’s prize for innovation, receiving a fancy new Raspberry Pi in the process. Here’s an awkward photo:


[Hackathon] Where/Werewolf

Brumhack: Where/Werewolf
Curzon Building, Birmingham
Organisers: BCU HaCS Society
Team: Liam Sorta, Nat Baulch-Jones‎, Edward Evans
GitHub: (Can’t get on the repo </3)



As you may have guessed, I’m not a huge fan of web dev. Alas, this hack I bit the bullet and worked on (or at least attempted to) front/backend web development.

Our original concept was a site that made organising the popular party game “Werewolf” easier. A visitor could either create or join a lobby, at which point the admin of the lobby would start the game and distribute cards to all players, each player revealing their card by holding down on the card back.

Admins would declare players as dead, track usage of class-specific abilities and even restart the game if necessary.
You read ‘would’.
You know where this is going.

We didn’t quite manage to pull this off and lost one of our teammates to exhaustion come night.

Quick! Pivot!
There’s an hour left, what do we do!?


And so from the werewolf’s ashes, Wherewolf arose.

Let me play through a scenario with you.
You’re walking home, it’s a quiet yet peaceful night, a light warm breeze drifting by.
You walk down the nature reserve path SUDDEN, WOLF

You roll for dexterity, you roll 1, campaign over.

Imagine if you had a site that could check if there were wolves in your area!

The site we created in the final hour asks for a postcode input and returns your likelihood of encountering a wolf. 

I never thought there would be as much data online regarding wolf population density as there is, but I suppose it’s natural that others share my concern of spontaneous wolf attacks.

In all seriousness, this was a great chance for me to educate myself on WebSocket programming and front/backend development. It’s also a great example of why, even if you out-scope yourself at a hackathon, it’s always worth making the best of the situation and demoing regardless.

Oh, and IHateWeb

Global Game Jam 2017

BCU – Global Game Jam

For the last five years, Birmingham City University has opened its gates as host to the annual Global Game Jam, a 48-hour game development event aimed at bringing anyone with a passion for games together to create a game from scratch. The diversity the jam promotes speaks volumes through its 36,000 jammers, 7,000 produced games and the involvement of 95 different countries this year alone.

I had the pleasure of organising this year’s event on behalf of the Computer Games Technology course. While I have planned other events in the past, the GGJ is particularly special to me as it provides a platform that gives developers of all abilities the chance to collaborate and produce something they can be proud of in just a weekend.

Being surrounded by an ocean of other passionate developers offers a unique opportunity to get instant feedback from those in similar roles but with a range of diverse backgrounds. Whether there is an issue with your code, a creative block or you’re just particularly curious about something, there is always someone around who can offer their input.

While news spreads across the university year by year, the quality of our events attracts students from around the country. From Nottingham to Cambridge, from Reddit to Twitter, the event has spiked in popularity this year, reaching a record-breaking 100 attendees and making this by far the most attended GGJ event hosted at BCU.

There was a consistently positive atmosphere, partially lit with the excitement of glow sticks and neon bracelets, even with the wave of exhaustion hitting those attempting to ironman the 48 hours without any sleep at all.

We also had the honour of receiving Jake Parker and Adam Kaye from fishinabottle, as well as Andrew Hague from Very Good Friend. Having their expertise in the judging process was a great asset and allowed our developers to ask questions, receive feedback on their games and expand their professional networks. See the bottom of this page for links to their studios.

For those of us who were awake, we also conducted live interviews, even being featured on the official Global Game Jam “” channel. It was great to see the live progress of the games being produced and the differences in workflow between teams.

After a fantastic weekend of game development and collaboration, I’m incredibly proud of how well the event turned out. It wouldn’t have been possible without the help of Dr Wilson, our volunteers, and our fantastic guest judges from fishinabottle and Very Good Friend. Most importantly, however, we managed to cultivate a space where our many attendees could work creatively with the support of a local development community.

Here’s to GGJ 2018!


Very Good Friend:


Frequency Analysis

A friend of mine recently released a demo for his new studio’s second title, Heartbound[1]. Aside from being a fantastic developer, his cryptographic prowess has secured him multiple DefCon Black Badges[2][3].

Along with the release of the game, he created an ARG puzzle that you can check out here:
Spoiler Warning: If you’d like to work through the challenges yourself, I’d avoid reading the rest of this article.

One of the challenges requires the usage of the runic glyphs as seen on multiple pages throughout the ARG, though I wasn’t sure as to how I would actually apply them.

My initial thoughts led me to believe it was simply a public font, and many hours of googling resulted in even more confusion.

A series of books titled “Our Mathematical World” recently launched here in the UK. One of the cryptography-focused issues brought up the usage of frequency analysis to decode Caesar ciphers along with unknown foreign text, at which point I immediately thought back to the puzzle. The book even provided this neat little table:

We’ll start off with this sample of all glyphs and start searching for the most frequent ‘characters’. To narrow this down further, we can check the glyph positions and reference them with

The site identifies the most common first letter in the English language as ‘T’. Under this assumption, I went with what I’ve coined as the ‘table’ glyph. I’ve removed the background and marked our guess.
Luckily there are a couple of two letter words on the right set of glyphs, this makes it easier to narrow down possible characters.
The most common second letters are as follows: h, o, e, i, a, u, n, r, t. As, at least to my knowledge, ‘th’ isn’t a word, we can assume that the F glyph is ‘o’ to form ‘To’.
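The initial counting pass is easy to script rather than do by hand. Here’s a rough C# sketch; the transcription of glyphs to placeholder letters is hypothetical, since the real glyphs aren’t plain text:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class FrequencyAnalysis {
    static void Main() {
        // Hypothetical transcription: each distinct glyph is written
        // down as a placeholder letter before any decoding is attempted
        string glyphs = "AB ABC DBA CA ABD";

        // Count how often each glyph appears across the whole sample
        Dictionary<char, int> counts = glyphs
            .Where(char.IsLetter)
            .GroupBy(c => c)
            .ToDictionary(g => g.Key, g => g.Count());

        // The most frequent glyphs are the best candidates for
        // common English letters such as 'e' and 't'
        foreach (var pair in counts.OrderByDescending(p => p.Value))
            Console.WriteLine($"{pair.Key}: {pair.Value}");
    }
}
```

Counting by word position (first letters, two-letter words) is the same idea, just grouped over the split words instead of the whole sample.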

From here it’s a case of following similar steps, as well as using word finder tools to aid in character assumptions. I’ve also added a key to help us form words later on.

‘th’ being by far the most common two-letter combination made eyeing up certain words and assumptions much easier, providing a larger sample size for future referencing. Infuriatingly, there isn’t enough data to complete a custom alphabet, much to the dismay of my inner completionist.

It was a super fun topic to look into, though these days frequency analysis has lost most of its purpose in the realm of cryptography, bearing relevance only to the odd archaeologist or computer science class.

Bonus! My original messy walkthrough of the puzzle


Microsoft Student Partner Worldwide Summit

At the start of my second year at University I was informed of an opportunity to become a ‘Microsoft Student Partner’ (MSP). A couple of weeks after filling in my application I was ecstatic to hear I had been accepted into their programme. While I didn’t at all expect the journey this would take me on, it would be one that changed far more than a single new line on my CV. 

The Snowball Effect

In many scenarios, being offered the position is the easy part. If you want that role to be personally rewarding, then take advantage of any and all opportunities that present themselves.

Throughout the year I applied for every meeting, webinar and corporate function available. Why? Every event is an opportunity. Being successful in any role is all about being active. Networking is an essential skill in the age we live in; if you’re in a room full of experts in your field, you cannot afford to sit idly by and enjoy the complimentary buffet without leaving with a new contact.

I threw myself at the programme and did everything within my power to make my time as part of it as successful as possible. In addition to the various events I attended, I organised and ran both workshops and tutoring sessions, aided with the organisation of two university hackathons and secured sponsorship for the yearly Global Game Jam competition – a collective reach of ~250 students.

Along with the active involvement and initiative from everyone involved in the newly founded hackathon society, we managed to take Birmingham City University’s hackathon involvement from little to no presence to #5 in Europe based on the MLH Seasonal Standings. I was later given the position of Student Partner Regional Lead as well as the additional role as chairman of the Microsoft Sponsorship Committee.

Attending the Worldwide Summit

It was an incredible honour to have been chosen as one of the four UK MSPs to attend the summit; before the feeling had really hit me, I was in the air on a nine-hour non-stop flight to Seattle, Washington.

After arriving at the dorms I’d be staying in, I had the chance to talk to some of the other MSPs, a few of whom had been travelling for nearly double the duration I had! Being able to see some of the incredible work being done by student partners across the world, and to soak up the overall atmosphere, was a remarkable feeling. Everyone participating was enthusiastic and had a passion for what they did.


Now, as one may come to expect from a games programmer, the HoloLens is something I’ve been following since its inception. I thought my chances of actually getting my hands on it were next to none. Heck, even most of the staff at Microsoft hadn’t tried it yet!

Side note: If you haven’t seen or heard of the HoloLens yet, I’d recommend watching this short video; I couldn’t do justice to such a beautiful piece of technology in a couple of sentences.


You’ve probably already guessed from the buildup I’ve written that I did indeed get to test-drive the HoloLens. Not only that, we were able to do so without an NDA, in a development setting. After entering a room I can only describe as a technologist’s paradise, we were given an open Unity project, followed a few steps and before long had networked player avatars that other people could shoot!

Despite best efforts, I was not given permission to smuggle one out with me.


If my organisation/participation history with hackathons has taught me anything it’s that if there’s a group of inspired individuals and an IDE, good things are going to happen. Following on from an IoT (Internet of Things) talk, a room full of hardware and open terminals awaited us. Utilising the kits, Azure and a bit of code, groups of four were tasked with creating a football robot to compete with other MSPs to win an Xbox One.

Photo Credit:

The best part of this, however, was the day after, when all MSPs attended a special event at the local school, Garfield High School. We roamed the hall, assisted the budding young students with their creations and showed them some of the possibilities that IoT can offer. A programmer will often be asked “How do you know that?”. While helping one group in particular, someone instead asked me, “What do I need to do to know that?”, and that passion is what fuels a great learner. There wasn’t a single student in that gymnasium without ambition. Granted, some of that ambition may have been at the prospect of winning an Xbox, but ambition nonetheless!

Imagine Cup World Finals

As an Imagine Cup National Finalist myself, I can greatly appreciate the effort and commitment of every competitor involved. After listening to an introductory speech from Microsoft figureheads Steven Guggenheimer (Chief Evangelist) and even Satya Nadella (CEO) himself, we were left with the following: “We can imagine solutions that to date have not been possible”, a quote that is a true testament to the potential modern-day technology has to offer.

The Imagine Cup has three categories, each with its own set of criteria; the winners of each were as follows:

Games: PH21 – Timelie

PH21 of Thailand created a stealth puzzle game with an unpredictable storyline and an unusual gameplay system. The player controls Marza, a mysterious woman who has stolen a device with the power to see the future, and Alpha, a little girl who has the ability to manipulate time. The two characters must cooperate to overcome obstacles and find the best route for each puzzle.*

Innovation: ENTy (Overall Winner)

The ENTy team of Romania created a high-tech wearable device that tracks the balance of the internal ear and checks the spine posture in real time. The device is the size of a door key and is worn on the back or head and can detect inner ear problems and other data that can be useful to doctors in diagnosing patients.*

World Citizenship: Amanda

The AMANDA team of Greece developed a gamified virtual reality app to help combat bullying. It puts users in 3D interactive scenes, many involving bullying, to gauge their responses and focus on boosting empathy, awareness and self-esteem. The project is named for Amanda Todd, a Canadian teen who made a video detailing her experience of being bullied before taking her own life in 2012.*



Being an MSP has been an invaluable experience both educationally and professionally. The programme very much supports a phrase I’ve been mindful of since starting University, “You are responsible for creating your own success”. With dedication, discipline and a true passion for what you do, anything is possible.


Special Thanks: Dr. Thomas Lancaster, Sunbir Alam, Lee Stott, Andrew Webber, Dr. Andrew Wilson, BCU: HaCS

Global Game Jam 2016

Every year Birmingham City University hosts the Global Game Jam (organised by Andrew Wilson), a worldwide event that poses the challenge of creating a working game in just 48 hours. The nature of these kinds of events mirrors the ‘crunch time’ effect experienced by many game developers working to hit milestones and release deadlines.




Being given this opportunity gives students a glimpse into industry demands and encourages them to work on their team-building skills, as well as to correctly anticipate the scope of their project. It’s also a brilliant opportunity for students interested in game development (regardless of their course) to meet industry professionals and discuss new ideas.


(A sea of developers awaiting their food)

Our event ran for 48 hours between the 29th and 31st of January, with an astounding 60+ attendees, the best ever turnout for a uni-held GGJ! Microsoft were generous enough to sponsor our event, which took care of all of the refreshments and snacks; be sure to check out their cloud platform, Azure!

Participants also had the opportunity to meet and chat with industry experts from Rare, Escape Technologies and Radiant Worlds. Having this opportunity opens up the potential for internships or even graduate jobs, and it lets students experience professional networking first hand.


(My exhausted self alongside my team, who are also falling asleep)

I personally only slept for three hours between 6:30am on the 29th and 11pm on the 31st; needless to say, I was pretty worn out. We did, however, place second for game studio Rare’s pick, which was a great honour.
Our game was called Secret of the Elements, a NES-style adventure game that tasks you with defeating each element (Fire, Water, Wind and Earth) to unlock a portal to a monster they had kept locked away.
You can check out our game here: (Expect bugs!)