
mArIo

brain

Neural networks are amazing, and they are everywhere: in our phones, in our self-driving cars, and even in our heads[1]. They are loose simulations of the brain; unlike our brains, they are specialized to identify one thing (classification), to predict one thing (forecasting), or to find specific patterns (pattern recognition).

I won’t go into any further detail about Artificial Neural Networks; there are plenty of white papers[2] and blog posts[3] that do. If you are unfamiliar with them, take a look at those links! They are very interesting reads, in my opinion.

I used a basic algorithm myself and created a simple ANN with a fixed structure and no loops. The project was called mArIo[4], written in C++ and Lua. For mArIo I made some…unwise design choices at the start to make results appear faster, namely in how I decided to structure the ANN. I’ll get into that in a little bit; I should introduce mArIo first.

mArIo was an assignment for my AI class at CSU Chico. Other groups decided to take on projects such as solving 16x16 Sudoku, or other ideas using recursion with backtracking or naive Bayes networks. We decided to play Mario. Our professor was skeptical[5]. She had a good reason to be: we had only two weeks to complete the assignment.

We dove in, basically burning through that Thanksgiving break coding and talking over Google Hangouts. Design choices were made in the library a day before break; everything else was completed after. Through a random time-slot assignment, we were chosen to go second, on the Monday we returned. The heat was on.

We ended up utilizing a simple Genetic Algorithm that changes the structure of each Mario’s brain, which was an individual ANN. We would create 4 children from 2 parents, then run all of them. We would evaluate the fitness, which at the time was based only on distance traveled to the right, and then we would save the most ‘fit’ Marios for the next generation. Test, splice, mutate, run, test, repeat.
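To make that loop concrete, here is a minimal C++ sketch of one generation, assuming a two-parent, four-child setup like ours. The names (Genome, evaluateFitness, crossover, mutate) are stand-ins for illustration, not the actual mArIo source.

```cpp
#include <algorithm>
#include <random>
#include <utility>
#include <vector>

// Hypothetical sketch of one generation; names are stand-ins, not the mArIo code.
using Genome = std::vector<std::vector<double>>;   // simplified weight storage

double evaluateFitness(const Genome& g);           // run one Mario, return distance traveled right
Genome  crossover(const Genome& a, const Genome& b, std::mt19937& rng);
void    mutate(Genome& g, std::mt19937& rng);

// Test, splice, mutate, run, test, repeat. Assumes parents.size() >= 2.
void runGeneration(std::vector<Genome>& parents, std::mt19937& rng) {
    std::vector<Genome> candidates = parents;      // keep the parents in the running

    // Two parents produce four children.
    for (int i = 0; i < 4; ++i) {
        Genome child = crossover(parents[0], parents[1], rng);
        mutate(child, rng);
        candidates.push_back(std::move(child));
    }

    // Run every candidate once and remember its fitness.
    std::vector<std::pair<double, Genome>> scored;
    for (auto& g : candidates)
        scored.emplace_back(evaluateFitness(g), std::move(g));

    // Save the most 'fit' Marios as the next generation's parents.
    std::sort(scored.begin(), scored.end(),
              [](const auto& a, const auto& b) { return a.first > b.first; });
    parents.clear();
    parents.push_back(std::move(scored[0].second));
    parents.push_back(std::move(scored[1].second));
}
```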

Our input data came from the 208 registers that the BizHawk[6] emulator used for the emulation. Pulling the information from BizHawk was done using its Lua interface. Each environment object, such as pipes, Goombas, and bricks, was given a unique even enumeration. Mario was given the value of 1. That way the ANN could innately keep track of Mario’s location, it being the only spot that would be odd.
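For illustration, the even/odd trick looks roughly like this; the specific tiles and values below are made up, but the parity idea is the same.

```cpp
#include <cstdint>

// Hypothetical encoding, not the actual values mArIo used.
enum class Tile : std::uint8_t {
    Empty  = 0,   // every environment object gets a unique even code...
    Brick  = 2,
    Pipe   = 4,
    Goomba = 6,
    Mario  = 1    // ...while Mario is the only odd value on screen
};

// The ANN can spot Mario's location just by checking parity.
inline bool isMario(std::uint8_t code) { return (code % 2) == 1; }
```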

So, we took the register data, fed it to the ANN, and got back two commands. We decided on right and ‘A’ to keep things simple at first. Then, through the Lua interface, the output of the ANN was fed into the emulator’s input registers, and Mario would move. This movement created a whole new set of environment data, which was fed into the ANN to generate the next set of movement instructions. The NES version of Mario has nothing random in it, so the input to the ANN is consistent on each run-through.
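Stripped of the Lua plumbing, the per-frame loop looks roughly like the sketch below. readRegisters and pressButtons are hypothetical stand-ins for the BizHawk Lua bridge, and the 0.5 threshold on the two outputs is my own illustration rather than the exact mArIo logic.

```cpp
#include <array>
#include <vector>

// Stand-ins for the Lua/BizHawk bridge -- not real BizHawk calls.
std::vector<double> readRegisters();                   // 208 values out of the emulator
void pressButtons(bool right, bool a);                 // write to the input registers

// Stand-in for the ANN's forward pass: 208 inputs in, 2 outputs out.
std::array<double, 2> forwardPass(const std::vector<double>& inputs);

void playOneFrame() {
    // 1. Read the current environment out of the emulator.
    std::vector<double> state = readRegisters();

    // 2. Let the ANN decide what to do.
    std::array<double, 2> out = forwardPass(state);

    // 3. Threshold the two outputs into the two buttons we allowed: right and 'A'.
    pressButtons(/*right=*/out[0] > 0.5, /*a=*/out[1] > 0.5);

    // 4. The emulator advances a frame, producing new register values, and the
    //    loop repeats. Since NES Mario is deterministic, the same weights always
    //    produce the same run.
}
```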

Through the miracle of compiler optimizations and constant references, the ANN kept up with the speed of the emulator. When we cranked the emulator up to 8x speed, the ANN kept up too. It was a miracle.

Now for my mistakes. These design faux pas were made for a good reason, though! We only had two weeks to combine my ANN with the emulator, so I had to make a short-sighted decision: the ANN was represented as a series of 2D vectors rather than in graph form. This decision had multiple repercussions. For starters, nodes could only be connected to their neighboring layers, instead of being able to connect to nodes living in far-off lands.
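Concretely, that representation boils down to a stack of weight matrices, one per pair of adjacent layers. The sketch below is a reconstruction of the idea rather than the original class, but it shows why a connection can only ever exist between layer l and layer l + 1.

```cpp
#include <vector>

// A fixed, layered ANN stored as 2D weight matrices -- one matrix per adjacent
// pair of layers. (A reconstruction of the idea, not the original mArIo class.)
struct LayeredAnn {
    // weights[l][j][i] connects node i in layer l to node j in layer l + 1.
    // There is simply nowhere to store an edge that skips a layer, which is
    // exactly the limitation described above.
    std::vector<std::vector<std::vector<double>>> weights;
};
```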

The next problem generated by this design was that we could not easily change the structure. That is, the ability to add or remove nodes from a layer, or even to add or remove the layers themselves, was not present. This meant we had a limited subset of mutations available. The only mutations we were able to accomplish were swapping entire rows of weights and changing an individual weight. Although limited, these proved to be minimally sufficient.
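Here is a sketch of both mutations, assuming the layered-matrix structure above; whether we nudged or outright replaced the individual weight, and by how much, is glossed over with an assumed noise magnitude.

```cpp
#include <random>
#include <utility>
#include <vector>

using WeightMatrix = std::vector<std::vector<double>>;

// Mutation #1: swap two entire rows of weights within one layer's matrix.
// (Picking the same row twice is harmless -- the swap is a no-op.)
void swapRows(WeightMatrix& layer, std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> pick(0, layer.size() - 1);
    std::swap(layer[pick(rng)], layer[pick(rng)]);
}

// Mutation #2: change a single weight by a small random amount.
void perturbWeight(WeightMatrix& layer, std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> rowPick(0, layer.size() - 1);
    std::size_t r = rowPick(rng);
    std::uniform_int_distribution<std::size_t> colPick(0, layer[r].size() - 1);
    std::normal_distribution<double> noise(0.0, 0.1);   // magnitude is an assumption
    layer[r][colPick(rng)] += noise(rng);
}
```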

I will pretend to know what you are thinking[7]. “Why, oh why would you make this choice?” Well, I shall tell you. Mutation was extremely easy, just swapping vectors. Error propagation, pre-genetic-algorithm, was easier. I don’t believe we’d have finished the project, and gotten an A, if this decision hadn’t been made.

I have now decided to undertake the task of changing the ANN to a graph-based version. I want mArIo to make it across the level; our results were limited to the halfway point before a bad mutation prevented further advancement. This project will continue past the scope of the class.
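The rough plan for the graph version looks something like this: explicit nodes and edges, so a connection can run between any two nodes and mutations can add or remove structure. This is a sketch of where I’m headed, not finished code.

```cpp
#include <vector>

// Rough plan for the graph-based ANN: explicit nodes and edges, so a connection
// can run between any two nodes and mutations can add or remove structure.
struct Edge {
    int    from;      // index of the source node
    int    to;        // index of the destination node
    double weight;
};

struct GraphAnn {
    int               nodeCount;   // inputs + hidden + outputs
    std::vector<Edge> edges;       // any node may connect to any other
};
```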

By the way, this was a 4-man project. I’d like to thank Brandon Smith, Garrett Swann, and Matt Nachtigal for their work on the project. I believe we will all be contributing more in the future.

When I have finished the graph conversion I will post a new entry with a comparison and my general feelings about the change.

Keep it real,

Matthew

Super Scripts

[1] Well, these are actual neural networks…
[2] whitepaper
[3] other post
[4] Twitch stream. I am unsure if it is still running/active at the time you may be viewing this. Also, clever name, right?
[5] I believe her words were: “Are you sure you don’t wish to try something easier?” and “I don’t think you should do this.”
[6] BizHawk is a great emulator, often used for speedruns. The Lua console is useful for more than just finding hidden environment variables, though!
[7] I used a special, unreleased ANN I’ve been developing to predict thoughts.