January 05

Brain Hackathon: Towards becoming the cyborg you always wanted to be.


Ever wondered what the future looks like? I’m pretty sure it involves brain machine interfaces (e.g. the kickoff at the 2014 World Cup in Brazil). Listen to my audio story about a brain hackathon, where nerdoscientists™ get together to create new brain machine interfaces!



NARRATION: Imagine a world in which you could play a video game without a controller, or a world in which your phone could send you reminders to calm down when you are feeling out of whack. Well, you don’t really have to imagine, because technologies that allow signals from your brain to communicate directly with computers to do something useful for you already exist, albeit in early stages. These technologies are called Brain Machine Interfaces or Brain Computer Interfaces, BMI or BCI for short. While BMIs were first conceived in the 1970s, the possibilities of these interfaces are just now coming to fruition. Exploring those possibilities is the main purpose of a brain hackathon, an event where developers, neuroscientists, and computer scientists get together in small teams to brainstorm and demonstrate new functionalities for these devices. I visited a hackathon this September held in downtown San Diego at the office of the startup incubator EvoNexus. It looked just as I’d imagine a Silicon Valley office to look, with lots of large whiteboards covered in scribbles, oddly shaped couches, and the requisite ping pong table. I arrived at 9 in the morning on a Saturday as participants trickled in and found their workspaces.

[Crowd Chatter], [Knocking]

ADMIN: Good Morning.  We are having an introduction over in the stage area.

[Crowd Chatter]

NARISA: So my name is Narisa Chu and here is Tim Mullen; we are both the organizers for this hackathon. I am here to welcome you. Come quickly! Ok, there is a team from Taiwan. Welcome, welcome, they have traveled 6,000 miles or whatever.



NARRATION: 16 teams gather. While many teams are local, from UCSD or the Salk Institute, others have flown in from Florida, Chicago, and even Taiwan. Let’s meet a few of the teams.

ME: What’s your team name?

TEAM 1: Three TSPs. Yeah, so it’s three because we are three members, and T is for Terrin, S is for Semir, and P is for Pam.

TEAM 2: Brain nyo, brain n-y-o

TEAM 3: Goblin.  

TEAM 4: Our team name is pretty creative. It is team SPAWAR. So SPAWAR is a Navy organization in San Diego, the Space and Naval Warfare Systems Center.

ME: Do you guys work for them?

TEAM 4: Yeah, so, we work for the Navy.




NARRATION: I talk to Narisa about the story behind this event.

NARISA: So I am on the board of governors for IEEE (Institute of Electrical and Electronics Engineers). IEEE started this initiative focusing on brain technology, brain science and engineering, and all applications associated with that. So I made a proposal to run a hackathon, and we liked the idea of coming to San Diego to do that because SD is one of the major hubs of biomedical engineering, healthcare and communications technology, defense, you name it.

NARRATION: San Diego is also the home of Qusp, the company that the other leader of the hackathon, Tim Mullen, founded. They develop tools for processing brain signals; tools that can be used in brain machine interfaces. Tim Mullen explains the purpose of this event.


Qusp’s Tim Mullen explaining BCIs to hackers.


TIM MULLEN: The purpose of this hackathon is to be application oriented, you know, be creative, think about how BMI will be used in the next, you know, decade and the decades to come. Now there is a huge push towards translational applications of all this stuff that has happened in the sciences. What that requires is new technologies that can allow us to take these scientific ideas and put them in the real world. Profound technology should be something you interact with in your natural, organic way, like you interact with the world or with other humans.

NARRATION: This might look like the way you interact with your smartphone, almost like it is an extension of yourself.

TIM MULLEN: For neurotechnology, that has not been the case.

NARRATION: If this technology is to be ubiquitous, we will have to use brain signals that can be recorded non-invasively, with measurements that we call EEGs, or electroencephalograms. A collective firing of many neurons in your cortex, or the outer layer of your brain, can generate signals that can be captured by electrodes placed on your head. With better and cheaper sensors on the rise and more computing power, the use of EEG as a signal for BCIs for the average consumer becomes more of a possibility.   

TIM MULLEN: These are your kind of typical conventional EEG systems. You know, 256, 128, 64 channel montages.

NARRATION: Tim shows off a table full of EEG sensors that are available for the teams to use for their projects, from bulky sensors that look like a bike helmet to small sensors that look like a headset a McDonald’s employee might wear.


Equipment table full of EEG sensors

TIM MULLEN: But, with just a few sensors on the head, you are limited by where they are placed. They need to be very strategically placed to cover the area where the signal is most strongly activated in the brain. And that is because the electric fields of sources in the brain, which propagate up to your sensors, decay very rapidly over distance. EEG is a very noisy sensing system. Artifacts are incredibly problematic: the blinks you create, the clenches of your jaw, the head movements, the sensor moving against the scalp. All of these things are generally an order of magnitude or more larger in amplitude than the brain signal itself.

NARRATION: That is one of the main challenges of this hackathon: choosing the right EEG sensor model and placing the EEG electrodes well. As Tim said, electrical activity in the brain gets filtered through the skull, so that only low frequency signals pass through and are detected by the EEG sensors. The signals that one can detect are usually slow oscillations, or waves, of neural activity. You may have heard of some of these when referring to sleep: delta, theta, alpha, beta, and gamma waves. Another signal that is often used comes from the area of your brain that tells your muscles to move. The motor cortex is set up like a map, so that there is a specific area used to move a hand versus a foot. Then you can just think about moving a hand or foot, and the EEG can capture that information to move something else, like an artificial limb or a drone. Okay, so now you have an idea of what signals and tools the teams can work with. Let’s get back to the rules of the hackathon.
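For reference, the conventional frequency ranges of the waves just mentioned can be jotted down in a few lines. The exact boundaries vary a bit from source to source, so treat these as approximate:

```python
# Approximate EEG frequency bands (Hz); boundary conventions vary by source.
EEG_BANDS = {
    "delta": (0.5, 4.0),    # prominent in deep sleep
    "theta": (4.0, 8.0),    # drowsiness, some memory tasks
    "alpha": (8.0, 13.0),   # relaxed wakefulness, eyes closed
    "beta":  (13.0, 30.0),  # active concentration
    "gamma": (30.0, 100.0), # fast, higher-order processing
}

def band_of(freq_hz):
    """Return the name of the band containing a given frequency, or None."""
    for name, (lo, hi) in EEG_BANDS.items():
        if lo <= freq_hz < hi:
            return name
    return None
```

So a 10 Hz oscillation falls in the alpha band, while a 2 Hz oscillation is delta.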



NARISA: Presentations start at four o’clock the next day. Every team is given at most 5 minutes, so you are given a very limited time, and then two minutes for questions and answers. We have three judges with excellent backgrounds, with really balanced fields of expertise.

NARRATION: Prizes range from 300 dollars for third place to 1,000 for first, but to be honest, all of the participants I talked to were more driven by the challenge of using the EEG equipment, by their love of playing with data, and by the curiosity of seeing what they could create within the constraints of the contest.

TIM MULLEN: Go forth and hack. [Trumpet Sound]


UCSD undergraduates working on an EEG controllable prosthetic hand.


NARRATION: And with that triumphant call to action, people shuffle back to their meeting rooms to start their projects. In the meantime, let me explain the other challenges that the teams will face. The EEG sensors are sampling, or collecting data points, two hundred times a second. First, that data stream is likely filtered for unwanted noise, and pesky artifacts are removed. Next, the data can be broken into time windows and transformed so that the strength of different oscillations can be detected. After that, you can compute all sorts of measures that might be important for your task, like the correlation of signals coming from two electrodes at any given time.
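To make the windowing-and-transform step concrete, here is a toy sketch, my own illustration rather than any team's actual code: a naive discrete Fourier transform in pure Python that estimates how much power a one-second window, sampled at the 200 samples per second mentioned above, carries in a given frequency band.

```python
import math

FS = 200  # sampling rate in Hz, as described in the narration

def band_power(window, fs, f_lo, f_hi):
    """Crude power estimate in [f_lo, f_hi) Hz via a naive DFT of one window."""
    n = len(window)
    power = 0.0
    for k in range(1, n // 2):  # skip DC; use positive frequencies only
        freq = k * fs / n       # frequency of the k-th DFT bin
        if f_lo <= freq < f_hi:
            re = sum(window[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-window[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

# A one-second synthetic window containing a pure 10 Hz "alpha" oscillation
signal = [math.sin(2 * math.pi * 10 * t / FS) for t in range(FS)]
alpha = band_power(signal, FS, 8, 13)   # large: the signal lives here
beta = band_power(signal, FS, 13, 30)   # near zero: no energy in this band
```

On this synthetic input nearly all the power lands in the alpha band. A real pipeline would use an FFT library and proper filtering rather than this brute-force loop, but the idea, slicing the stream into windows and measuring oscillation strength per band, is the same.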

But even after all of this data crunching, there is still another step that can take a lot of work. You might not inherently know which measurements, at which electrodes, over what timescale correspond to some state you are interested in. Let me give you an example: you want to know when Gina is thinking about sailing. To figure out what information correlates well with thoughts of sailing, your best bet is to train a computer algorithm in a process known as machine learning. To get started, you ask Gina to think of a sailboat and you feed your EEG measurements to the computer with a flag letting it know that this is the signal that correlates with thoughts of sailboats. Then you ask Gina not to think of a sailboat, anything else but a sailboat, and give that EEG data to the computer, this time telling it that you are not looking for that signal. Sailboat, not sailboat, sailboat, not sailboat. With enough training data, the computer algorithm can figure out which measurements are most important for predicting whether Gina is thinking of a sailboat. Now you are off to the races, and you can use your EEG and computer algorithm so that every time Gina thinks of a sailboat, the song “Come Sail Away” by Styx starts playing full blast from her iPhone.
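The sailboat/not-sailboat training loop can be caricatured with a nearest-centroid classifier, one of the simplest machine learning methods. Everything below is made up for illustration: the features are pretend per-trial alpha and beta band powers, and real BCI classifiers use far richer features and models.

```python
import math

def train_centroids(labeled_features):
    """Average the feature vectors for each label ("sailboat" / "not")."""
    sums, counts = {}, {}
    for label, vec in labeled_features:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Predict the label whose centroid is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda lab: dist(centroids[lab], vec))

# Made-up training trials: (label, [alpha_power, beta_power])
training = [
    ("sailboat", [6.1, 1.9]), ("sailboat", [5.8, 2.2]), ("sailboat", [6.4, 2.0]),
    ("not",      [2.0, 5.9]), ("not",      [2.3, 6.2]), ("not",      [1.8, 5.7]),
]
model = train_centroids(training)
prediction = classify(model, [6.0, 2.1])  # a new, "sailboat-like" trial
```

Each flagged trial ("sailboat" or "not sailboat") nudges that label's average feature vector; a new trial is then assigned to whichever average it sits closest to, which is the essence of what the training process described above accomplishes.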

Well hey, that’s pretty cool, but how are you going to do all of that computer processing without a souped-up PC by your side at all times? The iPhone definitely can’t handle that... yet. Use the cloud! Let the servers in data centers that exist who knows where do all of the math, and then let them send you the answer. Aah, the power of the cloud. If this all sounds easy to you, it’s not, and the brain hackathoners are well aware of this; they spend the next day and a half pooling collective wisdom, creativity, and grit to put together a presentation for the judges. I follow a judge around to hear the pitches.



TIM MULLEN: So as soon as Jordan is ready we will start the clock; just let us know when you are ready to begin.

TEAM 1: A major problem is matchmaking. And people are not truthful; there is always an implication when you meet someone: what do they think, what do they want. And so how can we use this kind of brain recorder to actually bypass that? So how about we actually show images, we see how people react emotionally, and we actually match people that have similar correlations.

TEAM 2: So imagine you are on a jog and you are listening to a playlist and you come across a song you don’t like. What if you could change the song you are listening to by using your brain waves? That’s the idea behind MindPlayer: to use EEG while you listen to songs and to tell if you want to skip to the next song, based on your displeasure or on the song not being what you expect.

[Bollywood music starts playing]

TEAM 2: It was supposed to skip but it didn’t.

JUDGE: Well, he likes the song.

TEAM 3: We created a web app that monitors and reacts to a user’s level of focus during a work session. BCI focus is designed as a tool for both real time distraction mitigation and longitudinal focus training.

TIM MULLEN: Ok, that’s your time. Alright, guys.

NARRATION: Examples of other ideas include a sonic game where doors to new soundscapes are opened using the mind, an app that uses responses to pictures of food to narrow down Yelp restaurant choices, and an alert system for when someone has an epileptic seizure. With 16 pitches in all, the judges have a lot to discuss.


TIM MULLEN: So just a quick note, as the judges mentioned, it was a really deliberated process. Very competitive projects, a lot of creativity and innovation.

NARRATION: Goblin, a group of researchers from the UCSD Temporal Dynamics of Learning Center, takes first prize, impressing the judges with a two-player tug-of-war game driven by the players’ mastery of their brain waves. But for Team Goblin this is just the start. They want to use BCI games to help them study how kids process language.

TEAM GOBLIN: But you know, that is the sci-fi end of things; for now this is more practical.


TIM MULLEN: I just want to leave you guys with one more note. All of the projects here are just the start of something. Please continue with your work. And again, thanks so much, just for participating this last weekend. It has been really great to have you. Thanks.

NARRATION: With that, the hackers disperse into a world that lags behind the future they envisioned here for three days in downtown San Diego. I’m not sure how long it will be before everyone wears EEG sensors like they wear Fitbits, with notifications telling you how focused you were during work, and with computers predicting love interests so that you don’t have to bother with swiping right. But what I do know is that every day and every brain hackathon brings us closer to our cyborg potential. And now I will try to end this podcast using only my brain waves. Gggrrrrrrrrrrr grrrrrrrrr. [Sigh] Well, maybe next year.

This has been a NeuWrite production. Please visit the website to look at pictures from the event and for links to resources about brain hackathons around the world. Content was recorded and edited by yours truly, Margot Wohl, with feedback from the NeuWrite team. A special thanks to Nicole Randall, Narisa Chu, and Tim Mullen for inviting me to the hackathon. Music featured in this segment was by Blue Dot Sessions, Dave Depper, Styx, Greg Steve, and Raggaman.


Additional Resources

Annual hackathon held in Amsterdam in what used to be an anatomy theater!

2017 Hackathon in Dublin – details not there yet though 

Open source software and (relatively) cheap equipment

DIY kit to experience a BMI on the cheap!  Uses EMG signals (from your muscles) instead of EEG.

Join a local NeuroTech club to meet with others interested in BCIs

Ship in the Woods getting creative with EEG signals