brain computer interfaces

nerding out on cs + neuroscience 🧠

welcome to introspection ft. harsehaj! ⭐️ i’m harsehaj, always up to something in social good x tech.

this publication is a place for me to reflect on productivity, health and tech, and drop unique opportunities in the space right to your inbox daily. if you’re new here, sign up to tune in!💌

scroll to the end for my daily roundup on unique opportunities!

on to today’s topic: brain computer interfaces 🧠

ever see those caps with wires sticking out and hooked up to a computer? if you haven’t, this is awkward. if you have, that’s an application of brain computer interfaces (bcis) — one of my favourite fields in tech.

i’ll give a quick overview. :)

a bci is a system that interprets brain signals and translates them into commands that can control external devices like computers, prosthetic limbs, or even smart home systems. this process can be broken down into 4 steps:

1) capturing brain activity 🌊 

the brain generates electrical signals as a result of neuron activity, and bci systems can capture these non-invasively through eeg (electroencephalography) data collection, or invasively through electrocorticography (ecog) or implanted micro-electrodes. each method trades off invasiveness for resolution: eeg reads broad rhythms from the scalp, while ecog and micro-electrodes sit closer to the neurons and pick up finer-grained activity.
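to make this concrete, here’s a tiny python sketch of what loading a raw eeg recording can look like, using the open-source mne library (the file name below is just a placeholder):

```python
# minimal sketch: loading a raw eeg recording with the mne library.
# "recording.edf" is a placeholder -- swap in a real file in edf format.
import mne

raw = mne.io.read_raw_edf("recording.edf", preload=True)

print(raw.ch_names)        # electrode labels, e.g. ['Fp1', 'Fp2', 'C3', ...]
print(raw.info["sfreq"])   # sampling rate: samples per second per channel
```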

2) signal processing 🧹 

the data collected in step 1 typically isn’t usable as-is, so naturally, the next step is to clean it up through preprocessing and feature extraction. first, the “noise” generated by muscle movement, eye blinks, or external interference is filtered out. then, meaningful signal patterns, or features, are extracted and mapped to intent. for example, if an eeg detects increased activity over the motor cortex, the system could infer that the user is trying to move their hand.
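here’s a toy sketch of this step in python, with synthetic data standing in for a real recording: a band-pass filter does the cleanup, then power in the mu band (8-12 hz, which changes with motor activity) is the extracted feature:

```python
# toy sketch of preprocessing + feature extraction on synthetic "eeg"
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 250                                  # sampling rate in hz (typical for eeg)
t = np.arange(0, 10, 1 / fs)              # 10 seconds of signal
# fake eeg: a 10 hz rhythm buried in noise
signal = np.sin(2 * np.pi * 10 * t) + 2.0 * np.random.randn(t.size)

# 1. preprocessing: band-pass filter to keep 1-40 hz,
#    dropping slow drift and high-frequency muscle noise
b, a = butter(4, [1, 40], btype="bandpass", fs=fs)
clean = filtfilt(b, a, signal)

# 2. feature extraction: average power in the mu band (8-12 hz)
freqs, psd = welch(clean, fs=fs, nperseg=fs * 2)
mu_mask = (freqs >= 8) & (freqs <= 12)
mu_power = psd[mu_mask].mean()
print(f"mu-band power: {mu_power:.3f}")
```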

3) machine learning for translation 🤖 

we typically don’t leave the overwhelming work of parsing through thousands of signal windows to humans. instead, bcis make use of machine learning to translate the extracted features into commands.
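as a sketch, here’s what that translation can look like with a simple linear classifier (lda is a common choice in bci work). the band-power features and labels below are made up purely for illustration:

```python
# hedged sketch: training a classifier to decode "rest" vs "imagined movement"
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# pretend features: mu-band power per trial; movement suppresses mu power
rest = rng.normal(loc=1.0, scale=0.2, size=(100, 1))  # label 0 = rest
move = rng.normal(loc=0.6, scale=0.2, size=(100, 1))  # label 1 = imagined movement
X = np.vstack([rest, move])
y = np.array([0] * 100 + [1] * 100)

clf = LinearDiscriminantAnalysis().fit(X, y)

# a new trial with suppressed mu power -> the model infers movement
print(clf.predict([[0.55]]))  # likely [1]
```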

4) executing commands 🦾 

once a command is decided, it’s sent to an external device such as a robotic prosthetic limb or a wheelchair. for example, an eeg headset could detect brain activity related to focus and let a user type without touching a keyboard. ⌨️
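and a toy sketch of that last hop: mapping a decoded label to a device action. the labels and actions here are hypothetical, with print statements simulating the device:

```python
# toy sketch: dispatching a decoded intent to a (simulated) device command

def move_wheelchair_forward():
    print("wheelchair: moving forward")

def type_character():
    print("keyboard: typed 'a'")

# hypothetical mapping from classifier output to an action
COMMANDS = {
    "move": move_wheelchair_forward,
    "focus": type_character,
}

decoded_intent = "focus"    # would come from the classifier in step 3
COMMANDS[decoded_intent]()  # keyboard: typed 'a'
```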

bcis are an exciting intersection of neuroscience, computer science, and engineering. personally, it also feels like our closest reach to science fiction. not to mention, there’s so much potential to do good in the health sector with this technology.


daily opportunity + resource drops 🔍️
