Brain-Controlled Interface for the Visually Impaired

By Anshu Soni

About Me


I am Anshu Soni, a former Nanotechnology Researcher at NIT, passionate about leveraging cutting-edge neuroscience and AI to create life-changing assistive technologies. Over the years, I have worked on brain-computer interfaces (BCIs), AI-driven prosthetics, and bio-signal processing. My research focuses on bridging the gap between human cognition and technology, enabling individuals with disabilities to lead more independent lives.

My previous projects include:

1. Neural-Controlled Prosthetics – Developed a prototype prosthetic arm that responds to EEG signals.

2. AI-Powered Hearing Aid – Designed an adaptive hearing device that enhances selective sound processing.

3. Smart Wearables for Health Monitoring – Created a real-time vitals tracker using biometric sensors.

Now, I am working on an innovative solution to help visually impaired individuals "see" their surroundings using brain signals.

Introduction


Globally, more than 43 million people live with visual impairment. Traditional aids such as white canes, guide dogs, and AI-powered cameras all have limitations: some are costly, while others provide only partial assistance. My solution aims to revolutionize navigation and spatial awareness using an EEG-based Brain-Controlled Interface (BCI).

How It Works

1. Brainwave Sensors (EEG): Capture neural signals related to spatial perception.

2. AI-Powered Processing: Filters and classifies the captured signals, converting them into meaningful real-time information about the user's surroundings.

3. Multi-Feedback System: Translates information into audio or haptic (vibration) feedback, allowing users to "sense" objects around them.

Unlike AI camera-based solutions, this non-invasive, adaptive technology works in real time and does not rely on external vision systems.
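To make the processing step above concrete, here is a minimal sketch of one plausible pipeline: compute alpha- and beta-band power from a one-second EEG epoch and map their ratio to a vibration intensity for the haptic feedback stage. The sampling rate, frequency bands, and the `to_haptic_level` mapping are all illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def to_haptic_level(alpha_power, beta_power):
    """Map a beta/alpha power ratio to a 0-255 vibration intensity (illustrative)."""
    ratio = beta_power / (alpha_power + 1e-9)
    return int(np.clip(ratio * 64, 0, 255))

# Simulated one-second epoch: a 10 Hz (alpha) plus a 20 Hz (beta) component.
t = np.arange(FS) / FS
epoch = 0.8 * np.sin(2 * np.pi * 10 * t) + 0.4 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(epoch, FS, 8, 12)
beta = band_power(epoch, FS, 13, 30)
level = to_haptic_level(alpha, beta)
print(level)
```

In a real device the epoch would come from the EEG headset's driver rather than a simulation, and the mapping would be learned per user; the sketch only shows the shape of the signal-to-feedback path.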



Why This Innovation is Needed


Affordable & Accessible – Unlike high-end smart glasses that cost ₹2-3 Lakhs ($2,500 - $3,500), this technology aims to be affordable and scalable.

Intuitive & Natural – It mimics real perception, reducing the cognitive strain of learning new systems.

Beyond Just Navigation – Future applications include gesture-controlled devices, robotic vision, and medical rehabilitation.

This project has immense potential to redefine assistive technology. With proper funding, I aim to build, test, and deploy a working prototype within the next 12-18 months.

Funding Requirement 


To bring this vision to life, I need ₹27.5 Lakhs ($33,500) in funding.

Budget Breakdown:

Advanced EEG Sensors & Wearable Design – ₹4.1 Lakhs ($5,000)

AI & Software Development – ₹7.4 Lakhs ($9,000)

Prototype Development & Testing – ₹5.7 Lakhs ($7,000)

Manufacturing & Early-Stage Deployment – ₹10.3 Lakhs ($12,500)

This funding will cover R&D, hardware procurement, software engineering, and testing in real-world environments.


How You Can Support


If you believe in accessible, life-changing technology, consider funding or collaborating to bring this innovation to reality.

I am also open to partnerships with AI developers, neuroscientists, and assistive tech manufacturers.


For inquiries, contact me at anshusoni259@gmail.com.