
Synthetic Nervous System
Neuromuscular feedback for embodied perception

Overview
Neuromuscular feedback toward embodied plasticity in cortical remapping

Synthetic Nervous System is a non-invasive, full-stack neuromuscular feedback platform that restores situational awareness through the body rather than through screens. Following an experimental Ewing amputation, the core challenge was not mobility but perception: passive prosthetics sever the feedback loop between body and environment, forcing terrain to be inferred too late and at high cognitive cost.


The key insight was biological, not technical. Habituation is not a hardware failure but a property of the nervous system itself. By using distributed, biphasic differential surface stimulation mapped to specific nerve groupings and musculature, sensory feedback remains spatially differentiated, adaptive, and biologically legible over time. Instead of symbolically representing a foot, the system enables cortical plasticity to remap sensation—allowing the user to genuinely feel the ground again.


Implemented as a 16-channel modular platform worn continuously in real environments, the project reframes bionics and AR as embodied learning systems rather than visual overlays.


When feedback is embodied, augmentation stops being an interface and becomes physiology.

The Challenge

How do you restore a sense the body no longer has — without asking the user to look, think, or interpret?

After an experimental Ewing amputation, the problem wasn’t walking. It was feeling. Conventional prosthetics collapse the sensory loop: terrain becomes abstract, delayed, inferred. Every step is reactive. Every surface arrives late.

The challenge was to rebuild a continuous feedback relationship between body and environment — not visually, not symbolically — but physiologically. To create a system that delivers real-time spatial awareness without screens, without cognition, and without becoming invisible through habituation.

The Insight

Augmentation fails when it competes with attention. It succeeds when it speaks the body’s native language.

Most AR and bionic systems centralize feedback, relying on static cues that the nervous system quickly ignores. The realization was simple and counterintuitive:

habituation isn’t a bug — it’s a feature of biology.

Instead of fighting it, this system works with it.

By distributing biphasic neuromuscular stimulation across localized nodes — mapped to nerve groupings and limb geometry — sensation remains dynamic, spatially differentiated, and biologically legible. Over time, the brain doesn’t memorize signals — it remaps itself.
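The principle can be sketched in a few lines of signal logic. This is a minimal illustration only: the amplitudes, timings, and channel counts below are invented for the sketch, not the platform’s actual parameters.

```python
def biphasic_pulse(amplitude_ma, phase_us, sample_rate_hz=100_000):
    """Charge-balanced biphasic pulse: a cathodic (negative) phase followed
    by an equal-and-opposite anodic phase, so net delivered charge is zero.
    Returns per-sample amplitudes in mA."""
    samples_per_phase = max(1, (phase_us * sample_rate_hz) // 1_000_000)
    return ([-amplitude_ma] * samples_per_phase
            + [amplitude_ma] * samples_per_phase)

def staggered_schedule(num_channels, period_ms, offset_ms):
    """Differential distribution: give each channel a distinct pulse onset
    within the stimulation period, so no two sites fire in lockstep and
    sensation stays spatially differentiated."""
    return [(ch, (ch * offset_ms) % period_ms) for ch in range(num_channels)]

pulse = biphasic_pulse(amplitude_ma=2.0, phase_us=200)
schedule = staggered_schedule(num_channels=16, period_ms=20, offset_ms=3)
```

Charge balance keeps surface stimulation repeatable over long wear; the stagger is one simple way to stop adjacent sites from blending into a single, easily habituated cue.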

The result isn’t a simulated foot.

It’s the return of ground.

The Execution

A full-stack synthetic nervous system, worn daily, learning continuously.

The project materialized as a 16-channel neuromuscular feedback platform, designed and built as both a functional interface and a living research instrument:

  • Distributed sensory input translating ground and gait dynamics directly into surface neurostimulation

  • Geometry-aware stimulation architecture targeting specific musculature and nerve groupings

  • Biphasic differential signals to sustain sensation over time and resist perceptual fade

  • Modular, plug-and-play pathways enabling daily reconfiguration and experimentation

Rather than remaining a lab-bound prototype, the system is worn in the real world — on uneven terrain, across rocky beaches, and through everyday movement. When active, the prosthesis disappears. When disabled, the world goes quiet again.

This isn’t a demo.

It’s a feedback loop.

The Impact

This project reframes augmentation — and by extension AR — as something you inhabit, not observe.

  • Embodied transformation
    Immediate terrain feedback restores intuitive movement, eliminating the need for conscious correction.

  • A new model for AR
    No displays. No overlays. No focal attention. Information is encoded directly into the body, operating below awareness.

  • Conceptual shift
    Positions bionics and interfaces as exoskeletons for neuromuscular hypertrophy — systems that strengthen perception through use.

  • Foundational platform
    The work establishes fertile ground for future patents, products, and experiential systems that treat biology itself as the interface.

This is AR without graphics.

UX without UI.

Cinema without a screen.

Interaction

From tap-and-response to stimulus-and-adaptation

  • Interaction is continuous, bidirectional, and embodied

  • Users don’t trigger events — their movement is the input

  • Feedback loops evolve over time through cortical remapping

  • Demonstrates deep understanding of human sensing, habituation, and adaptive systems

ECXD signal: designing interactions that change behavior and perception, not just state.

Cinematic UX

Experience unfolds over time, through the body

  • Sensation is choreographed, not toggled

  • Spatial differentiation creates a felt narrative of terrain and motion

  • The “story” is revealed through repeated use — tension, relief, intuition

  • Presence is achieved without visuals, relying on pacing, rhythm, and embodiment

ECXD signal: cinematic thinking applied to physiology, not screens.

Motion Systems

Motion as meaning, not animation

  • Gait, weight shift, and terrain drive the experience

  • Feedback is synchronized to physical dynamics in real time

  • The system treats motion as a primary communication channel

  • Demonstrates mastery of temporal systems, responsiveness, and physical causality

ECXD signal: motion as a semantic layer, not decoration.

Why This Belongs in ECXD

This project proves you don’t just design interfaces —

you design relationships between humans and systems.

It shows how experience can be:

  • content-forward without content

  • cinematic without visuals

  • interactive without controls

And most importantly, it shows how Netflix-scale thinking can exist before the screen ever turns on.
