Jules Anh Tuan Nguyen Explains How AI Lets Amputee Control Prosthetic Hand, Video Games - Ep. 149

The AI Podcast - A podcast by NVIDIA


Path-breaking work that translates an amputee’s thoughts into finger motions, and even commands in video games, holds open the possibility of humans controlling just about anything digital with their minds.

Using GPUs, a group of researchers trained an AI neural decoder, able to run on a compact, power-efficient NVIDIA Jetson Nano system on module (SOM), that translates 46-year-old Shawn Findley’s thoughts into individual finger motions. And if that breakthrough weren’t enough, the team then plugged Findley into a PC running Far Cry 5 and Raiden IV, where he had his game avatar move, jump — even fly a virtual helicopter — using his mind.

It’s a demonstration that not only promises to give amputees more natural and responsive control over their prosthetics, but could one day give users almost superhuman capabilities. The effort is detailed in a draft paper, or pre-print, titled “A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control,” which describes the extraordinary cross-disciplinary collaboration behind a system that, in effect, allows humans to control just about anything digital with their thoughts.

Jules Anh Tuan Nguyen, the paper’s lead author and now a postdoctoral researcher at the University of Minnesota, spoke with NVIDIA AI Podcast host Noah Kravitz about his efforts to allow amputees to control their prosthetic limbs — right down to individual finger motions — with their minds. blogs.nvidia.com/blog/2021/08/10/lending-a-helping-hand-jules-anh-tuan-nguyen-on-building-a-neuroprosthetic
