Human-Robot Interaction and Whole-Body Robot Sensing


Abstract:  The ability of a robot to operate in an uncertain environment, such as near humans or far away under human control, potentially opens a myriad of uses. Examples include robots preparing the Mars surface for human arrival; robots for assembly of large space telescopes; robot helpers for the elderly; and robot search and disposal of war mines. So far, advances in this area have come slowly, with a focus on small categories of tasks rather than on the universal ability typical in nature. Challenges appear both on the robotics side and on the human side: robots have a hard time adjusting to an unstructured environment, whereas human cognition has serious limits in adjusting to robots and grasping complex 2D and 3D motion. As a result, applications where robots operate near humans – or far away under their control – are exceedingly rare. The way out of this impasse is to supply the robot with whole-body sensing – an ability to sense surrounding objects with the robot's whole body and utilize these data in real time. This calls for large-area flexible sensing arrays – a sensitive skin covering the whole robot body, akin to the skin covering the human body. Whole-body sensing brings interesting, even unexpected, properties: powerful robots become inherently safe; human operators can move them fast, at "natural" speeds; robot motion strategies exceed human spatial reasoning skills; and it becomes realistic to exploit the natural synergy of human-robot teams and allow a mix of supervised and unsupervised robot operation. We will review the mathematical, algorithmic, hardware (materials, electronics, computing), control, and cognitive science issues involved in realizing such systems.



  Date and Time

  • Starts 02 January 2017 12:00 AM
  • Ends 30 January 2017 09:00 PM
  • All times are US/Central

  Location

  • J.J. Pickle Research Center
  • 10100 Burnet Rd, Austin, Texas, United States 78758
  • Building: The J. Neils Thompson Commons Building (TCB), #137
  • Room Number: Balcones Room

  Contact

  • Brent Lunceford, Chair, IEEE CTS MEMS & Sensors Chapter (brent.lunceford@utexas.edu)
  • Dr. Larry Larson, IEEE CTS Electron Devices Society (larry.larson@txstate.edu)

  Registration

  • No Admission Charge
  • Co-sponsored by the Electron Devices Society


  Speakers

Dr. Vladimir Lumelsky

Dr. Vladimir Lumelsky of University of Wisconsin-Madison

Topic:

Human-Robot Interaction and Whole-Body Robot Sensing

Biography:

Vladimir Lumelsky is a Professor at the University of Wisconsin-Madison. He received his Ph.D. in Applied Mathematics from the Institute of Control Sciences, Russian National Academy of Sciences, Moscow. He has held engineering, research, and faculty positions with Ford Motor Research Labs, General Electric Research Center, Yale University, the University of Wisconsin-Madison, the University of Maryland, NASA-Goddard Space Center, and the National Science Foundation. Concurrently, he has held visiting positions with the Tokyo Institute of Science, Japan; the Weizmann Institute, Israel; and the USA-Antarctica South Pole Station.

He has served as IEEE Sensors Council President; as Founding Editor-in-Chief of the IEEE Sensors Journal; as chair and co-chair of major conferences; on the Editorial Boards of IEEE Transactions on Robotics and Automation and other journals; on various governing bodies and committees of IEEE; and as guest editor for special journal issues. He has authored over 200 publications (books, journal papers, conference papers, reports); he is an IEEE Life Fellow and a member of ACM and SME.






Agenda

6:00-7:00pm    Arrival, networking

7:00-8:00pm    Presentation

8:00-8:25pm    Q & A

8:30pm         Depart



A joint meeting brought to you by the IEEE Central Texas MEMS & Sensors Chapter and the IEEE Central Texas Electron Devices Society.