Abstract:
Designing robots for human interaction is a multifaceted challenge involving the robot’s intelligent behavior, physical form, mechanical structure, and interaction schema. Our lab develops and studies human-centered robots using a combination of methods from AI, Design, and Human-Computer Interaction. This talk focuses on three recent projects: two concerning the design of new robots, and one that tackles designing robots to help human designers.
Biography:
Guy Hoffman is Associate Professor and the Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering at Cornell University. Prior to that he was an Assistant Professor at IDC Herzliya and co-director of the IDC Media Innovation Lab. Hoffman holds a Ph.D. from MIT in the field of human-robot interaction. He heads the Human-Robot Collaboration and Companionship (HRC²) group, which studies the algorithms, interaction schemas, and designs that enable close interactions between people and personal robots in the workplace and at home. Among other firsts, Hoffman developed the world’s first human-robot joint theater performance and the first real-time improvising human-robot Jazz duet. His research papers have won several top academic awards, including Best Paper awards at robotics conferences in 2004, 2006, 2008, 2010, 2013, 2015, 2018, 2019, 2020, and 2021. His TEDx talk is one of the most-viewed online talks on robotics, watched more than 3 million times.
Abstract:
Many successful approaches to robotic locomotion and manipulation rely on high-quality simulation tools. Many such approaches are “bottom-up” in a modeling sense, accounting for all internal forces and environmental interactions explicitly. These “bottom-up” models are used either beforehand (as in reinforcement learning) or in real time. However, various types of robots are getting smaller, softer, and more complex (e.g., bio-hybrid actuators). Some robots lean on low-precision manufacturing and fabrication techniques, and many robots are now being asked to operate in hard-to-characterize, natural interfaces like the human body. Such attributes can render “bottom-up” simulators impractical for expected use cases on various research frontiers, such as micro-biomedical robots and soft robots deployed in uncharacterized environments. In this talk I will revisit the reconstruction equation, a result from the geometric mechanics literature that offers a “top-down” view of Lagrangian systems, permitting insights into generalizable system behaviors along a spectrum of friction-momentum dominance. I will show how these tools permit rapid modeling of high-complexity robots in their operating environment without requiring CAD models or any explicit force specification. I will also discuss a related strength and weakness of the approach resulting from its use of symmetries. Surprisingly, results in simulation and hardware indicate that even for eight-jointed systems, useful behavioral models can be computed from tens of cycles of data. This suggests that high-degree-of-freedom robots can adjust and excel in situations where explicit force models are poorly understood. I will also briefly discuss a framework for robot recovery that leans on these tools, as well as a metric for a robot’s ability to cover the local space of motions, computed on the Lie algebra of the position space.
The metric allows primitives to be valued for their contribution to the space of composed motions rather than just their individual qualities. Results here include a Dubins car that can learn how to turn left (with its steering wheel restricted to only turn right) in less than a second as well as a robot made of tree branches that can learn to walk around the laboratory with less than twelve minutes of experimental data. I hope to motivate the general use of structural reductions as we pursue modeling and control of the next generation of high complexity robots.
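For readers unfamiliar with the reconstruction equation referenced above, its principal kinematic form can be sketched as follows (the notation is the standard one from the geometric mechanics literature, not taken from the talk itself):

```latex
% Principal kinematic case: the body-frame velocity is determined
% entirely by the shape (joint) velocity through a local connection.
\xi \;=\; g^{-1}\dot{g} \;=\; -A(r)\,\dot{r}
% g \in G : position/orientation in the symmetry group (e.g., SE(2))
% r       : shape variables (joint angles), \dot{r} their rates
% A(r)    : the local connection mapping shape velocity to body velocity
%
% With a generalized momentum variable p, a mixed form interpolates
% along the friction--momentum spectrum mentioned in the abstract:
% \xi = -A(r)\,\dot{r} + \Gamma(r)\,p
```

Because the group variable g appears only through the symmetry, behaviors learned in one location generalize across the position space, which is the strength (and, as the speaker notes, the weakness) of leaning on symmetries.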
Biography:
Dr. Brian Bittner received a B.S. from Carnegie Mellon and a Ph.D. from Michigan, where he researched the theory, simulation, and application of physics-informed machine learning for in situ behavior modeling and optimization. He has sought out cross-disciplinary research environments, collaborating with physicists, biologists, and mathematicians to bring insights from these fields into robotic systems. Bittner is currently a research scientist at the Applied Physics Lab, working on approaches to modeling and control for soft robots and underwater manipulation.
Abstract:
The talk will present a survey of my research activities, with a more detailed presentation of our guidance system for robot-assisted prostate cancer surgery. The majority of prostate cancer surgery is carried out with the da Vinci surgical system. Tracking of instruments and hand-eye calibration of this robotic system enable the overlay of pre-operative magnetic resonance imaging by registration to real-time ultrasound, enabling visualization of sub-surface anatomy and cancer. We will discuss our system design and our visualization and registration approaches.
We will also discuss instrumentation for force sensing using the da Vinci Research Kit, and a new approach to teleguidance for ultrasound examinations.
Biography:
Tim Salcudean is a Professor in the Department of Electrical and Computer Engineering at the University of British Columbia, where he holds the C.A. Laszlo Chair in Biomedical Engineering. He is cross-appointed with the UBC School of Biomedical Engineering and the Vancouver Prostate Centre. He serves on the steering committee of the IPCAI conference and on the Editorial Board of the International Journal of Robotics Research. He is a Fellow of the IEEE, MICCAI, and the Canadian Academy of Engineering. His research interests are in medical robotics, medical image analysis, and elastography imaging.
Abstract:
Flexible and soft medical robots offer capabilities beyond those of conventional rigid-link robots due to their ability to traverse confined spaces and conform to highly curved paths. They also offer potential for improved safety due to their inherent compliance. In this talk, I will present several new robot designs for various surgical applications. In particular, I will discuss our work on soft, growing robots that locomote by extending material from their tip. I will discuss limitations in miniaturizing such robots, along with methods for actively steering, sensing, and controlling them. Finally, I will discuss new sensing and human-in-the-loop control paradigms aimed at improving the performance of flexible surgical robots.
Bio:
Tania Morimoto is an Assistant Professor in the Department of Mechanical and Aerospace Engineering and in the Department of Surgery at the University of California, San Diego. She received the B.S. degree from the Massachusetts Institute of Technology, Cambridge, MA, and the M.S. and Ph.D. degrees from Stanford University, Stanford, CA, all in mechanical engineering. Her research lab focuses on the design and control of flexible continuum robots for increased dexterity and accessibility in uncertain environments, particularly for minimally invasive surgical interventions. The lab is also working to address the challenges of designing human-in-the-loop interfaces for controlling these flexible and soft robots, including the integration of haptic feedback to improve surgical outcomes. She is a recipient of the Hellman Fellowship (2021), the Beckman Young Investigator Award (2022), and the NSF CAREER Award (2022).
Abstract:
Extreme globalization, war in the Western world, and COVID-19 together present an unprecedented challenge for humanity. Intelligent engineering systems and robotics can help counterbalance the negative effects in a number of ways. Potential technology-driven solutions include the emergence of medical robots, Surgical Data Science, AI-based support for early anomaly detection and health diagnosis, rescue robotics, smart agrifood robotic solutions, and beyond. Many of these areas are addressed by the applied research projects of the University Research and Innovation Center (EKIK) at Óbuda University. This presentation highlights, through examples, the role that robotics and automation can play in meeting global challenges. The talk will also cover the ethical implications of robotics research in both the emergency and post-pandemic world, with a specific focus on the 2015 UN Sustainable Development Goals.
Abstract:
Human motor learning depends on a suite of brain mechanisms that are driven by different signals and operate on timescales ranging from minutes to years. Understanding these processes requires identifying how new movement patterns are normally acquired, retained, and generalized, as well as the effects of distinct brain lesions. The lecture will focus on normal and abnormal motor learning, and how we can use this information to improve rehabilitation for individuals with neurological damage.
Bio:
Dr. Amy Bastian is a neuroscientist who has made important contributions to the neuroscience of sensorimotor control. She is the Chief Science Officer at the Kennedy Krieger Institute and Director of its motion analysis laboratory, which studies the neural control of human movement. Dr. Bastian is also a Professor of Neuroscience, Neurology, and PM&R at the Johns Hopkins University School of Medicine. She is a recognized and highly accomplished neuroscientist whose interests include understanding cerebellar function/dysfunction, locomotor learning mechanisms, motor learning in development, and how to rehabilitate people with many types of neurological disease.
Abstract: Planning, the ability to imagine different futures and select one assessed to have high value, is one of the most vaunted of animal capacities. As such, it has been a central target of artificial intelligence work since the origins of that field, in addition to being a focus of neuroscience and cognitive science. These separate and sometimes synergistic traditions are combined in our new work exploring the origin and mechanics of planning in animals. We will show how mammals evade autonomous robot “predators” in complex, large arenas. We have discovered that, depending on the arrangement and density of barriers to vision, animals appear to carefully manage their uncertainty about the predator’s location in order to reach their goal. Their behavior appears unlikely to be driven by cached responses that were successful in the past; rather, it is based on planning during brief pauses in which they peek at the hidden robot adversary that is looking for them. After peeking, they re-route to avoid the predator.
Bio: Malcolm A. MacIver is a group leader of the Center for Robotics and Biosystems at Northwestern University, with a joint appointment between Mechanical Engineering and Biomedical Engineering, and courtesy appointments in the Department of Neurobiology and the Department of Computer Science. His work focuses on extracting principles underlying animal behavior, with emphasis on interactions between biomechanics, sensory systems, and planning circuits. He then incorporates these principles into biorobotic systems or simulations of the animal in its environment, creating synergy between technological and scientific advances. For this work he received the 2009 Presidential Early Career Award for Scientists and Engineers from President Obama at the White House. MacIver has also developed interactive science-inspired art installations that have been exhibited internationally, and he consults for makers of science fiction films and TV series.