Plenary Speakers

Plenary 1: The Robotics Part of Micro and Nano Robots

Oct. 24, 9:00-9:50 @ Main Hall

Brad Nelson

Professor of Robotics and Intelligent Systems (IRIS), ETH Zürich, Switzerland

Abstract:

Micro and nano robots have made great strides since becoming a focused research topic over two decades ago. Much of the progress has been in material selection, processing, and fabrication, and paths forward in developing clinically relevant biocompatible and biodegradable micro and nano robots are becoming clear. Our group, as well as others, maintains that using biocompatible magnetic composites with externally generated magnetic fields and field gradients is perhaps the approach closest to clinical application. One of the most challenging aspects of the field is the development of the magnetic navigation system (MNS) that generates the fields and field gradients needed for microrobot locomotion. In this talk, I will present an overview of MNSs and show how these systems are fundamentally robotic in the way they must be designed and controlled. Decades of work in robotic manipulation can be brought to bear on this problem as we move forward in bringing MNS technology to the clinic. I will also look at recent efforts in creating more intelligent micro and nano robots that exhibit increasingly complex behaviors, some of which can even be programmed in situ. The field appears to be on the cusp of realizing the fantastic voyage.
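
As background on why both fields and field gradients matter (a standard magnetics relation, not a result claimed in the talk): for a microrobot carrying a magnetic dipole moment m in an applied field B, the field exerts a torque that orients the robot, while a net pulling force requires a spatial gradient of the field. In the usual rigid-dipole approximation,

\[
\boldsymbol{\tau} = \mathbf{m} \times \mathbf{B}, \qquad \mathbf{F} = \nabla\left(\mathbf{m} \cdot \mathbf{B}\right),
\]

so an MNS must be able to shape both the field and its gradient at the location of the robot.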

Bio:

Brad Nelson is the Professor of Robotics and Intelligent Systems at ETH Zürich and has recently become the Chief Scientific Advisor of Science Robotics. He has over thirty years of experience in the field and has received a number of awards in robotics, nanotechnology, and biomedicine. He serves on the advisory boards of a number of academic departments and research institutes across North America, Europe, and Asia. Prof. Nelson has twice served as Head of the Department of Mechanical and Process Engineering at ETH, and has been the Chairman of the ETH Electron Microscopy Center and a member of the Research Council of the Swiss National Science Foundation. He also serves on the boards of three Swiss companies and is a member of the Swiss Academy of Engineering (SATW). Before moving to Europe, Nelson worked as an engineer at Honeywell and Motorola and served as a United States Peace Corps Volunteer in Botswana, Africa. He has also been a professor at the University of Minnesota and the University of Illinois at Chicago.

Plenary 2: Navigation Robot for the Visually Impaired

Oct. 25, 9:00-9:50 @ Main Hall

Chieko Asakawa

IBM Research, CMU, Miraikan

Abstract:

Blind people face many difficulties when they navigate and explore unfamiliar places alone. In general, sighted people use visual information to find a destination and to avoid collisions. Blind people, however, must rely on non-visual information, such as haptic information from a white cane or ambient sounds. Recent technologies, such as AI and robotics, have great potential to offer new solutions that translate visual information into non-visual media and guide blind people safely and in a socially compliant way through public spaces. In this talk, I will present our recent work on a suitcase-shaped navigation robot for the blind. It is equipped with motors and several sensors for autonomous navigation. On top of the suitcase, it has a LiDAR for localization and obstacle detection and an RGB-D camera for detecting pedestrians to avoid collisions. The handle has buttons to control the robot’s movement and vibrotactile devices to provide non-visual information. I will share recent experimental results from real-world environments such as an airport and a shopping mall. I will also talk about the new challenges the recent pandemic has created for the blind and introduce possible solutions. Finally, I will discuss how we can accelerate the deployment of new technologies in our society.
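
To make the architecture concrete, the following is a deliberately simplified, hypothetical sketch of the kind of sense-and-guide loop the abstract describes; the class and function names are illustrative placeholders, not the actual system’s API.

import random
import time

class LidarStub:
    """Placeholder for the LiDAR used for localization and obstacle detection."""
    def localize(self):
        return (random.uniform(0, 10), random.uniform(0, 10))  # fake (x, y) pose
    def obstacle_ahead(self):
        return random.random() < 0.1

class RgbdStub:
    """Placeholder for the RGB-D camera used for pedestrian detection."""
    def pedestrian_ahead(self):
        return random.random() < 0.1

def vibrate_handle(pattern):
    print(f"handle vibration: {pattern}")  # stand-in for the vibrotactile devices

def drive(speed_mps):
    print(f"drive at {speed_mps:.1f} m/s")  # stand-in for the motor command

def guidance_loop(steps=5):
    lidar, camera = LidarStub(), RgbdStub()
    for _ in range(steps):
        x, y = lidar.localize()
        if lidar.obstacle_ahead() or camera.pedestrian_ahead():
            drive(0.0)                 # stop to avoid a collision
            vibrate_handle("stop")     # explain the stop non-visually
        else:
            drive(1.0)                 # continue toward the destination
            vibrate_handle("forward")
        print(f"pose estimate: ({x:.1f}, {y:.1f})")
        time.sleep(0.1)

if __name__ == "__main__":
    guidance_loop()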

Bio:

Chieko Asakawa is an IBM Fellow working in the area of accessibility. Her contributions to the field began with braille digitization and moved on to Web accessibility, including the world’s first practical voice browser. Today, Chieko focuses on real-world accessibility, helping the visually impaired understand their surroundings and navigate the world with the power of AI. She has been an IBM Distinguished Service Professor at Carnegie Mellon University since 2014 and has concurrently served as Chief Executive Director of the Japanese National Museum of Emerging Science and Innovation (Miraikan) since April 2021. In 2013, the government of Japan awarded Chieko the Medal of Honor with Purple Ribbon for her outstanding contributions to accessibility research. She was elected a foreign member of the US National Academy of Engineering in 2017 and inducted into the National Inventors Hall of Fame (NIHF) in 2019. She also received the American Foundation for the Blind’s 2020 Helen Keller Achievement Award.

Plenary 3: Towards Collective Artificial Intelligence

Oct. 26, 9:00-9:50 @ Main Hall

Radhika Nagpal

Princeton University, USA

Abstract:

In nature, groups of thousands of individuals cooperate to create complex structures purely through local interactions: from cells that form complex organisms, to social insects like termites and ants that build nests and self-assemble bridges, to the complex and mesmerizing motion of fish schools and bird flocks. What makes these systems so fascinating to scientists and engineers alike is that, even though each individual has limited ability, as a collective they achieve tremendous complexity. What would it take to create our own artificial collectives of the scale and complexity that nature achieves? In this talk I will discuss several ongoing projects that draw inspiration from biological self-assembly to create robotic systems: the Kilobot swarm inspired by cells, the Termes and EcitonR robots inspired by the 3D assembly of termites and army ants, and the BlueSwarm project inspired by fish schools. There are many challenges in both building and programming robot swarms, and we use these systems to explore decentralized algorithms, embodied intelligence, and methods for synthesizing complex global behavior. Our theme is the same: can we create simple robots that cooperate to achieve collective complexity?
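
As a purely illustrative aside (not code from the lab), the short simulation below shows the flavor of such decentralized algorithms: each simulated agent looks only at neighbors within a small radius and aligns its heading with theirs, yet the group as a whole tends to converge on a common direction, in the spirit of the fish schools and bird flocks mentioned above.

import math
import random

N, RADIUS, NOISE, SPEED, STEPS = 50, 0.2, 0.1, 0.01, 200

# Random initial positions in the unit square and random headings.
pos = [(random.random(), random.random()) for _ in range(N)]
heading = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def neighbors(i):
    xi, yi = pos[i]
    return [j for j, (xj, yj) in enumerate(pos)
            if (xj - xi) ** 2 + (yj - yi) ** 2 <= RADIUS ** 2]

for _ in range(STEPS):
    new_heading = []
    for i in range(N):
        nbrs = neighbors(i)  # each agent only sees nearby agents
        # Align with the average neighbor heading (circular mean) plus a little noise.
        sx = sum(math.cos(heading[j]) for j in nbrs)
        sy = sum(math.sin(heading[j]) for j in nbrs)
        new_heading.append(math.atan2(sy, sx) + random.uniform(-NOISE, NOISE))
    heading = new_heading
    # Move each agent forward along its heading, wrapping around the unit square.
    pos = [((x + SPEED * math.cos(h)) % 1.0, (y + SPEED * math.sin(h)) % 1.0)
           for (x, y), h in zip(pos, heading)]

# Global alignment (order parameter): 1.0 means all agents share one heading.
order = math.hypot(sum(math.cos(h) for h in heading),
                   sum(math.sin(h) for h in heading)) / N
print(f"alignment after {STEPS} steps: {order:.2f}")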

Bio:

Radhika Nagpal is a Professor of Robotics at Princeton University, jointly appointed in the departments of Mechanical Engineering and Computer Science, where she leads the Self-organizing Swarms & Robotics Lab (SSR). Nagpal is a leading researcher in swarm robotics, bio-inspired algorithms, and self-organized collective intelligence. Projects from her lab include bio-inspired multi-robot systems such as the Kilobot thousand-robot swarm (Science 2014), the Termes robots for collective construction (Science 2014), and the Blueswarm underwater robots (Science Robotics 2021), as well as models of biological collective intelligence (Nature Communications 2022). In 2017 Nagpal co-founded ROOT Robotics, an educational robotics company aimed at democratizing AI and robotics through early education, which has since been acquired by iRobot. Nagpal is also known for her Scientific American blog article (“The Awesomest 7 Year Postdoc”, 2013) advocating academic cultural change, and she received the Anita Borg Early Career Award (2010) and the McDonald Mentoring Award (2015). Nagpal is an ACM Fellow and an AAAI Fellow (2020), and was an invited TED speaker in 2017. She was named by Nature magazine as one of the top ten influential scientists and engineers of the year (Nature10 award, Dec 2014).
