Sam Kriegman is an AI2050 Early Career Fellow at Northwestern University. His AI2050 project involves using AI to design novel robots that, by 2050, will carry out useful work in the real world. This work addresses Hard Problem 1 (AI capabilities) and Hard Problem 4 (addressing humanity’s greatest challenges). Previously, Sam and his collaborators used AI to create fully biological robots known as “xenobots”. This work was featured in a documentary by Scientific American, “These Researchers Used AI to Design a Completely New ‘Animal Robot’.”
“AI2050 is helping my students and me take AI-designed robots, such as xenobots, to the next level!” says Sam.
Learn more about Sam Kriegman:
How hard is it to create robots?
It takes human engineers years to dream up a new robot, design, build, and test prototypes, and revise them until they work. When this process was automated in the past, it still required weeks of supercomputing because it relied on randomly changing the robot’s design and hoping for the best. We simply had no clue how to solve the credit assignment problem for robots.
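A minimal sketch of that older “mutate and hope” style of automated design is below. Every name in it (mutate, simulate_locomotion, the population sizes) is a hypothetical stand-in, not actual lab code; the toy fitness function replaces what would in practice be an expensive physics simulation.

```python
# Hypothetical sketch of the older "randomly change the design and hope" loop.
# simulate_locomotion() is a toy stand-in for a full physics simulation; in
# practice each evaluation is expensive, which is why this approach took weeks
# of supercomputing.
import random

POPULATION = 100
GENERATIONS = 1_000

def mutate(design):
    """Randomly perturb one element of a robot body plan."""
    child = list(design)
    i = random.randrange(len(child))
    child[i] += random.gauss(0.0, 0.1)
    return child

def simulate_locomotion(design):
    """Toy stand-in for simulated walking distance (higher is better)."""
    return -sum((x - 0.5) ** 2 for x in design)

population = [[random.random() for _ in range(8)] for _ in range(POPULATION)]
for _ in range(GENERATIONS):
    # Random variation plus selection: no information about *why* a design
    # performed poorly ever flows back into the next round of guesses.
    children = [mutate(random.choice(population)) for _ in range(POPULATION)]
    population = sorted(population + children,
                        key=simulate_locomotion, reverse=True)[:POPULATION]
```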
What is the credit assignment problem?
We did not know how to trace deficiencies in a robot’s behavior back to errors or inefficiencies in its body. I am happy to report that we finally cracked this problem! We can now design new robots in seconds, on a laptop computer, and I see no reason this could not occur in milliseconds, more or less instantaneously.
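A minimal sketch of the idea, assuming a differentiable stand-in for the physics simulator: toy_locomotion and the eight body parameters below are illustrative placeholders, not the published method.

```python
# Hypothetical sketch of gradient-based credit assignment for a robot body.
# toy_locomotion() is a differentiable stand-in for a physics simulator, and
# the eight "body parameters" are illustrative; the point is that jax.grad
# traces each parameter's contribution to the behavior automatically.
import jax
import jax.numpy as jnp

def toy_locomotion(body_params):
    """Differentiable stand-in for simulated walking distance."""
    return -jnp.sum((body_params - 0.5) ** 2)

grad_fn = jax.grad(toy_locomotion)      # credit assignment in one line

body = jnp.zeros(8)                     # e.g., limb lengths, joint placements
for _ in range(200):
    body = body + 0.1 * grad_fn(body)   # gradient ascent on locomotion
```

Unlike the random-search loop above, the gradient tells the optimizer exactly which parameters to change, and by how much, after every simulation, which is why the search collapses from weeks to seconds.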
How do you imagine this technology being used by engineers in 2050?
Say you want a robot that cleans up a polluted river in your town. What should your robot look like? What materials should be used to build it? Should it have limbs? Where? How many? What kind of sensors and motors should it have and where should they go? By 2050 you will be able to simply pose the design problem to an AI on your mobile device and see what it comes up with. These AI-generated robots may be directly manufactured and put to work or they might inspire you to think differently about the problem and its possible solutions.
Do robot-designing AIs differ from AIs that generate text and images?
Not as much as you might think. We rely on the very same mathematical trick that is used to automatically design chatbots, images, and proteins. It took a few more clever tricks to generalize these ideas to automatically design complex physical machines such as robots. The upshot is that, through robots, AI can act directly on the world and experiment to understand our world and us better. This will result in a virtuous cycle of better robots and smarter, more reliable AI.
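A toy illustration of that shared trick, gradient descent: the same one-line update rule optimizes a stand-in for model weights and a stand-in for a robot body. The loss functions here are placeholders, not real training objectives.

```python
# Toy illustration of the shared trick: gradient descent. The same update rule
# optimizes a stand-in for network weights and a stand-in for a robot body;
# the loss functions below are placeholders, not real training objectives.
import jax
import jax.numpy as jnp

def step(params, loss_fn, lr=0.1):
    """One gradient-descent step, identical regardless of what params represent."""
    return params - lr * jax.grad(loss_fn)(params)

weights = jnp.ones((4, 4))   # stand-in for a language or image model's weights
body = jnp.ones(8)           # stand-in for a robot body plan

weights = step(weights, lambda w: jnp.sum(w ** 2))        # "train a model"
body = step(body, lambda b: jnp.sum((b - 1.0) ** 2))      # "design a robot"
```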
How do you actually make the robots?
At the moment, our robots can crawl around, but they are fairly simple: no more than 10 cm across, with just one or two air-powered muscles and little to no onboard intelligence [artificial intelligence that is built into a device, rather than outsourced to remote technology]. So we can easily 3D print and mold their bodies with existing technologies. But, in collaboration with my colleague and Schmidt Science Fellow Ryan Truby, we are developing new ways to automatically fabricate AI-designed robots of diverse sizes and shapes, with complex musculature and nervous systems.