ChatGPT, arguably the most famous chatbot ever, learned its sometimes human-like conversational skills by parsing through absurd amounts of text data—millions of books, articles, Wikipedia pages, and everything else its creators could find by crawling around the Internet.
But what if an advanced AI could learn the way a little kid does, without reading 80 million books or looking at 97 million cats? Just taking its first baby steps, exploring an amazing new world under the patient guidance of mom and dad. A team of New York University researchers just gave it a shot, and it kind of worked.
“The big thing this project speaks to is this classic debate on nurture versus nature. What is built into the child and what can be acquired through experience out in the world?” says Wai Keen Vong, a researcher at the NYU Center for Data Science. To find out, Vong and his team pushed an AI algorithm through the closest possible equivalent of early human childhood. They did this by feeding it a dataset called SAYCam-S, which is filled with first-person video footage taken by a camera strapped to a baby named Sam, recorded while Sam was doing usual baby things between the ages of six and 25 months.
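For readers curious about the mechanics, systems like this are commonly trained with a contrastive objective: the model sees a video frame alongside the words spoken around the same moment, and learns to pull matching frame-and-utterance pairs together while pushing mismatched ones apart. The sketch below is a toy illustration of that general idea in PyTorch, not the NYU team's actual code; the encoders, dimensions, and random tensors standing in for SAYCam-S footage and parent speech are all placeholders.

```python
# Toy sketch of contrastive frame/utterance training, assuming a CLIP-style
# setup. All data, encoders, and sizes here are illustrative stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FrameEncoder(nn.Module):
    """Toy vision encoder: flattens a small frame into a unit-length embedding."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, dim))

    def forward(self, frames):
        return F.normalize(self.net(frames), dim=-1)

class UtteranceEncoder(nn.Module):
    """Toy language encoder: averages word embeddings over an utterance."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)

    def forward(self, token_ids):
        return F.normalize(self.embed(token_ids).mean(dim=1), dim=-1)

def contrastive_loss(frame_emb, text_emb, temperature=0.07):
    """Pull matching frame/utterance pairs together, push mismatches apart."""
    logits = frame_emb @ text_emb.t() / temperature
    targets = torch.arange(len(frame_emb))
    return (F.cross_entropy(logits, targets)
            + F.cross_entropy(logits.t(), targets)) / 2

# Fake batch standing in for camera frames and the speech heard alongside them.
frames = torch.randn(8, 3, 32, 32)           # 8 small video frames
utterances = torch.randint(0, 1000, (8, 6))  # 8 utterances of 6 token IDs each

frame_enc, text_enc = FrameEncoder(), UtteranceEncoder()
loss = contrastive_loss(frame_enc(frames), text_enc(utterances))
loss.backward()  # one step's worth of learning signal
print(f"contrastive loss: {loss.item():.3f}")
```

Run over many hours of footage, a signal like this is what lets a model begin linking words to the things a child was looking at when those words were spoken.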