The 11 weirdest things humans did to robots in 2024


Robots have progressed over the years from clunky hunks of metal to complex, AI-enabled machines capable of running, speaking, and even painting pictures. But even with all those advances, humans still can’t help but place robots in bizarre and uncomfortable situations.

This year, researchers took advanced robots and had them clean up karate-chopped Coke cans, suck up cigarette butts, wear a fleshy, lab-grown face, and pick up dog poo. Two-legged, humanoid robots, which could one day work on factory floors, were gut-punched and forced to wear festive clothes while performing acrobatics. Here are just a few of the oddest things we did to robots this year. 

A pet owner created a robot to autonomously scoop dog poop

Having pets can add a layer of joy to life that’s irreplaceable. That is, except for the one to two times per day that furry new bundle of joy leads you to bend over and scoop excrement off of hot concrete. A Corgi-owning Minnesota man named Caleb Olson is all too familiar with this dilemma and believes he may have created a solution: an autonomous, flying poop-collecting robot. He calls his invention the “Poopcopter.” 

The quadcopter is programmed to fly around a backyard or other predetermined area and use real-time computer vision to scan for signs of poop. Once detected, the “doo doo drone,” as Olson sometimes refers to his invention, will soar down right above its target, rotate around 30 degrees, and then use a custom-designed, 3D-printed scooper to grab and remove the waste. In his demonstration, Olson said the packaged poo could then be deposited in a designated garbage area, or maybe even on a neighbor’s roof.
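Olson hasn’t published his code, but the “soar down right above its target” step boils down to turning a camera detection into a position on the ground. Here’s a minimal sketch of that geometry, assuming a downward-facing camera and a simple pinhole model; the function name and every number are hypothetical, not Olson’s actual implementation:

```python
import math

def pixel_to_ground_offset(px, py, img_w, img_h, fov_deg, altitude_m):
    """Convert a detection's pixel position in a downward-facing camera
    image into an (x, y) ground offset in meters from the point directly
    below the drone, using a simple pinhole-camera model."""
    # Half-width of the ground footprint visible at this altitude.
    half_fov = math.radians(fov_deg / 2)
    ground_half_w = altitude_m * math.tan(half_fov)
    # Normalize pixel coordinates to [-1, 1] about the image center.
    nx = (px - img_w / 2) / (img_w / 2)
    ny = (py - img_h / 2) / (img_h / 2)
    # Scale by the footprint (assumes square pixels and a matching vertical FOV).
    return nx * ground_half_w, ny * ground_half_w

# A detection dead-center in the frame means the target is directly below.
print(pixel_to_ground_offset(320, 240, 640, 480, 90, 3.0))  # → (0.0, 0.0)
```

A real drone would feed an offset like this into its flight controller to hover over the target before rotating and scooping.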

“Whenever it detects she [Olson’s dog named Twinkie] is pooping it keeps a log of when she poops and stores an image and over time stores a location,” Olson said during a demonstration. “Which is really nice in the winter when snow covers it.” 

[ Related: Researchers tortured robots to test the limits of human empathy ]

Engineers forced this vacuum robot to suck up spent cigarettes 


Smoking cigarettes isn’t just rough on the body: it can also make a mess of the planet. In the US in 2021, an estimated 9.7 billion cigarette butts were discarded on the ground, reportedly around 20 percent of all litter for the year. Researchers from the Italian Institute of Technology (IIT) created a four-legged, vacuum-equipped robot in an effort to shrink that growing mountain of butts.

The VERO, or “Vacuum-cleaner Equipped Robot,” has a 3D-printed vacuum nozzle mounted on each of its feet, which allows it to suck up cigarette butts. VERO uses a neural network to interpret visual data from its onboard cameras. Once it detects a butt, it quickly calculates the best way to angle itself to vacuum it up. In theory, VERO could be deployed on public beaches or other outdoor areas where people tend to flick their spent smokes.
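IIT hasn’t detailed exactly how VERO decides which leg to use, but the “best way to angle itself” step is at heart a selection problem. A toy sketch, assuming (purely hypothetically) that the planner simply picks the nozzle-equipped foot nearest the detected butt:

```python
import math

def best_foot_for_target(feet_xy, target_xy):
    """Toy stand-in for a quadruped's nozzle-selection step: given the
    (x, y) positions of the four nozzle-equipped feet and a detected
    cigarette butt, return the index of the closest foot."""
    def dist(p):
        return math.hypot(p[0] - target_xy[0], p[1] - target_xy[1])
    return min(range(len(feet_xy)), key=lambda i: dist(feet_xy[i]))

# Four feet at the corners of the body; a butt detected near the rear-right.
feet = [(-0.3, 0.5), (0.3, 0.5), (-0.3, -0.5), (0.3, -0.5)]
print(best_foot_for_target(feet, (0.4, -0.6)))  # → 3 (rear-right foot)
```

The real robot must also solve for a whole-body pose that brings that foot over the target while staying balanced, which is where the actual planning difficulty lies.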

[ Related: In 1928, Eric the Robot promised the robo-butler of the future ]

Researchers had humans violently shake robots to test the limits of our empathy

Humans don’t have the best track record when it comes to showing kindness to robots. There’s a long history of engineers and everyday people kicking, beating, and generally abusing machines with cold-hearted detachment. But one researcher from Radboud University Nijmegen named Marieke Wieringa wanted to see if that same dynamic would play out if the robots being tortured could scream out in pain.

In an experiment, she had people take a small robot and shake it violently. In some cases, nothing more would happen, but other times the robot would emit a pitiful whimpering sound from its speakers while an artificial pair of eyes attempted to convey sadness. The human subjects were more likely to feel guilty when the robot cried out. In an additional experiment, Wieringa gave subjects the option of performing a boring task or shaking the robot. The crying proved a decisive factor in whether people chose the task.

“Most people had no problem shaking a silent robot, but as soon as the robot began to make pitiful sounds, they chose to do the boring task instead,” Wieringa said in a statement. 

A humanoid robot painted an image of its ‘AI God’ 


There’s still no compelling evidence that machines are “conscious” or “sentient” in the way a human is but that hasn’t stopped many from running with the idea. In one of the odder, more esoteric examples of this, artists asked an AI-enabled humanoid robot what type of painting it would hypothetically make in relation to the phrase “AI for Good.” The robot, called Ai-Da, suggested a portrait of computer scientist Alan Turing. The robot created multiple portraits which were later called “AI God.” 

The painter robot was created by Oxford University researchers and the robotics company Engineered Arts. It captures images using front-facing cameras and then uses onboard graphics algorithms to generate images. A pair of robotic arms controlling paintbrushes then translates those generated images onto paper. Whether you personally appreciate the work or not, someone found it compelling: AI God sold for $1,084,800 at a Sotheby’s auction earlier this year following a bidding war between 27 people.

Researchers covered this robot’s face in ‘living’ human-like skin 


Robots designed to resemble humans already have a tendency to make some people feel uncomfortable. Researchers from the University of Tokyo took that uneasy feeling to the next level by bioengineering lab-grown “skin” from human cells. They then applied that new layer of skin to a robot’s face. The result is an utterly horrifying, pink, goopy blob. If that weren’t enough, they also used mechanical actuators to make it look like the pink slop was smiling.

[ Related: Watch Google’s ping pong robot beat humans at their own game ]

Researchers mind-controlled a squishy robot using a mushroom 


Mind-controlling mushrooms might not be as unlikely as they sound. Earlier this year, researchers from Cornell University and the University of Florence in Italy demonstrated how electrical signals sent through mycelium could cause movements in a connected, starfish-shaped robot. In a nutshell, the team shined flashing UV strobes on the mycelium (which naturally avoids light). The fungus’s reaction to the light would then trigger the squishy robot to move its legs. The mycelium, in other words, was acting as the robot’s “brain.” In practical terms, researchers believe these kinds of robotic biohybrids could one day analyze agricultural fields on their own to monitor for potentially harmful changes in soil chemistry.

A flexible humanoid robot crushed nuts and took punches to the chest


Human-looking bipedal robots are becoming increasingly popular, with several companies including Tesla collectively spending billions to make them a reality. But it’s still not entirely clear what they will end up doing. Supporters say they could work in factories, perform dangerous tasks, or even do your laundry. One humanoid robot company called Unitree recently showed off different use cases: smashing nuts and slicing open Coke cans. 

In a video released earlier this year, the company showed its oddly flexible ‘G1 Humanoid Agent’ contouring itself into pretzel shapes and performing a variety of seemingly useful and semi-useful tasks. At one point, a researcher straps on a red boxing glove and gives the robot a few hefty jabs to the chest. It stumbles but never falls. 

Robot surgeons used tweezer-like arms to operate on pork loins


Robots are getting better and better at mimicking surgeons every year. But just like human doctors, robots can’t simply operate on humans without practice. In an odd example this year, scientists taught a surgical robot to use a small pair of tweezer-like grabbers to operate on a pork loin and a chicken thigh. If that doesn’t sound all that impressive on its own, consider that the robot was able to perform this test surgery after simply analyzing prior video footage from real medical experts. The researchers behind the robot were surprised at just how well their training method, which is similar to the process powering large language models like ChatGPT, worked in practice.

“All we need is image input and then this AI system finds the right action,” postdoctoral researcher Ji Woong Kim said. “We find that even with a few hundred demos, the model is able to learn the procedure and generalize to new environments it hasn’t encountered.”
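The researchers’ actual system trains a large model on expert video, but the core idea of learning from demonstrations can be illustrated with something far cruder: act by copying the expert action whose recorded situation most resembles the current one. A toy nearest-neighbor “policy,” entirely hypothetical and not the team’s method:

```python
def nearest_neighbor_policy(demos):
    """Toy stand-in for learning from demonstrations: given (observation,
    action) pairs recorded from experts, return a policy that copies the
    action whose observation is closest to the current one."""
    def policy(obs):
        # Squared Euclidean distance between observations.
        best = min(demos, key=lambda d: sum((a - b) ** 2 for a, b in zip(d[0], obs)))
        return best[1]
    return policy

# Observations here are 2-D tool positions; actions are expert commands.
demos = [((0.0, 0.0), "grip"), ((1.0, 1.0), "cut"), ((2.0, 0.0), "retract")]
act = nearest_neighbor_policy(demos)
print(act((0.9, 1.1)))  # → "cut"
```

Modern imitation learning replaces the lookup with a neural network so the policy can interpolate between demonstrations, which is what lets a few hundred demos generalize.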

Google taught a pair of cute mini robots to play soccer 


Researchers from Google DeepMind actually realized the dream of many sports-loving kids: they made upright robots play soccer with each other. Using deep reinforcement learning, the researchers trained a pair of robots in simulation on soccer-related training data. This process is similar to the way DeepMind previously trained AI models to beat humans at games like chess, Go, and StarCraft. But unlike those cases, the researchers then had to transfer that learning to a physical robot body. It took some time, but eventually the pair of robots were able to dribble, defend, and even shoot goals, though not without the occasional tumble here and there.
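Deep reinforcement learning at DeepMind’s scale uses neural networks and massive simulation, but the underlying trial-and-error idea can be shown with tabular Q-learning, its much simpler ancestor. A sketch on a toy one-dimensional “pitch” (every parameter here is hypothetical, chosen only to make the toy converge):

```python
import random

def train_dribbler(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2):
    """Tiny tabular Q-learning demo: an agent on a 1-D pitch learns by
    trial and error to move right toward the goal at the last cell."""
    random.seed(0)  # fixed seed so the toy run is reproducible
    q = [[0.0, 0.0] for _ in range(n_states)]  # actions: 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # Epsilon-greedy: mostly exploit, sometimes explore.
            a = random.randrange(2) if random.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2 = max(0, min(n_states - 1, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == n_states - 1 else 0.0  # reward only at the goal
            # Standard Q-learning update toward reward plus discounted future value.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train_dribbler()
# After training, moving right dominates moving left in every non-goal state.
assert all(row[1] > row[0] for row in q[:-1])
```

The soccer robots learn the same way in principle, except the “table” is a deep network over joint angles and ball position, and the learned policy then has to survive transfer from simulation onto real, wobbly hardware.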

Scientists sent a ‘Robodog’ through irradiated areas of the Large Hadron Collider

The Large Hadron Collider in Geneva is the source of important scientific discoveries, but it’s also exceptionally dangerous. The powerful particle accelerator smashes protons at nearly the speed of light, which produces radiation. That radiation is harmful to humans, but not to robots. Scientists at the European Organization for Nuclear Research (CERN) realized that and developed a four-legged robot specifically designed to crawl and scurry its way through otherwise impenetrable areas of the facility. Once inside, the robotic good boy could autonomously patrol and monitor for signs of fires or other potentially dangerous hazards.

Boston Dynamics made humanoid robots perform manual labor in festive costumes


No one has made a bigger name for themselves by putting robots in weird situations than Boston Dynamics. The company continued that trend this year with its new, smaller Atlas humanoid robot. In a demonstration of its practical abilities, the company released a video of it grabbing and moving engine covers, all autonomously. And because it was Halloween, it did all this while wearing a hot dog outfit.

[ Related: Boston Dynamics wishes you a merry terrifying robot Christmas in new video ]

Just a month later Atlas made another festive appearance, this time in a Santa Claus outfit. In that video, Atlas performed a slightly terrifying backflip reminiscent of its older, beefier predecessor. It’s unclear what the practical application of robot backflips is, but they certainly are memorable.

 
