When Harry Potter receives an invisibility cloak as a Christmas gift, he uses it to conceal himself from Hogwarts teachers and the nasty caretaker Argus Filch. Now, researchers from Facebook AI and the University of Maryland have introduced a 21st-century version: sweatshirts printed with adversarial examples that make the wearer undetectable to the AI-powered object detectors in today’s public surveillance systems.
Ian Goodfellow, the renowned research scientist who pioneered generative adversarial networks (GANs), describes adversarial examples as “inputs to machine learning models that an attacker has intentionally designed to cause the model to make a mistake.” In the new study, the researchers printed adversarial examples on sweatshirts and other items to “attack” object detectors, causing them to fail to recognize their targets in images and videos.
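To make the idea concrete, here is a minimal sketch of one classic way such an input can be crafted, the fast gradient sign method (FGSM) introduced by Goodfellow and colleagues. The pretrained classifier and the tensor shapes below are generic placeholders for illustration, not the setup used in the new study.

```python
import torch
import torch.nn.functional as F
import torchvision.models as models

# A generic pretrained classifier standing in for any victim model.
model = models.resnet50(weights="IMAGENET1K_V1").eval()

def fgsm_attack(image, label, epsilon=0.03):
    """Nudge `image` so the classifier is more likely to mislabel it.

    FGSM takes a single step of size `epsilon` in the direction of the
    sign of the loss gradient with respect to the input pixels.
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()  # keep pixels in a valid range

# Usage: `x` is a (1, 3, 224, 224) image tensor, `y` its true class index.
# x_adv = fgsm_attack(x, y)
```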
Fooling object detectors is much more difficult than fooling classifiers. As the researchers explain: “the ensembling effect of thousands of distinct priors, combined with complex texture, lighting, and measurement distortions in the real world, makes detectors naturally robust.”

This April, Synced reported on research from the Belgian university KU Leuven which demonstrated how an adversarial attack using a colorful 40 sq cm printed patch could significantly lower the accuracy of object detectors. And in August we covered research from Lomonosov Moscow State University and Huawei Moscow Research Center, which proposed a wearable card designed to conceal a person’s identity from facial recognition systems. Both of these efforts were limited to 2D printed patches, while the new study extends the method to the more practical but challenging realm of clothing and 3D objects.


The researchers “trained” their attack patches on a random subset of 10,000 person-containing images from the large COCO dataset. They first evaluated the patches in simulated digital settings: white-box attacks, where the victim detector’s weights are available for patch learning, and black-box attacks, where patches are crafted on a surrogate model and then tested on a victim model with different parameters. All the trained patches proved highly effective in the digital simulations.
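At a high level, patch training inverts the usual detector training loop: instead of updating the model to raise detection confidence, gradient descent updates the patch pixels to suppress it. The sketch below illustrates this idea only; `detector`, `paste_patch`, and `data_loader` are hypothetical placeholders, not the authors’ code, and the random transforms stand in for the printing and lighting distortions a physical attack must survive.

```python
import torch
import torchvision.transforms as T

# Hypothetical placeholders, assumed for illustration:
#   detector(images)        -> per-box "person" confidence scores
#   paste_patch(images, p)  -> images with patch p rendered onto each person
#   data_loader             -> batches of COCO images containing people

patch = torch.rand(3, 300, 300, requires_grad=True)  # learnable patch pixels
optimizer = torch.optim.Adam([patch], lr=0.01)

# Random transforms approximate real-world distortions (rotation, lighting),
# the "expectation over transformation" trick common in physical attacks.
augment = T.Compose([
    T.RandomRotation(degrees=10),
    T.ColorJitter(brightness=0.3, contrast=0.3),
])

for images, _ in data_loader:
    optimizer.zero_grad()
    attacked = paste_patch(images, augment(patch.clamp(0, 1)))
    scores = detector(attacked)   # confidence of every detected "person" box
    loss = scores.mean()          # push all person detections toward zero
    loss.backward()               # gradients flow back into the patch pixels
    optimizer.step()
```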


The researchers then moved on to physical-world attacks, applying their adversarial examples to posters, paper dolls (folded printouts of test images at different scales), and sweatshirts. The wearable attacks significantly degraded the performance of SOTA object detectors across different environments.
The experiments show that such digital attacks can transfer between models, classes, and datasets, and also into the real world, although with less reliability than attacks on simple classifiers.
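Transferability of this kind is typically quantified by crafting a patch against one surrogate detector and then measuring how far average precision (AP) drops on a different victim detector. A hedged sketch, reusing the hypothetical helpers from above:

```python
# Black-box transfer check: the patch was trained against a surrogate model
# and is evaluated here on a different victim detector it never saw.
# `evaluate_ap`, `victim_detector`, and `paste_patch` are illustrative stand-ins.
clean_ap = evaluate_ap(victim_detector, test_images)
patched_images = [paste_patch(img, trained_patch) for img in test_images]
attacked_ap = evaluate_ap(victim_detector, patched_images)
print(f"AP under transfer attack: {clean_ap:.1%} -> {attacked_ap:.1%}")
```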
The paper Making an Invisibility Cloak: Real World Adversarial Attacks on Object Detectors is on arXiv.
Author: Yuqing Li | Editor: Michael Sarazen