Experts discovered that facial-recognition systems are allegedly using people’s random online photos without their permission. They claim that these technologies scrape random users’ online images so that developers can use them for further enhancements and improvements.
This is currently a serious privacy breach, since most people now do daily tasks online. Because of the ongoing COVID-19 pandemic, many individuals are forced to stay inside their houses, leaving them no choice but to communicate with their friends and family online.
Security researchers claimed that facial-recognition technology could be used to identify individuals even when they are offline. Because of this serious privacy issue, they created a website to help raise people’s awareness of what is happening.
Experts explained that facial-recognition tech needs to gather a huge number of random online photos, specifically ones containing users’ faces, so that developers can further refine the technology. Until now, it has been nearly impossible to tell whether your images are being used by these systems.
New anti-facial-recognition tool
Now that the new tool called “Exposing.ai” is available, you can check whether your and your friends’ online photos are being used by facial-recognition systems without your permission.
However, the new site’s scope is currently limited: it only searches for images uploaded to Flickr, a video- and photo-sharing site. The tool covers more than 3.6 million photos across six facial-recognition image datasets: VGG Face, MegaFace, IJB-C, FaceScrub, DiveFace, and PIPA.
Using this new online software is a piece of cake. All you need to do is copy your Flickr username or your photo’s URL, paste it into the website’s search bar, and then check the “I Read and Agree to the Terms of Use” option.
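At its core, a lookup like this amounts to matching a Flickr photo ID against the ID lists published with each dataset. The snippet below is a minimal, hypothetical sketch of that idea: the dataset names come from the article, but the `DATASETS` table, its placeholder photo IDs, and the `find_photo` helper are invented for illustration and do not reflect Exposing.ai’s actual implementation.

```python
# Hypothetical sketch: check whether a Flickr photo ID appears in any
# facial-recognition dataset. The ID sets here are made-up placeholders;
# a real tool would index millions of IDs taken from dataset metadata.

DATASETS = {
    "MegaFace": {"50123456789", "50111111111"},
    "FaceScrub": {"50123456789"},
    "IJB-C": {"49999999999"},
}

def find_photo(photo_id: str) -> list[str]:
    """Return the names of every dataset containing this photo ID."""
    return [name for name, ids in DATASETS.items() if photo_id in ids]

print(find_photo("50123456789"))  # -> ['MegaFace', 'FaceScrub']
```

A set-membership check like this is fast even over millions of IDs, which is presumably why the site can search its 3.6 million indexed photos almost instantly.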
Adam Harvey, an artist and researcher, said that “it’s easiest to understand when it becomes more personal.” He added that “sometimes it helps to have visual proof.”
Harvey created the new online software together with Jules LaPlace, a fellow artist and programmer. They also collaborated with the non-profit Surveillance Technology Oversight Project (STOP).
The new software has several features that help you identify whether your online photos are being used without your permission. If photos are found, Exposing.ai will show you a thumbnail of each, along with the month and year it was posted to your Flickr account and the number of matching images in each dataset.
Is the new tool accurate?
Harvey explained that the new software doesn’t cover that many online photos, because most developers and tech firms don’t reveal the methods they use to acquire the random online images that feed their technology. He added that the 3.6 million photos are just the tip of the iceberg.
If this privacy breach continues, it could lead to negative outcomes. Authorities could rely on the technology to identify suspects, and misidentifications could leave people wrongly accused of crimes and violations. It is also alarming that facial-recognition technologies could be used in the wrong ways, especially since racism is still an issue in the United States.