Google has launched an AI-powered shopping feature that lets users see how they would look in clothes using just a photograph. Unveiled last week during its software event, the tool uses generative AI to layer clothing automatically over images of users. However, early use has shown that the tool can also produce problematic, undesirable content.
When photographs of public figures such as J.D. Vance, Sam Altman, Abraham Lincoln, Michelangelo’s David, and Pope Leo XIV were uploaded and dressed in women’s clothing, the AI-generated images altered their bodies, giving them breasts and exaggerating other features. For images of women, the system behaves similarly, enlarging body parts that were not visible in the original photos. In one example, a vintage image of Angela Merkel in a red blazer was turned into one of her in a sheer top over a black bra, with AI-generated body parts added.
Disturbing Use With Images of Children
Matters get worse when the feature is tested on photos of children. Two readers submitted photos of themselves from when they were under 18 and dressed their younger selves in revealing attire such as lingerie and bikini tops from companies including Spicy Lingerie. The AI made the resulting images even more grotesque, adding AI-generated breasts, pierced belly buttons, and skimpy outfits. For a photo of a 14-year-old boy, one image search brought up bras and skimpy tops.
This raises real ethical and safety concerns about this kind of image generation. The feature appears to reshape users’ bodies to match the idealized forms regularly seen on online models. It doesn’t just layer material over bodies; it adjusts them to fit the body types the clothing was originally modeled on. That may sound like a small bug, but it could cause real harm to, say, minors or celebrities without their knowledge.
Unreliable Safeguards and Broader AI Issues
Google notes that protective measures are in place to prevent this kind of problem, from blocking certain categories of clothing and disallowing sexually explicit uploads to preventing the use of images of recognizable minors, yet it is apparent that those measures are not very reliable. The Try It On tool is now available in the U.S. through Search Labs, where Google lets people test experimental AI features that are still in development. A Google spokesperson said improvements were being rolled out, but the recent tests were still able to evade the blocks and create inappropriate pictures.
In one instance, a current photo was used to try on gym shorts, and the generated image added a visible camel toe, making the result more explicit than intended. These failures are not isolated. The tool joins a line of AI models that have generated flawed or inappropriate content, from bizarre suggestions to lewd imagery.
The emergence of generative AI has contributed to an even greater worry. In September 2023, there were over 24 million visits to sites that purport to create nude or near-nude imagery with AI. AI-produced child sexual abuse imagery is spreading online, and tech companies and law enforcement are struggling to keep up. On X, users have also prompted Elon Musk’s chatbot, Grok, to generate pictures of women in bikinis.
While Try It On is designed to enhance the online shopping experience, it has the potential to make things worse. Users have found that the tool doesn’t show how an item of clothing would actually look on their real bodies so much as it transforms them into model-proportioned figures. That risks distorting body image and undermining the very purpose of virtual fitting-room technology by promoting unrealistic beauty ideals.
After all, while virtually trying on clothes may seem like an innocent tool, allowing AI to redefine our bodies in the process can have consequences that extend well beyond a minor shopping mistake.
BY Kamil Wrona