Innovation Monitor: Beauty is in the eye of the algorithm | by NYC Media Lab | June 2021
Welcome to this week’s Innovation Monitor.
Innovation in augmented reality is a passion of ours and of our community (check out our previous editions). AR adapts complex technology for fun, everyday uses that surprise and delight. We love Wallflower, a beautiful and playful AR app created by a team from the recent NYC Media Lab Synthetic Media and Storytelling Challenge. It is a great example of how, especially during the pandemic, filters provided a wonderful respite from the gravity of what we faced during family FaceTimes and Zooms.
At NYC Media Lab, we accept the rewards and recognize the risks of emerging technologies. As we explore and celebrate the new creative expressions of immersive and creative technology offerings, we will also consider the unintended consequences of rapid technological innovation.
In this week’s edition, we take a look at how augmented reality-based face filters have exploded in popularity with relatively little guidance. What societal implications are most urgent, especially since these tools collect our biometric information? How does this culture of filters, followers and instant celebrity impact the development and mental health of children and youth? Read on to discover ways people use AR that can be both surprising and disturbing.
As always, stay safe and thank you for reading. If you received this email from a friend, you can easily register here!
All the best,
Erica Matsumoto

Filter your life

In his 1996 magnum opus Infinite Jest, the late David Foster Wallace wrote an eerily premonitory passage: in the book, the telecommunications industry designs “high definition masking”, starting with “flattering multi-angle photos” that end up resulting in “a tight polybutylene resin… mask.” Today we also wear masks, but in the form of filters.
The beauty filter is rooted in selfie culture, which flourished in the mid-90s in Japan’s “kawaii” scene, particularly with purikura, “photo booths that allowed customers to decorate self-portraits,” according to MIT Tech Review.
AR filters themselves started out innocently enough. It can be argued that Snapchat’s face swap sparked interest from users and media alike, although the underlying technology had been around for a decade. An early standout example arrived in 2011, when media artist Kyle McDonald’s face-substitution research with Arturo Castro blew people away:
Things took off in 2015 when Snapchat added Lenses (aka Filters), and in 2016 when the company made face swapping a viral phenomenon. The app also had the youngest demographic among the social giants of the era. And that’s where the concern lies:
“The amount of biometric data that TikTok, Snapchat and Facebook have collected through these filters [is worrying]. While Facebook and Snapchat say they don’t use filter technology to collect personally identifiable data, a review of their privacy policies shows they do have the right to store data from photographs and videos on their platforms.”
Besides very real privacy concerns, researchers are still unsure about the lasting impact of continued filter use on our mental health. When Snapchat launched in 2011, selfies went from a by-product to a mainstream mode of communication. In 2013, “selfie” entered the Oxford Dictionary. Today, more than 90% of young people in the US, France and UK use Snapchat’s AR features. For some, AR filters are more than a makeup substitute; they are a lifeline:
“Caroline Rocha, makeup artist and photographer, says social media filters provided her with a lifeline at a crucial time. In 2018, she was at a personal low … [filters gave her] the chance to travel… to experiment, to try on makeup, to try on a piece of jewelry.”
Rocha has become a popular Lens maker among hundreds of thousands of Snapchat filter creators. Her first hit was “Alive”, dedicated to her fight against mental illness.
But filters can also become a compulsive habit: Rocha says that some people “refuse to be seen without these filters because in their minds they think they look like that.” Even the creative direction of Lens creators is changing, Rocha says, with an emphasis on embellishment for “the money and the fame.”
“There is a bad mood in the community. It’s all about the fame and the number of followers, and I think it’s sad, because we make art, and it’s about our emotions… It’s very sad what’s going on at the moment.”
According to Claire Pescott, a researcher who studies the behavior of tween girls on social media, young girls use AR filters primarily for beautification: “[In a study, the girls] would all say things like, ‘I put this filter on because I have perfect skin. It removes my scars and blemishes.’ And they were 10 and 11 years old.”

Facial distortion

Filters don’t just smooth the skin; they perform facial distortions, much like digital cosmetic surgery: enlarging the lips, raising the eyebrows, narrowing the jaw, slimming the nose and widening the eyes. It is unsettling.
Instagram in fact banned plastic-surgery filters in 2019 due to their potential negative impact, then brought them back with some limitations in 2020. (As CNBC noted in 2018, “even if Snapchat or Instagram removed its filters, other apps would simply take their place.”)
Changing your character or avatar online, especially at a young age, can lead to behavioral change. Coined by Nick Yee and Jeremy Bailenson of Stanford (Bailenson is the founding director of Stanford’s Virtual Human Interaction Lab), the Proteus effect is the hypothesis that “an individual’s behavior conforms to their digital self-representation independently of how others perceive them.”
“Through different behavioral measures and different representational manipulations, we observed the effect of altered self-representation on behavior. Participants who had more attractive avatars exhibited greater self-disclosure and were more willing to approach strangers of the opposite sex after less than a minute of exposure to their altered avatar. In other words, the attractiveness of their avatars impacted how intimate participants were prepared to be with a stranger.”
When asked what he thinks about the effects of AR filters on his two daughters, Bailenson said it’s “a really tough question, because it flies in the face of everything we learn in all of our basic cartoons, which is ‘Be yourself.’”

The eye of the algorithm

A quick Google search yields a collection of startups from around the world offering facial-recognition and analysis APIs, leveraging AI for use cases like beauty scoring, makeup recommendations, dating apps and more. The largest open facial-recognition platform in the world, Face++, has a beauty-scoring AI.
With thousands of filters available that can turn your appearance into a popular preset look, along with algorithms that rate how well you achieve that look, beauty starts to feel like a commodity. This does not bode well for young people. According to a report from the Royal Society for Public Health, which surveyed 1,500 social media users aged 14-24:
“Each platform other than YouTube was associated with increased user anxiety and depression. In fact, the two most image-centric platforms, Snapchat and Instagram, ranked last for user well-being, especially when it comes to bullying and [FOMO]. And in news that won’t surprise anyone who has looked at the #thinstagram hashtag, Instagram performed poorly on body image and anxiety.”
Social media algorithms don’t help. In fact, they’ve come under continual criticism for promoting this commodified version of beauty by excluding or flagging people of color and people with disabilities. Adore Me’s viral tweet thread, which showed how TikTok deleted videos featuring its plus-size, Black, and disabled lingerie customers, is a prime example:
This week in business history
June 14, 1822: Charles Babbage presents his Difference Engine, which would lay the groundwork for the future of computing.
That day, Babbage unveiled what would become the first design for a mechanical calculating machine. The British government funded its construction, which was never completed but which paved the way for modern computing. While the Difference Engine was never finished, the “Analytical Engine” that followed became a precursor to our modern understanding of computing and software. This newsletter explored this topic, and Ada Lovelace, a few editions ago; to jog your memory:
After spending a subsidy equal to “the cost of two large warships,” the inventor discovered that in the early 1800s no one could make the necessary parts. (Someone eventually funded the construction of the Difference Engine… in the 1990s.)
It was around this time – 1833 – that Babbage, 41, met 17-year-old Lovelace (who went by Ada Byron at the time). Lovelace was fascinated by Babbage’s Difference Engine – and understood how it worked – and the two kept in touch.
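The idea the Difference Engine mechanized is the method of finite differences: for any polynomial, the highest-order differences of its table of values are constant, so once the initial differences are loaded, every subsequent value falls out of repeated addition alone, with no multiplication needed. A minimal Python sketch of that principle (the function name and layout are our own, not Babbage's notation):

```python
# Method of finite differences: tabulate a polynomial using only additions,
# the same trick Babbage's Difference Engine performed with gears.

def tabulate(initial_differences, steps):
    """initial_differences = [f(0), Δf(0), Δ²f(0), ...] for a polynomial;
    the highest-order difference is constant, so each new table value
    is produced by cascading additions alone."""
    diffs = list(initial_differences)
    values = [diffs[0]]
    for _ in range(steps):
        # Fold each higher-order difference into the one below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
        values.append(diffs[0])
    return values

# Example: f(x) = x² has f(0)=0, Δf(0)=f(1)-f(0)=1, Δ²f=2 (constant).
print(tabulate([0, 1, 2], 5))  # → [0, 1, 4, 9, 16, 25]
```

Each turn of the engine's crank corresponded to one pass of the inner loop: every column adds its neighbor into itself, and the bottom column reads out the next entry of the table.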