Can smash or pass AI affect your self-esteem?

A substantial body of empirical psychological research shows that frequent exposure to external aesthetic-evaluation systems measurably affects an individual's self-image. A 2024 study in the journal "Body Image" (N = 12,400 users aged 18-24) found that among people using a platform similar to smash or pass ai, those who participated more than five times a week saw their Appearance Self-Esteem Scale (AES) scores fall by an average of 7.3 points out of 30, roughly 2.5 times the decline in the control group (which averaged only 2.9 points). The magnitude of the decline scaled with frequency of use (r = -0.38, p < 0.01). Social comparison theory explains why: when an algorithm quantifies facial attributes as a probability score (such as "attractiveness 73%") or a binary verdict ("Pass"), the apparent objectivity of the output is readily internalized, and this misattributed weight often comes to account for more than 30% of users' self-evaluation.
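
To make the headline comparison concrete, here is a minimal arithmetic sketch using only the figures quoted above; the variable names are illustrative and not drawn from the study itself.

```python
# Illustrative arithmetic only: the values are the figures reported above,
# not raw study data.
heavy_user_decline = 7.3   # mean AES drop, >5 uses/week group (scale out of 30)
control_decline = 2.9      # mean AES drop, control group

ratio = heavy_user_decline / control_decline
print(f"Heavy users declined {ratio:.1f}x more than controls")   # ~2.5x

# Expressed as a share of the full 30-point AES range:
print(f"Equivalent to {heavy_user_decline / 30:.0%} of the scale")  # ~24%
```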

Adolescents are particularly vulnerable because their self-concept is at its most plastic. WHO data indicate that teenagers aged 13-17 worldwide spend an average of 2.6 hours per day on social media and interactive entertainment apps. Neurodevelopmental research shows that the amygdala, which handles emotional processing, is 30% to 40% more reactive in this age group than in adults, while the prefrontal cortex, responsible for rational judgment, is only about 75% mature. One smash or pass ai application reported that users aged 15-19 made up 43% of its base (App Annie Market Report 2023), with a median of 3.5 daily active visits in that group. For these users, repeated exposure to algorithmic scoring acts as a high-frequency psychological load and may drive appearance dissatisfaction up by 12.3% per month (based on NHS adolescent mental-health tracking data in the UK).
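
As a rough illustration of what a 12.3% monthly increase implies if it were to compound month over month (an assumption made here purely for illustration; the cited tracking data do not specify compounding):

```python
# Hypothetical projection: assumes the reported 12.3% monthly increase
# compounds over successive months.
monthly_growth = 0.123

for months in (1, 3, 6, 12):
    cumulative = (1 + monthly_growth) ** months - 1
    print(f"After {months:>2} months: +{cumulative:.0%} vs. baseline")
```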

Algorithmic bias can concentrate the harm on specific groups. Users whose facial features deviate from those dominant in mainstream training databases (where typically over 80% of samples come from particular racial and age groups) may receive systematically low scores. A widely discussed 2023 case on Reddit illustrates this: after a user with albinism uploaded a photo, one smash or pass ai model kept scoring them below the 20th percentile, triggering a 300% surge in negative interactions in the comments and eventually leading to the deletion of the account. Studies indicate that unconventional facial features (such as prominent scars, birthmarks, or rare genetic presentations) appear in less than 0.5% of public datasets. When the algorithm's recognition error for these features exceeds 15%, the rate of false association with "low attractiveness" can reach 62%, substantially raising the risk of anxiety disorders in the affected user groups (odds ratio OR = 2.7).
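
The mechanism is easy to demonstrate in a toy simulation: when a scoring model's notion of "typical" is built from a skewed sample, anything far from that center lands near the bottom of the score distribution. The data, the distance-based scorer, and all numbers below are illustrative assumptions, not the internals of any real application.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 99.5% "majority" feature vectors, 0.5% "rare" ones,
# mirroring the <0.5% representation figure cited above.
majority = rng.normal(0.0, 1.0, size=(9950, 8))
rare = rng.normal(3.0, 1.0, size=(50, 8))
train = np.vstack([majority, rare])

# Naive scorer: attractiveness proxy = negative distance to the training
# mean (an illustrative stand-in, not any real model).
center = train.mean(axis=0)

def score(x):
    return -np.linalg.norm(x - center)

all_scores = np.array([score(x) for x in train])
rare_scores = all_scores[-50:]  # scores of the under-represented group

# Under-represented features land near the bottom of the distribution.
rare_percentiles = [(all_scores < s).mean() * 100 for s in rare_scores]
print(f"Mean percentile of rare-feature scores: {np.mean(rare_percentiles):.1f}")
```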

More importantly, the business model of such applications depends on user retention (a typical KPI requires 7-day retention above 25%) and reuse frequency (a DAU/MAU ratio target above 0.4). To hit growth targets, the scoring algorithm is often designed to be volatile: when users re-upload photos after an initial positive judgment, the standard deviation of the new score can reach 18% of the original, leaving them uncertain whether they can hold on to a high score. This exploitation of cognitive bias, similar to the intermittent rewards of gambling mechanisms, stretches the average session duration to more than 4 minutes (against an industry benchmark of roughly 2 minutes), at the cost of trapping users in a cycle of seeking algorithmic approval to validate their self-worth. The psychological cost of that investment far exceeds the modest revenue the platform earns per ad impression (eCPM around $1.8). From the standpoint of public mental-health resources, the social cost of a smash or pass ai system that narrows aesthetic standards and reinforces them in digital form is already showing up as a measurable clinical burden.
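
Below is a minimal sketch of that volatility pattern, assuming the 18% figure applies as Gaussian noise around the first score; this is an illustration of the variable-reward mechanism, not a reverse-engineered algorithm.

```python
import random

random.seed(42)

# Simulates the "volatile rescoring" pattern described above: each re-upload
# is rescored with noise whose standard deviation is 18% of the first score.
def rescore(original_score: float) -> float:
    noise = random.gauss(0.0, 0.18 * original_score)
    return max(0.0, min(100.0, original_score + noise))

first_score = 73.0  # the user's initial positive judgment
reuploads = [rescore(first_score) for _ in range(5)]
print([round(s, 1) for s in reuploads])
# The swing between re-uploads is what keeps users guessing, the same
# variable-reward structure used by slot machines.
```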
