
Generative AI mostly generates more of the same. It very rarely originates anything. It's all a bit of a beige Renaissance.
Sure, it can spit out all manner of media faster than you can finish this sentence, but it's same same, only different. Semi-polished but utterly predictable. Increasingly so, as more AI-generated content floods the media spectrum and goes into the top of the grinder to produce a billion more beiges.
You don't have to be a mathematician to know that the more averages you add, the more average anything becomes.
Audiences are feeling the same dismal sameness, the same digital déjà vu of dullness, while marketers throw around terms like personalisation-at-scale when what we're ultimately serving is your favourite flavour of beige.
The question becomes: When will we reach the tipping point where society begins to push back against the glut of formulaic AI content? When will people rekindle a taste for messy, imperfect, unmistakably human expression? And what does that pivot mean for brands that have hitched themselves too heavily to the AI hype train?
AI systems are designed to optimise and please the widest audience, but in doing so they may flatten creative diversity. Tech strategist Hamilton Mann has coined the term “AI-formization” to describe how algorithm-driven creation yields a homogenising effect in arts and media.
In other words, generative AI often regurgitates patterns from its training data, leading to content that feels uniform and repetitive. This is because large language models thrive on probability. They predict the next likely token, not the next interesting one. Stack probability on probability and surprise collapses. Content becomes a copy of a copy. Blurry and fading ever more with each iteration.
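To make that “copy of a copy” dynamic concrete, here is a minimal toy sketch in Python. It is emphatically not a real language model, and every name and number in it is hypothetical: each new “generation” is fitted only to samples of the previous one, with a mild preference for the most probable styles, and the diversity of the distribution (measured as entropy) shrinks with every pass.

```python
import numpy as np

# Toy sketch only: a "model" here is just a probability distribution over
# 50 hypothetical content "styles", not an actual language model.
rng = np.random.default_rng(0)
dist = rng.dirichlet(np.ones(50))  # generation 0: a reasonably diverse mix

def entropy_bits(p):
    """Shannon entropy in bits, used here as a rough measure of variety."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

for gen in range(6):
    print(f"generation {gen}: diversity = {entropy_bits(dist):.2f} bits")
    # Each new generation is fitted to samples of the previous one, with a
    # mild bias toward the most probable styles (the "safe", likely choices).
    samples = rng.choice(50, size=2000, p=dist)
    counts = np.bincount(samples, minlength=50).astype(float)
    sharpened = counts ** 1.5          # favour the popular, starve the rare
    dist = sharpened / sharpened.sum()
```

The exact figures are beside the point; the direction of travel is what matters. When outputs are fed back in as inputs and the likeliest option is always favoured, variety is the first casualty.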
It's a paradox. Even though AI could, in theory, broaden our cultural horizons, in practice, it often steers us toward a more monolithic, less diverse world.
Evidence is growing that both creators and consumers sense this creeping uniformity. A 2024 joint survey by Digitas and Vox Media found 65% of content creators believe generative AI is “eroding our creativity and making everything the same.” Meanwhile 81% of consumers report rising scepticism, questioning whether content they encounter is real or algorithmically generated.
In the study’s words, audiences feel the “integrity of nuanced, unpredictable human creativity is at stake” under a wave of AI outputs.
One major sociocultural impact of ubiquitous AI content is the perceived erosion of individuality. Traditionally, creative works have been valued as expressions of an individual’s perspective and life experience. By contrast, AI models have no personal life or emotions – they remix existing data.
As a result, critics argue AI-generated works often lack the emotional depth that comes from a human’s lived experience. Illustrators and digital artists observe how AI art tends toward a certain glossy, derivative style. Writers sense the algorithmic senselessness in bot-authored blogs and posts. Even boomers are starting to call out AI-generated posts and texts from lazy peers and partners.
AI raises fundamental challenges about originality and authorship in everything it generates, yet it can never claim to have truly originated anything, because an algorithm, not a person, is producing the outputs.
The danger is that we are being reduced to curators of machine media rather than creators of original media. Auto-generated media lacks the personal investment and sincerity that comes from a human choosing each proverbial word or brush stroke.
As AI content floods our feeds, society is grappling with an authenticity crisis. If any image, video, or article could have been machine-made, people start to doubt what (and who) is real.
Authenticity used to be assumed. Now it’s auditioned, A/B tested to see if it passes as human. Experiments show people strongly prefer writing or art when they believe a human made it, revealing a pervasive bias against work made by AI and a deeply rooted sense that creativity is quintessentially human, something we are reluctant to surrender.
The Digitas and Vox Media survey noted that 74% of creators feel “content authenticity is something only humans can do,” reflecting a belief that genuineness requires a human touch. Likewise, 81% of consumers in that survey reported questioning the authenticity of what they see.
And this erosion of trust has implications beyond art: it affects journalism, marketing, and interpersonal communication, among other domains.
For instance, news audiences show reluctance to accept AI-generated reporting; nearly half of Americans say they don’t want news written by AI, preferring the accountability of a human journalist’s byline. In marketing, brands are also wary: corporate leaders worry that overly synthetic AI content could ring hollow and damage brand trust.
The result is a growing emphasis on authentic content labelling and curation. We see early signs of creators and platforms marking content as “human-made” or "AI-free" to assure audiences.
In what could indeed be a watershed cultural moment, movements emphasising human, imperfect, and individual expression are gaining momentum. The very idea of a “Made by Humans” badge highlights how novel and valuable human authenticity has become in this AI era. When even personal messages might be auto-generated, a genuine human voice starts to feel like a luxury good – something to be treasured for its rarity.
Several notable trends illustrate society’s impulse to reclaim authenticity:
In a striking counter-trend, physical and analogue media are making a comeback among digital natives.
Over the past few years, vinyl record sales have surged, film photography and Polaroid cameras have enjoyed a vibrant resurgence, and even DVDs and VHS tapes have seen renewed interest.
What these have in common is a tangible, human-touch experience that purely digital content lacks. Industry analysts at MIDiA observe that as streaming made music as cheap and abundant as tap water, consumers began craving the novelty, unique experiences, and authenticity that digital platforms couldn’t provide. Owning a physical album or a limited roll of film makes the experience feel precious again, restoring a sense of personal connection and curation. MIDiA’s data suggests this is not a niche fad but a broader shift as “the pendulum is swinging back from an era of ‘digital everything’ to a more comfortable place of ‘digital sometimes’.”
In fact, rather than seeking ever-more tech, many consumers (especially Gen Z) are looking backwards – embracing “dumb phones” with no internet, paper books, and other analogue escapes to “claw back” a feeling of authenticity lost in an AI-saturated, screen-saturated world.
Mirroring the slow food ethos, content creators are advocating “slow content” – a deliberate shift away from algorithm-driven content mills toward mindful creation and consumption. This movement emphasises quality, originality, and human creativity over sheer volume and speed.
In practice, that might mean writers spending weeks crafting a thoughtful longform essay (and readers taking time to digest it) rather than churning out 10 AI-generated blog posts a day. Enthusiasts argue that slow content is an antidote to what writer Cory Doctorow calls the “enshittification” of the internet – the saturation of feeds with cheap, clickbait, AI-padded material. By focusing on depth, nuance, and the creator’s unique voice, slow content aims to restore the human “signal” in the noise.
At its heart, the slow content movement is an attempt to reclaim our attention from automated feeds. The fact that “content designed for clicks rather than communication” now overwhelms us has led some educators, journalists, and creators to consciously slow down the pace. They hope audiences, fatigued by the firehose of AI info, will gravitate toward these islands of authenticity – a well-researched investigative piece, a handcrafted zine, a thoughtful podcast – in search of real human connection.
History offers a precedent for today’s pushback. In the late 19th century, the Arts and Crafts movement arose as a backlash against industrial mass production – tastemakers like William Morris championed handmade furniture and textiles, deliberately embracing imperfections to signal human craftsmanship.
Now, in the age of AI, we see a similar premium on the handcrafted. Design experts note a historical instinct to counterbalance rapid technological change with traditional craft that still rings true today. High-end fashion houses are showcasing manual weaving and embroidery; museums are celebrating folk crafts; and brands are touting how their products preserve local artisan skills.
Even OpenAI’s CEO Sam Altman predicts that as AI makes generic goods cheap, truly human-made creations will become astronomically more expensive. Not because they are costly to produce, but because society will prize their rarity and authenticity. “There has always been a premium on handmade products... I’d expect that to intensify,” Altman told an interviewer, suggesting that things we want to be expensive (like art, real human craftsmanship) will hold even greater cachet in the future.
Perhaps the most vocal pushback has come from communities of artists, writers, and musicians who see AI as a threat to their livelihoods and the integrity of their craft. Digital artists have launched campaigns like #NoToAIArt, protesting the use of their artwork to train AI and the deluge of AI-generated images flooding social media.
They argue that AI outputs have the potential to devalue the skill of human creators.
In some cases, artists are doubling down on imperfection as a statement. Photographers have taken to exhibiting unedited film photos with dust and scratches as a rebuke to the hyper-polished look of AI imagery. Similarly, some writers now emphasise personal memoir and highly subjective perspectives that an AI could never imitate, staking out territory that feels undeniably human. Even in pop culture, we see a hunger for the unpolished, with YouTube reporting that viewers increasingly favour rough, authentic ‘behind the scenes’ content from creators, which garners nearly three times the views of slick, high-production videos.
This indicates audiences are responding to realness – the quirks and flaws of a human creator – over the seemingly perfect but impersonal content that AI (or highly edited teams) might produce.
Interestingly, these counter-movements share a common theme: a celebration of imperfection and limitation as markers of authenticity. What once might have been seen as flaws – a crackle in a vinyl record, a typo in a typed letter, a shaky handheld video – are now reassuring signs that a real person was behind the creation.
Sociologically, this can be viewed as a form of reclaiming agency: by infusing our media with human irregularities and constraints, we differentiate ourselves from machines. We find a deep resonance in imperfections and idiosyncrasies of things crafted by hand. Those quirks are something no generative model can truly emulate, because they emerge from lived experience and individual context.
Multiple sectors are adapting to ensure the human element isn’t lost. In education, essays are increasingly expected to be handwritten or orally presented, to guarantee a student’s own voice rather than a ChatGPT concoction. In hiring, recruiters have been so inundated with AI-written resumes that some companies resort to old-fashioned in-person evaluations or assessments of handwritten work, to sift genuine candidates from AI-generated noise. And in publishing, a few independent book publishers temporarily closed submissions after being flooded with formulaic AI-generated manuscripts.
Eventually, new filters or verification methods will have to distinguish human work from machine output, but the immediate reaction has been to protect human storytelling from getting drowned out.
All these responses suggest that we are indeed at a pivotal point. Digital fatigue is setting in – roughly 70% of consumers (and over 80% of young adults) report trying to cut down on screentime and digital content in the past month.
Generative AI usage is skyrocketing (over half of teens have tried ChatGPT), yet paradoxically this only increases the desire for non-digital means of differentiation and communication.
MIDiA's Q1 2024 consumer survey concluded that going analogue will not just be a novelty; it will become an increasingly necessary point of differentiation, a way to cut through the clutter and signal genuine work, built on the values of scarcity and authenticity that are becoming ever more prized in a content-saturated environment.
In other words, the more AI content bombards us, the more meaningful a human-made piece of content becomes by contrast. And this may be ushering in a renaissance of human-centred expression.
We appear to be reaching a cultural watershed moment: after initial marvelling at AI’s capabilities, society is waking up to the blandness of over-automated culture and yearning for the messy ingenuity of humanity.
The message is clear – people value what AI cannot replicate. That includes our flaws, emotions, and unique perspectives. Moving forward, we are likely to see a new equilibrium where AI becomes a useful tool alongside human creators, but not a replacement for them.
When used thoughtfully, AI can handle repetitive tasks or inspire new ideas, freeing humans to inject true creativity and individuality. But when content becomes too formulaic, audiences and creators alike instinctively rebel – nudging the culture back toward personal expression.
In a sense, the proliferation of AI has thrown into sharp relief what makes human creativity special. Authenticity, it turns out, is becoming the ultimate differentiator. In an era when anyone can generate passable art or text with a click, the most valuable currency is the genuine article – the human story, the personal touch, the imperfect but irreplaceable spark of real creativity.