Degrees of freedom.
Call it deja, deja, deja, deja vu, all over again, again. The same fingers pointed, the same circular firing squads assembled, the same blame assigned, and, most of all, the same words spoken each time we Democrats suck wind.
Call it stream of nervousness.
Searching for distraction, re-reading an article from the New York Times' crack Wirecutter product review squad headlined, “Apple’s new AI features? Overhyped.”
What the Heck.
Like all things done in moderation, it's okay for advertising to sip some of its own Kool-Aid.
Lately, however, it feels more like the industry is adrift on a sea of sugary delusion and those purple, red, and oddly glowing orange icebergs up ahead are about to inflict serious damage on our theoretically unsinkable ship.
Not Sour Grapes.
Or talking out of school. At least, not entirely. But after years, okay, decades, of waiting for advertising to find and follow its better angels, I keep playing Charlie Brown to Lucy’s football-yanking set piece.
You can smell 'em from here.
I'm talking about the pundits queuing up cheek-by-jowl, pixel-to-pixel and podcast-to-podcast on the marketing lessons to be learned from the Kamala Harris phenomenon.
Sleight of mind.
Translation: AI’s Jedi mind tricks really do work on a big slice of the audience.
Okay, not okay.
Long ago and back in the day, one of my first creative hires in one of the first of the three agencies I’ve co-founded was fond of saying “it’s not the shots from in front that kill, it’s the knife from behind.”
It’s 7 am and I’d rather not be writing this.
As it happens, while waiting for the plane to button up, I clicked on Michael Farmer’s interview with an anonymous former agency ECD turned client-side brand consultant. The question on the table was whether digital and social media dominance has resulted in a decline in advertising creative quality.
Today is national caviar day.
And if there was ever a moment to ponder the circus of made-up marketing holidays, a calendar carousel of wall-to-wall brand takeovers, and the cynical P.T. Barnumesque view of consumers too well reflected in the commercialization of excess consumption, this must be it.
To our own damn selves. Or not.
While New York Times advertising coverage has fallen more than a few pegs from the glory days when Stuart Elliott surveyed his fiefdom from his usual table at the Four Seasons, the Gray Lady does manage to pay attention every now and again.
Heart unhealthy facts & split screen headaches.
The walking undead: according to Gallup, trust in the media has dropped to 32%, with Edelman’s Trust Barometer showing 64% of people believe the media will “deliberately mislead us.” Yet according to YouGov, 54% still get their news from TV, including almost 70% of people 45+. So there.
3-eyes blind.
Who says the soothsayers, savants, talking heads, and pundits trolling La Croisette for contacts and contracts get to hog the predictive fun? In dishonor of all that, I’ve been sweaty-palming my way to a list of some of the pathetic, cringe-worthy, and obviously obvious realities likely to slam into our noses—as long as we close our eyes and walk into them.
Nod and bill.
David Ogilvy put it like so: “Don’t hire a dog and then bark for yourself.” Leo Burnett said it thusly: “Any fool can write a bad advertisement, but it takes a real genius to keep his hands off a good one.” Both were nailing a singular failing that’s plagued adland since we first slithered from the primordial economy onto dry land: how we arbitrate our creative creations.
Dullards.
Just about every advertising creative worth any damned thing at all believes three things down to the nubs of their former fingernails:
Agency life is unfair. This is one of those “res ipsa” things. Unarguable.
Work long and hard enough and that career-making idea will come. Dubious, but maybe marginally defensible per Woody Allen’s zing that 90% of life is a cliché.
Last, there’s our shared article of faith that better creative makes for more effective advertising; a point that’s been surprisingly controversial ever since Whipple started screeching about TP on TV.
S’tragedy.
Once upon a time, generally after I’d done something particularly boneheaded—filling the garage with toxic fumes from a “slightly” modified Gilbert chemistry set comes to mind—my mother would recite, “Sorry doesn’t feed the dinosaurs.”
Perplexed.
For the first time in forever—or at least the year and change since ChatGPT hit the buzzworks—there’s actually something in AI-land that doesn’t come with a wisp of vapor, a hint of future-forward bait, and the distinct odor of fish. It’s called Perplexity AI, although I think it could be more properly tagged as CliffsNotes on digital steroids.
How much is that doggie in the mainframe?
For all the waxing and pontificating about AI—and we’re well past gigapixels by now—there’s one item nobody seems much dialed into.
Batman’s Belt: The Unasked FAQ of Source Agnostic Creative.
Born at the intersection of necessity and opportunity, “source agnostic” is a different way to think about creative video production—one that frees your creative concepts from traps, brick walls, and dead ends. So, what does that really mean? We thought you’d never ask.
Déjà, déjà vu all over, over again.
In “Little Gidding,” T.S. Eliot delivers one of poetry’s most evocative passages when he writes, “And the end of all our exploring/Will be to arrive where we started/And know the place for the first time.”