Some years ago, back when I was enamored of the idea that Web 2.0 and its associated ideas about technology were about to make everything better, I got a chance to interview Laurie Anderson, the New York avant-garde artist who became famous as an international pop star at the dawn of the MTV era. Anderson is remarkable for lots of reasons — regardless of what you think of “O Superman” — but for my purposes, she seemed most remarkable as one of the early pioneers who explored the intersections of art and technology.

Since everyone in those days praised the spirit of innovation and creativity bubbling up through the affordable tools that would give us podcasting, online video, blogging, and the beginnings of social media, I anticipated an enthusiastic answer when I asked Anderson what she thought of this new renaissance. She replied that she wasn’t too impressed. “Everybody’s drawing out of the same box of crayons,” she said.

Well, that sucked, I thought.

More than a decade later, of course, it’s obvious that Anderson was right. When Anderson moved to New York to become an artist in the 1970s, people were predicting the city’s imminent demise. To turn her violin into an electronic instrument, Anderson had to invent and build each component herself. She wasn’t just performing new sounds; she was actually making them, for the first time. For a starving artist in a failing city, with no assured return on investment, building the tape-bow violin was both expensive and high-risk. In other words: The exact opposite of the digital revolution of 2003-2007.

I loved my Adobe Master Collection because it gave me everything I needed to create and manipulate images, polish and distort audio, and edit and publish video. It was expensive, but it was far less expensive than buying all the analog tools I’d have needed to do all those things. I bought digital cameras, prosumer-grade camcorders, microphones, Mboxes, the works. It was all relatively low risk, since I could use the same tools for my job, then turn around and create “art projects” in my spare time. And since I had access to the web and knew how to put my work in front of cozy little audiences, that last part was easy, too. Five thousand years of Western Civilization had just delivered a motherlode of creative tools to my desk, and all I had to do to reach an audience was do the work and build my brand on social media.

Part of this faith-based approach to the future of technology was driven by the technologists themselves. Many of them arrived on the scene as coders, but bloomed in the resulting sunlight as social visionaries. The web meant that “the center was everywhere.” Digital media meant that “the network” would challenge “broadcast” for supremacy. One-way gatekeepers were in trouble. Community building was on the rise. The Long Tail was going to challenge all our old assumptions about both marketing and distribution. Democracy itself would be transformed: We would be able to get our information without having to rely on “mainstream media.” The truth would no longer be hidden, or owned. Open copyright and open systems would duke it out with the old monopolies. Artists and writers and musicians would find their niche audiences, and more people would be able to make a modest living doing things they loved.

Of course, those techtopian dreams collapsed like a bad soufflé almost simultaneously with the subprime mortgage market.

Why?

One answer is that though we each felt like we were being creative — because individually we certainly were — Anderson was correct. Viewed from outside, most of us looked like excited children drawing from the same box of crayons. Three types of successful people emerged from this crucible: the leading-edge innovators, the institutionally sponsored, and the lowest common denominator, by which I mean those who embraced the metrics of digital popularity and made junk content engineered to go “viral.” Which, if you’ll remember its meaning before social media, was a word once associated with disease.

Here’s another answer: While the creative class was enjoying its false revolution, so too was the business class.

This is certainly NOT the impression you’d have formed if you were reading (or, more likely, watching) Seth Godin, Chris Anderson, or the jointly written “Cluetrain Manifesto.” Godin described the old economy as the “Television Industrial Complex,” which generated cycles of revenue by selling “average products to average people.” Chris Anderson discovered there was more value in the aggregate tail of the sales curve. Cluetrain critiqued the old business model as being dictated from on high, then declared that “Markets Are Conversations.” Anderson (and later Kevin Kelly) proposed that networked technology would allow writers and musicians to earn a living without appealing to the mass audience. Godin proclaimed that the only way for businesses to cut through the infotainment clutter was to make products that were truly “remarkable,” and then encourage otaku consumers to make remarks about those products where others could find them.

Here’s what the business class was reading instead: Numbers.

Business has always run on numbers. But never in the history of humanity have numbers described so much of humanity.

Today we have a term for this: Big Data. By that we mean primarily the supra-human data streams emanating from the customer loyalty card at your grocery store, your browsing history, your social media profiles. David Mamet based the plot of his 1984 Pulitzer Prize-winning “Glengarry Glen Ross” on the desperate quest for “the good leads.” If the play were written today, the coveted sales tool wouldn’t be “the good leads,” purchased from some nebulous third party and doled out on index cards, but “the cross-indexed data set with the new analytics engine.” Creatives went in search of tools to make better stuff. Business went in search of tools to increase sales and reduce costs. The result: By the time the creative class was approaching its goal, the business class was shutting down that branch of R&D and moving in an entirely new direction: precision analysis.

Everyone knows that business is about profit, but the things we learned about profit in school are now being erased and rewritten at breathtaking speed. Conventional wisdom says you make profits by winning market competitions with words like Highest Quality or Greatest Innovation or Lowest Cost or Best Designed or Most Reliable, or maybe even Friendliest Customer Service. Big Data flattens all those competitions to a few basic goals: Least risk, lowest cost, largest audience, highest profits. Any business consideration that approaches a judgment based on quality is redirected into a decision based on fashion.

Competing for a share of audience attention based on the quality of content was just too risky. Better to figure out exactly what audience/market segments exist, map their purchasing patterns, and then do what the numbers say.

And let me just say: In the early stages of this current phase, that’s a pretty smart strategy.

Remember the Dallas Cowboys in the 1970s? They made the playoffs nine times, won the NFC Championship five times, and twice took home the Lombardi Trophy. No other team in the league won as many games in the Disco Decade.

That’s mostly because the Cowboys had a guy named Tex Schramm running the club. Sure, they had a Hall of Fame coach in Tom Landry. But Landry simply wouldn’t have been Landry without Schramm’s groundbreaking application of computer technology to the previously sweat-stained art of player evaluation. This was more than a decade before the advent of the personal computer, at a time when computers cost about $500,000 in today’s dollars, took up entire rooms, and required highly trained operators. It was a huge risk to sink that much money into such an unproven scouting tool, but it paid off as a huge competitive advantage.

The 1980s were another story. Though Dallas regularly made the playoffs through the middle of the decade, they never made it back to the Super Bowl. It’s not that Schramm stopped using computers in the 1980s. It’s that all the other clubs simply started using them.

Right now, the numbers say “Don’t risk money on new ideas. Repackage old ones.” Right now the numbers say, “Don’t risk $70 million on a ‘quality’ movie script that might bomb. Spend $300 million on a Marvel franchise comic-book movie — or even better, a Fast and Furious variant, with no licensing fees — that is guaranteed to make money.” The numbers say that vast swaths of American society fall into easily predictable behavioral groups. Put your money there. Repeat proven success. Minimize risk, maximize profit.

Like I said, it’s a great strategy — or it was until the 2017 Hollywood summer blockbuster season. Apparently, having everyone draw out of the same box of crayons has its limits.

Of course, Big Data is really a totem in this story — the ideal of reducing everything necessary to predict human behavior down to a few data sets and algorithms. For most businesses, there’s something else at play. Let’s call it “Little Data.”

Here’s an example: It’s 1985, and I sell bikes in Smallville. When I want to boost sales, I invest some portion of my advertising budget in the local newspaper, local radio, local television, or some type of direct mail. To purchase ads, I meet with ad salespeople, listen to pitches, and pick one or more outlets for my campaign. I spend my money, the ads run, and I monitor my sales, looking for a relationship between my sales and profit and my ad buy. Typically my results are a little hard to quantify, and maybe I think I’ll do better if I hire an advertising agency. My costs go up, but so do my profits, and I have more time to “focus on my business.”

Fast-forward 30 years to 2015. I sell bikes in Smallville. I can advertise the way my father did, on the radio, or I can run campaigns on Facebook. Radio is more expensive, but it reaches more people. Then again, I know my business. I know that while parents with kids are likely to buy a bike every year or two, most of them aren’t going to buy one of my kids’ bikes, because they’re twice as expensive as the bikes sold at big-box stores. Every time I meet with my rep at the local ad agency, he wants me to advertise kids’ bikes, because he doesn’t know my business as well as I do. If I want to sell kids’ bikes, I need a more specific message aimed at a more specific target market.

Facebook is an entirely different experience. There are no sales reps — no people at all. I get to control all the choices. I get to target where my ads are seen, when they are seen, and everything down to who sees them, based on their social media profiles. My previous advertising experience always felt like I was the mark in a Three-Card Monte hustle. On Facebook I feel in control. Smart. I’m taking what I know about my customers and reducing it to a series of numbers, with a budget and a schedule and an audience that I control. Control. Control. Control.

Numbers give us the illusion of control. You play video games? Your strength, your hit power, your health, it’s all a number. Chances of success or failure hang suspended over every decision, not vague analog risk, but precise, digital risk. Everything that can be reduced to a number can be minimized as a number, stored as a number, controlled as a number. Who cares whether I fully understand what’s being measured, or how? So long as it usually works, I feel in control. So long as my costs go down and my revenues go up, it works. And I’m in control.

I first experienced this when I was running a newspaper website, and our in-house analytics let me see what kind of traffic individual stories received. I’d been a city editor before that, and I knew, generally, from industrywide readership surveys, that most stories beyond the front page got single-digit readership. But here were the precise figures for online readership, for each story, in real time. And let me tell you, for some people, those numbers were like crack cocaine: Totally addictive, and bound to drive you insane.

As in the 1990s, when the economy boomed thanks in large part to a one-time productivity surge ignited by the first affordable, practical desktop computers, the advantages I’ve discussed here are temporary. Big Data, Little Data, and sharing the same box of low-cost crayons give you a leg up on competitors who are slow to get in the game. Eventually, as the rest of the economy catches up, there’s a reckoning. A few of the pioneers survive. More are acquired. The majority are simply trampled.

I write about this neither as a lament nor as an exhortation. Rather, this is my warning: What we are watching isn’t technology organizing itself into something higher and better, but technology driving a great race to the bottom. The Big Box of Crayons destroyed the livelihoods of all sorts of creative professionals and turned their replacements into digital serfs. Big Data redefined creativity as risk, and produced a redundant, self-referential culture and economy that is now winding down. Little Data is merely extending that process to smaller businesses, wiping out jobs and older, more personal ways of doing business.

We once believed that what we were witnessing was an explosion of creativity. We were mistaken.

We were witnessing the descent of a new form of cultural and economic entropy.

We don’t yet know where or when it will end.

The more interesting question is: Once it ends, will there be a chance for humanity to recover? Will there still be a place for a Laurie Anderson figure, someone to humanize technology and blur the lines between human and machine expression?

Or will the Big Box of Crayons, at full scale, require or desire us in any way whatsoever?
