Another wonderfully astute, highly enlightening post from one of my favorites: David McRaney at YOU ARE NOT SO SMART (I just bought his book on Kindle, in fact, so the man must be doing something right!). Here he tackles the notion of ego depletion (see also: Decision Fatigue) and how it affects us (often, without our even realizing it) as we go about our day in this ever-increasingly-complex modern world... Writers, in particular, seem to be most prone to ego depletion--when you sit down to edit a piece or, God help you, face the blank page and begin anew, the number of decisions you're making, oh-so-rapidly, all in a row, is so extraordinary (terrifying, really) that it's no wonder even the best, most seasoned writers, the pros at the very top of their game, burn out after 4 to 5 hours or so (I've noticed, over the years, that I can write for about 5 hours in a row--total concentration--before I suddenly, and almost without warning, hit a kind of "wall", much like hitting muscle failure at the gym, after which point I can keep going, but the writing all turns to mush--time to take a long break then, or wrap for the day completely...). What's fascinating to consider here, however, is the idea that ego depletion can, just like muscle exhaustion, be tracked, measured, quantified, and, under certain circumstances, even blunted or postponed significantly (by taking in more sugar, for instance), to which I say: if that means I can extend my willpower and, thus, my decision-making ability, total concentration time, and total effective WRITING time, all simply by taking more pre-scheduled breaks and/or getting a piece of fruit in, well, I'm ALL for that... Check it out--
The Misconception: Willpower is just a metaphor.
The Truth: Willpower is a finite resource.
In 2005, a team of psychologists made a group of college students feel like scum.
The researchers invited the undergraduates into their lab and asked the students to just hang out for a while and get to know each other. The setting was designed to simulate a casual meet-and-greet atmosphere, you know, like a reception or an office Christmas party – the sort of thing that never really feels all that casual.
The students divided into same-sex clusters of about six people each and chatted for 20 minutes using conversation starters provided by the researchers. They asked things like “Where are you from?” and “What is your major?” and “If you could travel anywhere in the world, where would you go?” Researchers asked the students beforehand to make an effort to learn each other’s names during the hang-out period, which was important, because the next task was to move into a room, sit alone, and write down the names of two people from the fake party with whom the subjects would most like to be partnered for the next part of the study. The researchers noted the responses and asked the students to wait to be called. Unbeknownst to the subjects, their choices were tossed aside while they waited.
The researchers – Roy F. Baumeister, C. Nathan DeWall, Natalie J. Ciarocco and Jean M. Twenge of Florida State, Florida Atlantic, and San Diego State universities – then asked the young men and women to proceed to the next stage of the activity in which the subjects would learn, based on their social skills at the party, what sort of impression they had made on their new acquaintances. This is where it got funky.
The scientists individually told the members of one group of randomly selected people, “everyone chose you as someone they’d like to work with.” To keep each person in the wanted group isolated, the researchers also told each person the groups were already too big and he or she would have to work alone. Students in the wanted group proceeded to the next task with a spring in their step, their hearts filled with moonbeams and fireworks. The scientists individually told each member of another group of randomly selected people, “I hate to tell you this, but no one chose you as someone they wanted to work with.” Believing absolutely no one wanted to hang out with them, people in this group then learned they would have to work by themselves. Punched in the soul, their self-esteem dripping with inky sludge, the people in the unwanted group proceeded to the main task.
The task, the whole point of going through all of this as far as the students knew, was to sit in front of a bowl containing 35 mini chocolate-chip cookies and judge those cookies on taste, smell, and texture. The subjects learned they could eat as many as they wanted while filling out a form commonly used in corporate taste tests. The researchers left them alone with the cookies for 10 minutes.
This was the actual experiment – measuring cookie consumption based on social acceptance. How many cookies would the wanted people eat, and how would their behavior differ from the unwanted? Well, if you’ve had much contact with human beings, and especially if you’ve ever felt the icy embrace of being left out of the party or getting picked last in kickball, your hypothesis is probably the same as the one put forth by the psychologists. They predicted the rejects would gorge themselves, and so they did. On average the rejects ate twice as many cookies as the popular people. To an outside observer, nothing was different – same setting, same work, similar students sitting alone in front of scrumptious cookies. In their heads though, they were on different planets. For those on the sunny planet with the double-rainbow sky, the cookies were easy to resist. Those on the rocky, lifeless world where the forgotten go to fade away found it more difficult to stay their hands when their desire to reach into the bowl surfaced.
Why did the rejected group feel motivated to keep mushing cookies into their sad faces? Why is it, as explained by the scientists in this study, that social exclusion impairs self-regulation? The answer has to do with something psychologists now call ego depletion, and you would be surprised to learn how many things can cause it, how often you feel it, and how much in life depends on it. Before we get into all of that, let’s briefly discuss the ego.
Three men doing time in Israeli prisons recently appeared before a parole board consisting of a judge, a criminologist and a social worker. The three prisoners had completed at least two-thirds of their sentences, but the parole board granted freedom to only one of them. Guess which one:
Case 1 (heard at 8:50 a.m.): An Arab Israeli serving a 30-month sentence for fraud.
Case 2 (heard at 3:10 p.m.): A Jewish Israeli serving a 16-month sentence for assault.
Case 3 (heard at 4:25 p.m.): An Arab Israeli serving a 30-month sentence for fraud.
There was a pattern to the parole board’s decisions, but it wasn’t related to the men’s ethnic backgrounds, crimes or sentences. It was all about timing, as researchers discovered by analyzing more than 1,100 decisions over the course of a year. Judges, who would hear the prisoners’ appeals and then get advice from the other members of the board, approved parole in about a third of the cases, but the probability of being paroled fluctuated wildly throughout the day. Prisoners who appeared early in the morning received parole about 70 percent of the time, while those who appeared late in the day were paroled less than 10 percent of the time.
The odds favored the prisoner who appeared at 8:50 a.m. — and he did in fact receive parole. But even though the other Arab Israeli prisoner was serving the same sentence for the same crime — fraud — the odds were against him when he appeared (on a different day) at 4:25 in the afternoon. He was denied parole, as was the Jewish Israeli prisoner at 3:10 p.m., whose sentence was shorter than that of the man who was released. They were just asking for parole at the wrong time of day.
There was nothing malicious or even unusual about the judges’ behavior, which was reported earlier this year by Jonathan Levav of Stanford and Shai Danziger of Ben-Gurion University. The judges’ erratic judgment was due to the occupational hazard of being, as George W. Bush once put it, “the decider.” The mental work of ruling on case after case, whatever the individual merits, wore them down. This sort of decision fatigue can make quarterbacks prone to dubious choices late in the game and C.F.O.’s prone to disastrous dalliances late in the evening. It routinely warps the judgment of everyone, executive and nonexecutive, rich and poor — in fact, it can take a special toll on the poor. Yet few people are even aware of it, and researchers are only beginning to understand why it happens and how to counteract it.
Decision fatigue helps explain why ordinarily sensible people get angry at colleagues and families, splurge on clothes, buy junk food at the supermarket and can’t resist the dealer’s offer to rustproof their new car. No matter how rational and high-minded you try to be, you can’t make decision after decision without paying a biological price. It’s different from ordinary physical fatigue — you’re not consciously aware of being tired — but you’re low on mental energy. The more choices you make throughout the day, the harder each one becomes for your brain, and eventually it looks for shortcuts, usually in either of two very different ways. One shortcut is to become reckless: to act impulsively instead of expending the energy to first think through the consequences. (Sure, tweet that photo! What could go wrong?) The other shortcut is the ultimate energy saver: do nothing. Instead of agonizing over decisions, avoid any choice. Ducking a decision often creates bigger problems in the long run, but for the moment, it eases the mental strain. You start to resist any change, any potentially risky move — like releasing a prisoner who might commit a crime. So the fatigued judge on a parole board takes the easy way out, and the prisoner keeps doing time.
Decision fatigue is the newest discovery involving a phenomenon called ego depletion, a term coined by the social psychologist Roy F. Baumeister in homage to a Freudian hypothesis. Freud speculated that the self, or ego, depended on mental activities involving the transfer of energy. He was vague about the details, though, and quite wrong about some of them (like his idea that artists “sublimate” sexual energy into their work, which would imply that adultery should be especially rare at artists’ colonies). Freud’s energy model of the self was generally ignored until the end of the century, when Baumeister began studying mental discipline in a series of experiments, first at Case Western and then at Florida State University.
It is easy to get worked up over remakes and prequels and sequels these days, but it's also not terribly productive. This is the modern Hollywood film industry in the year 2011, and you can either accept that or you can rail against it, but either way, they're going to keep on doing business this way until there is a compelling reason for them to not do business this way.
I wrote about my experience at Comic-Con this summer with the "Prometheus" panel, and certainly I hope that film delivers something special when it is released next year. I am willing to walk into it open-minded, especially since it's not like the "Alien" franchise is this untouched, pristine thing. Any time your iconic creation has already been roughed up behind the bleachers by Paul "Show me on the teddy bear where he touched your favorite movie" W.S. Anderson, it's fair game for anyone. Besides, having Ridley Scott back in the world that he helped create in the original 1979 film is interesting, no doubt about it.
But that "helped create" is important, and something to consider today as the news breaks that once again, Ridley Scott is planning to revisit one of the SF worlds he was part of with a "follow-up" to "Blade Runner" being announced this morning. And while I'm a big fan of the 1982 film, I think the notion of any sequel or prequel in that world is a terrible one. Awful. Catastrophically bad.
The simple truth is that not all films are franchises, and not every narrative can support a sequel or a prequel. This disturbing idea that has taken hold that we need to wring every drop of creative juice out of any film that has ever attracted any audience of any size is, quite honestly, death. This is what the death throes of studio filmmaking look like, folks, and the only real or substantial thing that film fans can do is grab a bag of marshmallows to roast as the whole thing goes up in flames. People love to point at the occasional fluke like "Inception" as proof that the system isn't broken beyond repair, but the only reason that film happened was because Christopher Nolan made a remake, which convinced the studio he was responsible enough for them to trust him with a reboot, and then he made a sequel to his reboot that made a billion dollars. And for that, finally, they "rewarded" him with the opportunity to make something he wrote. That ended up making the studio some $800 million, which is great, and which guarantees him more freedom. So far, he's used that freedom to sign on to direct another sequel while producing, yes, another reboot. This is the guy film fans love to hold up as an example for how to do it right in Hollywood, but so far, what I see is a very good filmmaker who is still having to navigate the same blood-filled waters as everyone else. He does it well, certainly, but he's still stuck in the same box that other filmmakers are, and his work hasn't changed the system at all. If anything, he's given the studios more ammunition to prove that what they are doing is right. It works. It's the correct model to follow.
Ridley Scott may never set foot on a set for a "Blade Runner" follow-up. Signing a deal is one thing, while making the actual film is something totally different. There's a long way to go before that film is a real and tangible thing. And in that time, they may end up deciding not to ever roll film, something that's happened with plenty of in-development projects, particularly with things Ridley Scott has been attached to over the years. After all, I'm not sitting down this summer to a big-screen giant-budget version of "The Forever War," so just because he says he's going to direct something, that doesn't mean it will really get a greenlight.
With "Blade Runner," though, there is a special level of anxiety that the announcement brings. I've said before that the real problem with filmmakers who go back to continue screwing around with a film after it's been in release is that filmmakers often have no understanding of what it is that an audience loves about a film. Once you've released it, you have to stop touching it, because further adjustments could well erase the thing that made it important to someone. You could screw up a character or the timing of a sequence or a thematic point, and the various versions of "Blade Runner" perfectly highlight that problem. When I first got Internet access in 1994, I was amazed to find people in newsgroups debating ideas like "Was Deckard a replicant in 'Blade Runner'?," especially since I know from firsthand experience in 1982 that general audiences totally rejected the film. That ambiguity, and the way the film left room for interpretation, was one of the reasons it lingered so well. When Ridley Scott started playing around with the movie and adding new effects and tinkering with it after the brief release of the Workprint version, all of a sudden that ambiguity started getting a lot less ambiguous, and Scott seemed determined to answer the question for us. I found it infuriating, but at least I knew I still had the original version of the film to go back to. If Scott's planning to return to the world of the movie, I'm afraid of him creating something which will not just rob that first movie of any and all ambiguity, but which will make me wonder if what I saw in the original film was ever really there at all. He can't erase the original from existence, but he can absolutely destroy my interest in the narrative, and I'm afraid that when it comes to "Blade Runner," he's the last person I want to see playing around with that property.
The Misconception: You procrastinate because you are lazy and can’t manage your time well.
The Truth: Procrastination is fueled by weakness in the face of impulse and a failure to think about thinking.
Netflix reveals something about your own behavior you should have noticed by now, something which keeps getting between you and the things you want to accomplish.
If you have Netflix, especially if you stream it to your TV, you tend to gradually accumulate a cache of hundreds of films you think you’ll watch one day. This is a bigger deal than you think.
Take a look at your queue. Why are there so damn many documentaries and dramatic epics collecting virtual dust in there? By now you could draw the cover art to “Dead Man Walking” from memory. Why do you keep passing over it?
Psychologists actually know the answer to this question, to why you keep adding movies you will never watch to your growing collection of future rentals, and it’s the same reason you believe you will eventually do what’s best for yourself in all the other parts of your life, but rarely do.
A study conducted in 1999 by Read, Loewenstein and Kalyanaraman had people pick three movies out of a selection of 24. Some were lowbrow, like “Sleepless in Seattle” or “Mrs. Doubtfire.” Some were highbrow, like “Schindler’s List” or “The Piano.” In other words, it was a choice between movies that promised to be fun but forgettable and movies that promised to be memorable but require more effort to absorb.
After picking, the subjects had to watch one movie right away. They then had to watch another in two days and a third two days after that.
Most people picked “Schindler’s List” as one of their three. They knew it was a great movie because all their friends said it was. All the reviews were glowing, and it earned dozens of the highest awards. Most didn’t, however, choose to watch it on the first day.
Instead, people tended to pick lowbrow movies on the first day. Only 44 percent went for the heavier stuff first. The majority tended to pick comedies like “The Mask” or action flicks like “Speed” when they knew they had to watch their first selection forthwith.
Planning ahead, people picked highbrow movies 63 percent of the time for their second movie and 71 percent of the time for their third.
When they ran the experiment again but told subjects they had to watch all three selections back-to-back, “Schindler’s List” was 13 times less likely to be chosen at all.
The researchers had a hunch people would go for the junk food first, but plan healthy meals in the future.
Many studies over the years have shown you tend to have time-inconsistent preferences. When asked if you would rather have fruit or cake one week from now, you will usually say fruit. A week later when the slice of German chocolate and the apple are offered, you are statistically more likely to go for the cake.
This is why your Netflix queue is full of great films you keep passing over for “Family Guy.” With Netflix, the choice of what to watch right now and what to watch later is like candy bars versus carrot sticks. When you are planning ahead, your better angels point to the nourishing choices, but in the moment you go for what tastes good.
As behavioral economist Katherine Milkman has pointed out, this is why grocery stores put candy right next to the checkout.
This is sometimes called present bias – being unable to grasp that what you want will change over time, and that what you want now isn’t the same thing you will want later. Present bias explains why you buy lettuce and bananas only to throw them out later when you forget to eat them. This is why when you are a kid you wonder why adults don’t own more toys.
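The time-inconsistent preferences described here have a standard formalization in behavioral economics: the quasi-hyperbolic ("beta-delta") discounting model. Below is a minimal sketch of it in Python. The utilities and parameter values are entirely invented for illustration — they don't come from any of the studies above — but they show how one consistent rule can make the same person prefer fruit when planning ahead and cake in the moment.

```python
# A toy sketch of present bias using quasi-hyperbolic (beta-delta)
# discounting. All numbers are made up for illustration.

def discounted_value(utility, delay, beta=0.7, delta=0.95):
    """Today's value of a payoff arriving `delay` periods from now.
    beta < 1 imposes an extra penalty on anything that isn't
    immediate, which is what produces present bias."""
    if delay == 0:
        return utility
    return beta * (delta ** delay) * utility

def choose(cake_delay, fruit_delay, cake_utility=10.0, fruit_utility=11.0):
    """Cake pays off the moment it's eaten; fruit's slightly larger
    payoff (standing in for the health benefit) lands one period later."""
    cake = discounted_value(cake_utility, cake_delay)
    fruit = discounted_value(fruit_utility, fruit_delay)
    return "fruit" if fruit > cake else "cake"

# Planning a week ahead, both payoffs are distant, so the bigger one wins:
print(choose(cake_delay=7, fruit_delay=8))   # fruit
# Deciding right now, the immediate treat escapes the beta penalty:
print(choose(cake_delay=0, fruit_delay=1))   # cake
```

The reversal comes entirely from `beta`: it discounts every non-immediate payoff by the same factor, so it cancels out when both options are in the future but heavily favors whichever option pays off right now.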
Earlier this year, just 2,300 of 32,000 applicants to Stanford University were accepted — a rate of 7.2%, the lowest in the school's history.
The students who survived this screening are phenomenally accomplished. A quarter had SAT math scores higher than 780, and over 90% had high school G.P.A.'s above 3.75, which works out, more or less, to straight A's over four years of schooling. And these weren't easy A's: the average applicant to a top-tier university takes an overwhelming volume of demanding AP or IB-level courses. (Not surprising, considering that the Stanford admissions department ranks the "rigor of secondary school record" as "very important" in its decision.)
If you eliminate recruited athletes and the children of the rich and famous from this pool — categories that receive special consideration — these numbers become even starker. In short, for the average, middle-class American high school senior, applying to Stanford is like playing the lottery.
Which is why Michael Silverman proves baffling.
When Michael, a student from Paradise Valley, Arizona, applied to Stanford, his G.P.A. put him in the bottom 10% of accepted students. His SAT scores fell similarly short. "Standardized testing isn't my strong point," he told me. Perhaps more surprising, Michael avoided the crushing course load that diminishes the will of so many college hopefuls, instead taking only a single AP course during the dreaded junior year. He kept his extracurricular schedule equally clean — joining no clubs or sports and dedicating his attention to no more than one outside project at any given time.
Michael's rejection of the no pain, no gain ethos surrounding American college admissions is perhaps best summarized by his habit of ending each school day with a 1 – 2 hour hike to the summit of nearby Camelback Mountain. While his peers worked slavishly at their killer schedules, Michael took in the view, using his ritual as a time to "chill out and relax."
Despite this heretical behavior, Michael was still accepted at Stanford. To understand why, I will turn your attention to a little-known economics theory that changes the way we think about impressiveness. To get there, however, we'll start at an unlikely location: the competitive world of professional opera singers.
The Opera Singer and the Valedictorian
Juan Diego Florez cemented his reputation as a top operatic tenor during a 2008 performance of Gaetano Donizetti's La Fille du Regiment. Among professional singers, Donizetti's masterpiece is known as "the Mount Everest of opera"; a reputation due, almost entirely, to a devilishly tricky aria, "Ah! Mes amis, quel jour de fete," that arrives early in the first act. The aria demands that the tenor hit nine high C's in a row — a supremely difficult feat. To avoid embarrassment, most performers resort to the far easier natural C.
In his 2008 performance of Donizetti, at the Metropolitan Opera House, Florez hit all nine notes. The acclaim was so overwhelming that he was summoned back to the stage for an encore, overturning the Met's long-standing ban on the practice.
As a top opera singer, Florez presumably does well for himself financially (likely on the order of 5-digit paydays per performance), but not lavishly well. Put another way: he's well-off but not wealthy.
Then there are the superstars.
In 1972, a young tenor by the name of Luciano Pavarotti also made a name for himself performing Donizetti at the Met. Like Florez, he too hit the high C's. But there was something extra in Pavarotti's voice. The audience at the Met in 1972 did more than demand an encore from Pavarotti; they weren't content until he had returned to the stage seventeen times! In writing about Florez's 2008 performance, the New York Times noted: "If truth be told, it's not as hard as it sounds for a tenor with a light lyric voice like Mr. Florez to toss off those high C's…[I]n the early 1970's, when Luciano Pavarotti…let those high Cs ring out, that was truly astonishing."
In other words, both Florez and Pavarotti are exceptional tenors, but Pavarotti was slightly better — the best among an elite class. The impact of this small difference, however, was huge. Whereas we estimated that Florez was well off but not wealthy, when Pavarotti died in 2007, sources estimated his estate to be worth $275 to $475 million.
In a 1981 paper published in the American Economic Review, the economist Sherwin Rosen worked through the mathematics that explains why superstars like Pavarotti reap so many more rewards than peers who are only slightly less talented. He called the phenomenon "The Superstar Effect."
Though the details of Rosen's formulas are complex, the intuition is simple: Imagine a million opera fans who each have $10 to spend on an opera album. They're trying to decide whether to buy an album by Florez or Pavarotti. Rosen's theory predicts that the bulk of the consumers will purchase the Pavarotti album, thinking, roughly: "although both singers are great, Pavarotti is the best, and if I can only get one album I might as well get the best one available." The result is that the vast majority of the $10 million goes to Pavarotti, even though his talent advantage over Florez is small.
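Rosen's intuition can be caricatured in a few lines of code. The sketch below is deliberately extreme — it assumes every fan picks the single best singer, which overstates the "bulk of the consumers" in Rosen's actual model — and the talent scores, fan count, and prices are all invented. It only illustrates the winner-take-most logic, not the 1981 mathematics.

```python
# A toy caricature of the Superstar Effect: fans each buy one album
# and always pick the most talented performer available, so revenue
# piles onto the top name no matter how small the talent gap.
# All numbers are invented for illustration.

def market_revenue(talents, n_fans=1_000_000, price=10):
    """Return each performer's revenue when all fans buy only the best."""
    best = max(talents, key=talents.get)
    return {name: (n_fans * price if name == best else 0)
            for name in talents}

revenue = market_revenue({"Pavarotti": 10.0, "Florez": 9.9})
print(revenue)  # a 1% talent edge captures the full $10 million
```

The point of the caricature: the payoff gap is driven by the *ranking*, not by the size of the talent difference, which is exactly why being the best — even barely — pays so disproportionately.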
Once identified, The Superstar Effect turned up in a variety of unexpected settings, from the sales of books to CEO salaries. It was found to apply even in settings that have nothing to do with financial transactions. In a particularly compelling example, a researcher named Paul Attewell, publishing in the journal Sociology of Education in 2001, studied the Superstar Effect for high school valedictorians.
Attewell imagined two students, both with 700s on their various SAT tests. The first student was the valedictorian and the second student was ranked number five in the class. Rationally speaking, these two students are near identical — the difference in G.P.A. between the number one and number five rank is vanishingly small. But using statistics from Dartmouth College, Attewell showed that the valedictorian has a 75% chance of acceptance at this Ivy League institution while the near-identical fifth-ranked student has only a 25% chance.
In other words, in many fields, it pays disproportionately well to be not just very good, but the best.
Hacking the Superstar Effect
Taking a step back, we likely agree that it's an interesting finding that being the best has a hidden advantage. If reaping this advantage, however, requires becoming class valedictorian or honing a brilliant singing voice — both staggeringly difficult feats — it doesn't seem all that applicable.
This is where Michael Silverman reenters the picture.
The details of his story reveal a crucial addendum that makes the power of the Superstar Effect available to most people. I call this addendum The Superstar Corollary, and it's here I turn your attention next.
I discovered The Superstar Corollary in an unlikely setting: the extracurricular lives of high school students. I was researching a book on students, like Michael, who get accepted to outstanding colleges while still living low-stress and interesting lives. During this research, I kept noticing the same trait in these teen-aged lifehackers: they had accomplishments that triggered The Superstar Effect, but which, on closer examination, turned out not to require a rare natural talent or years and years of grinding work.
Brute force seldom works with haters. Redirection does.
I also gave a short keynote at The NextWeb about how to deal with haters, protect yourself from (some) media, respond to FlipCams, and other personal branding self-defense 101.
Think you have crazy people contacting you or commenting on your blog? Me too. I share some of my favorite hater e-mails, Amazon reviews, and voicemails. It’ll make you feel better to hear the stories.
It is possible to learn to love haters. But it does take some know-how and tactical planning…
I elaborated on a few points in an interview in the Netherlands with Amy-Mae Elliot, who originally posted them on Mashable in her piece Tim Ferriss: 7 Great Principles for Dealing with Haters:
1. It doesn’t matter how many people don’t get it. What matters is how many people do.
“It’s critical in social media, as in life, to have a clear objective and not to lose sight of that,” Ferriss says. He argues that if your objective is to do the greatest good for the greatest number of people or to change the world in some small way (be it through a product or service), you only need to pick your first 1,000 fans — and carefully. “As long as you’re accomplishing your objectives, that 1,000 will lead to a cascading effect,” Ferriss explains. “The 10 million that don’t get it don’t matter.”
2. 10% of people will find a way to take anything personally. Expect it.
“People are least productive in reactive mode,” Ferriss states, before explaining that if you are expecting resistance and attackers, you can choose your response in advance, as opposed to reacting inappropriately, which, Ferriss says, will only multiply the problem. “Online I see people committing ‘social media suicide’ all the time in one of two ways: first, by responding to all criticism, meaning you’re never going to find time to complete important milestones of your own, and second, by responding to things that don’t warrant a response.” This, says Ferriss, only lends critics more credibility by driving traffic to them.
3. “Trying to get everyone to like you is a sign of mediocrity.” (Colin Powell)
“If you treat everyone the same and respond to everyone by apologizing or agreeing, you’re not going to be recognizing the best performers, and you’re not going to be improving the worst performers,” Ferriss says. “That guarantees you’ll get more behavior you don’t want and less you do.” That doesn’t mean never respond, Ferriss goes on to say, but be “tactical and strategic” when you do.
4. “If you are really effective at what you do, 95% of the things said about you will be negative.” (Scott Boras)
“This principle goes hand-in-hand with number two,” Ferriss says. “I actually keep this quote in my wallet because it is a reminder that the best people in almost any field are almost always the people who get the most criticism.” The bigger your impact, explains Ferriss (whose book is a New York Times, WSJ and BusinessWeek bestseller), and the larger the ambition and scale of your project, the more negativity you’ll encounter. Ferriss jokes he has haters “in about 35 languages.”
5. “If you want to improve, be content to be thought foolish and stupid.” (Epictetus)
“Another way to phrase this is through a more recent quote from Elbert Hubbard,” Ferriss says. “‘To avoid criticism, do nothing, say nothing, and be nothing.’” Ferriss, who holds a Guinness World Record for the most consecutive tango spins, says he has learned to enjoy criticism over the years. Ferriss, using Roman philosophy to expand on his point, says: “Cato, who Seneca believed to be the perfect Stoic, practiced this by wearing darker robes than was customary and by wearing no tunic. He expected to be ridiculed, and he was; he did this to train himself to be ashamed only of those things that are truly worth being ashamed of. To do anything remotely interesting you need to train yourself to be effective at dealing with, responding to, even enjoying criticism… In fact, I would take the quote a step further and encourage people to actively pursue being thought foolish and stupid.”
"I had reservations about making art a business," the famous art collector Mary Boone once said. "But I got over it."
Such is the tension within all artistic industries -- film, painting, theater or music -- the idea of selling out dogs them all. Are the high prices that paintings go for at Sotheby's or films sell for at Sundance indicative of their success, or their impurity? And how do you distinguish the "true" art from the art that's just hyped? Do the two have to be mutually exclusive?
The recent documentary "Exit Through the Gift Shop" takes up these questions and then some. Ever since its "surprise" Sundance premiere in January, the film has generated a considerable amount of attention. Supposedly directed by British street-art provocateur Banksy -- famous for his political and controversial acts of graffiti, such as painting on Israel's West Bank Barrier -- much of the buzz has circled around questions of the film's veracity: Was the film's protagonist, a French videomaker-turned-artist named Thierry Guetta, just a fabrication? Was the entire project yet another infamous Banksy prank?
But whether the film is real or staged or somewhere in between misses the point: "Exit Through the Gift Shop" -- as its title suggests -- is ultimately a lacerating critique of the commercialization of art, making it the latest in a new wave of documentaries that focus on the struggles of artists and art aficionados to define the value of art in a world dominated by profit motives and capitalist enterprise. As the recently released "The Art of the Steal" makes strikingly apparent in its chronicle of Philadelphia's power grab of a private collection of impressionist masterworks, art is big business.
It's no surprise that Banksy also raises the ugly specter of art's commodification in his debut film. After his works sold at Sotheby's in 2007 for record-breaking amounts for a young artist, he posted a painting of an auction house on his website with the caption, "I can't believe you morons actually buy this shit."
One could pose a similar question to the patrons of abstract expressionist artist Marla Olmstead, the four-year-old painter at the center of Amir Bar-Lev's 2007 documentary "My Kid Could Paint That." Like "Exit Through the Gift Shop," which contrasts art that's heralded as legitimate (from Banksy) with work that is depicted as a rip-off (by Guetta), Bar-Lev's film addresses a similar conflict. Are Olmstead's paintings true expressions of childhood genius, or is her art guided by her father, an amateur painter, and then exploited for profit as the work of a prodigy?
2006's "Who the #$&% Is Jackson Pollock?" starts with a matching quandary. The film opens with an image of an abstract expressionist painting and the voiceover: "Is this a genuine honest-to-god no-doubt-about-it American masterpiece, possibly worth up to $50 million? Maybe." In a former female truck driver's quest to make millions off an alleged Pollock she bought at a thrift shop, the film explores the ambiguities inherent in the validation of a piece of art. While art experts claim the painting is a cheap knock-off, the woman and her family hire forensic scientists to prove the work to be Pollock's based on fingerprint analysis. Despite the high-brow art world's unwavering refusal to acknowledge the art as legitimate, bids for the drip painting go from $2 million to $9 million. (As of last reporting, the painting was still awaiting higher offers.)
Ultimately, "Exit," "Kid" and "Pollock" leave the question of their art's authenticity up for the audience to decide -- it's actually this ambiguity that helps construct the films' central conflicts and mysteries. But by the movies' final frames, a few things become clear: quality art is difficult to define, the people who buy it (and buy into it) are often ignorant about what makes it worthwhile, and the background of the artists may be more important to observers and consumers than the artwork itself. There may be no more ironic display of such misguided celebrification and misunderstanding of art than the array of young L.A. hipster-fashionistas in "Exit" captured on camera declaring brand-new art-star Guetta's laughably derivative debut show "a revelation."
These issues are nothing new in the art world, of course. "It's always been there," says arts journalist David D'Arcy. "You're not just selling a work of art for what it is; you're selling it as an abstract painting by a child. It's not so different from selling a painting by a serial killer. You're selling an autograph," continues D'Arcy. "When Basquiat died of an overdose in 1988, it had to be his shrewdest career move. Modigliani, Frida Kahlo, same thing. You can sell martyrdom. Would these pictures mean anything if we didn't have the biography? It's almost like having the footnotes."
If personality has supplanted quality, who gets to determine art's "quality" in the first place? Or to borrow the title of another recent doc, about Henry Geldzahler, the Met's first curator of contemporary art, "Who Gets to Call It Art?"
"Who gets to call it art is still a relevant question," says art-world and museum veteran Karl Katz, who is also an executive producer on "Who Gets To Call it Art?" and another recent art-doc, "Herb and Dorothy," which looks at two unlikely art collectors, a retired postal worker and librarian, who humbly amassed a multi-million-dollar collection of minimalist and conceptual art. "There is such a proliferation of art now that you have to turn to a museum or their chief curator. Who the hell knows what art is," adds Katz. "But if a curator wants to call it art, then it's art."
The long tail is famously good news for two classes of people: a few lucky aggregators, such as Amazon and Netflix, and six billion consumers. Of those two, I think consumers earn the greater reward from the wealth hidden in infinite niches.
But the long tail is a decidedly mixed blessing for creators. Individual artists, producers, inventors, and makers are overlooked in the equation. The long tail does not raise the sales of creators much, but it does add massive competition and endless downward pressure on prices. Unless artists become large aggregators of other artists' works, the long tail offers no path out of the quiet doldrums of minuscule sales.
Other than aim for a blockbuster hit, what can an artist do to escape the long tail?
One solution is to find 1,000 True Fans. While some artists have discovered this path without calling it that, I think it is worth trying to formalize. The gist of 1,000 True Fans can be stated simply:
A creator, such as an artist, musician, photographer, craftsperson, performer, animator, designer, videomaker, or author - in other words, anyone producing works of art - needs to acquire only 1,000 True Fans to make a living.
A True Fan is defined as someone who will purchase anything and everything you produce. They will drive 200 miles to see you sing. They will buy the super deluxe re-issued hi-res box set of your stuff even though they have the low-res version. They have a Google Alert set for your name. They bookmark the eBay page where your out-of-print editions show up. They come to your openings. They have you sign their copies. They buy the t-shirt, and the mug, and the hat. They can't wait till you issue your next work. They are true fans.
To raise your sales out of the flatline of the long tail you need to connect with your True Fans directly. Another way to state this is, you need to convert a thousand Lesser Fans into a thousand True Fans.
Assume conservatively that your True Fans will each spend one day's wages per year in support of what you do. That "one-day-wage" is an average, because of course your truest fans will spend a lot more than that. Let's peg that per diem each True Fan spends at $100 per year. If you have 1,000 fans, that sums up to $100,000 per year, which, minus some modest expenses, is a living for most folks.
One thousand is a feasible number. You could count to 1,000. If you added one fan a day, it would take only three years. True Fanship is doable. Pleasing a True Fan is pleasurable, and invigorating. It rewards the artist to remain true, to focus on the unique aspects of their work, the qualities that True Fans appreciate.
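The back-of-envelope arithmetic above can be sketched in a few lines of code. This is a minimal model of the essay's numbers, not anything from the original; the function names and the 365-day year are my own assumptions, while the $100/year spend and one-fan-a-day rate come straight from the text.

```python
# Hypothetical model of the 1,000 True Fans arithmetic.
# Figures ($100/fan/year, 1 fan/day) come from the essay;
# function names are illustrative, not from the source.

def annual_income(true_fans: int, spend_per_fan: float = 100.0) -> float:
    """Yearly gross income if each True Fan spends `spend_per_fan` dollars."""
    return true_fans * spend_per_fan

def years_to_acquire(target_fans: int, fans_per_day: float = 1.0) -> float:
    """Years needed to reach `target_fans` at a steady acquisition rate."""
    return target_fans / fans_per_day / 365

print(annual_income(1000))               # 100000.0
print(round(years_to_acquire(1000), 1))  # 2.7 -- "only three years"
```

The point of running the numbers is that both quantities scale linearly: doubling either the fan count or the per-fan spend doubles the living.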
The key challenge is that you have to maintain direct contact with your 1,000 True Fans. They are giving you their support directly. Maybe they come to your house concerts, or they are buying your DVDs from your website, or they order your prints from Pictopia. As much as possible you retain the full amount of their support. You also benefit from the direct feedback and love.
The technologies of connection and small-time manufacturing make this circle possible. Blogs and RSS feeds trickle out news, and upcoming appearances or new works. Web sites host galleries of your past work, archives of biographical information, and catalogs of paraphernalia. Diskmakers, Blurb, rapid prototyping shops, Myspace, Facebook, and the entire digital domain all conspire to make duplication and dissemination in small quantities fast, cheap and easy. You don't need a million fans to justify producing something new. A mere one thousand is sufficient.
This small circle of diehard fans, which can provide you with a living, is surrounded by concentric circles of Lesser Fans. These folks will not purchase everything you do, and may not seek out direct contact, but they will buy much of what you produce. The processes you develop to feed your True Fans will also nurture Lesser Fans. As you acquire new True Fans, you can also add many more Lesser Fans. If you keep going, you may indeed end up with millions of fans and reach a hit. I don't know of any creator who is not interested in having a million fans.
But the point of this strategy is to say that you don't need a hit to survive. You don't need to aim for the short head of best-sellerdom to escape the long tail. There is a place in the middle, that is not very far away from the tail, where you can at least make a living. That mid-way haven is called 1,000 True Fans. It is an alternate destination for an artist to aim for.
Young artists starting out in this digitally mediated world have another path other than stardom, a path made possible by the very technology that creates the long tail. Instead of trying to reach the narrow and unlikely peaks of platinum hits, bestseller blockbusters, and celebrity status, they can aim for direct connection with 1,000 True Fans. It's a much saner destination to hope for. You make a living instead of a fortune. You are surrounded not by fad and fashionable infatuation, but by True Fans. And you are much more likely to actually arrive there.
Trying to control, or even manage, your online reputation is becoming increasingly difficult. And much like the fight by big labels against the illegal sharing of music, it will soon become pointless to even try. It’s time we all just give up on the small fights and become more accepting of the indiscretions of our fellow humans. Because the skeletons are coming out of the closet and onto the front porch.
We’ll look back on the good old days, when your reputation was really only on the line on eBay, via confirmed, actual transactions, and on LinkedIn, where you can simply reject anyone who leaves bad feedback on your professional life.
Today we have quick-fire and semi- or completely anonymous attacks on people, brands, businesses and just about everything else. And it is all becoming increasingly findable on the search engines. Twitter, Yelp, Facebook, etc. are the new printing presses, and absolutely everyone, even the random wingnuts, has access.
That picture of you making out with two guys in college up on Facebook. Or perhaps doing a bong hit after winning a few Olympic gold medals. The random slam against your restaurant anonymously left by the owner of the competitor around the corner. The Twitter flame about how bad a driver you are, complete with a link to a picture of your license plate.
And it’s about to get a lot worse. Next week a startup is launching that’s effectively Yelp for people (look for our coverage in a few days). If someone has something good or bad to say about you, they’ll be able to do it anonymously and with very little potential legal or social fallout.
We’ve seen services like this in the past. Rapleaf and iKarma come to mind. But they were flawed – Rapleaf now collects and sells data about people, and iKarma seems to be little more than a realtor focused service. Another service, Gorb, has vanished completely.
But something tells me this new service, or some other one, might succeed where the others have failed. We’re primed and ready now and have lots of experience publishing all those random opinions about people and things on Twitter, Yelp and Facebook already. It’s time for a centralized, well organized place for anonymous mass defamation on the Internet. Scary? Yes. But it’s coming nonetheless.
My management team was bickering. Two managers in particular: Leo and Vincent. Both of their projects were fine. Both of their teams were producing, but in any meeting where they were both representing their teams, they just started pushing each other’s buttons. Every meeting on some trivial topic:
Leo: “Vincent, are you on track to ship the tool on Wednesday?”
Vincent: “We’re on schedule.”
Leo: “For Wednesday?”
Vincent: “We’ll hit our schedule.”
Endless passive aggressive verbal warfare. Two type A personalities who absolutely hated to be told what to do. My 1:1s with each of them were productive meetings and when I brought up the last Leo’n’Vincent battle of the wills, they immediately started pointing at their counterpart: “I really don’t know what his problem is.”
I do. They didn’t trust each other.
On the Topic of Trust
There’s a question out there regarding how close you want to get with your co-workers in your job. One camp employs a policy of “professional distance”; this camp believes it is appropriate to keep those they work with at arm’s length.
The managerial reason here is more concrete than the individual reasoning. Managers are representatives or officers of the company and, as such, may be asked to randomly enforce the will of the business. Who gets laid off? Why doesn’t this person get a raise? How much more does this person get? Professional distance or not, these responsibilities will always give managers an air of otherness.
Here’s my question: do you or do you not want to be the person someone trusts when they need help? Manager or not, do you see the act of someone trusting you as fitting with who you are?
Yes, there’s a line that needs to be drawn between you and your co-workers, but artificially distancing yourself from the people you spend all day, every day with seems like a good way to put artificial barriers between yourself and the people you need to get your job done.
Is that who you are or who you want to work for?
The topic of trust is where I draw a line in both my personal and management philosophy. My belief is that a team built on trust and respect is vastly more productive and efficient than one where managers are distant supervisors and co-workers are 9-to-5 people you occasionally see in meetings. You’re not striving to be everyone’s pal; that’s not the goal. The goal is a set of relationships where there is a mutual belief in each other’s reliability, truth, ability, and strengths.
And it’s something you can build with a card game.
BAB is pronounced how you think; it rhymes with crab. It’s an acronym for a game which, with practice, will knit your team together in unexpected ways. It’s Back Alley Bridge. Here are the rules, but before I explain why this game is a great team-building exercise, you need to understand a few of them.
BAB isn’t bridge. The game does have a few important similarities. First, it’s a game for four players, involving two teams — the folks facing each other are on the same team and share their score. Second, it’s a trick-based game where the goal is for each team to get as many tricks as possible. A trick is won when each player turns up a card and the highest wins, unless someone plays a trump suit, which, in the case of BAB, is always spades.
Bidding. Also like bridge, BAB has bidding, meaning each team bids how many tricks they think they’re going to get after the cards have been dealt. Scoring is optimized to reward teams who get the number of tricks they bid and heavily punishes those who don’t get their bid. Bidding is a blind team effort — you have no idea what your teammate has in their hand other than what you can infer from their bid.
Decreasing hand count. Unlike bridge, the number of cards each player gets decreases with each hand. Each player gets 13 cards in the first hand, 12 in the second, and so on. Play continues down to a single card and then heads back up to 13. A work-friendly modification I’ve made is to only play every other hand (13-11-9, etc.) This number of hands fits nicely into a lunch hour.
Hail Mary. There are two special bids: Board and Boston. A bid of Board indicates the team is going to take every single trick. A bid of Boston indicates the team intends to take the first six. Achieving a Board or Boston can be an impressive feat and is rewarded handsomely from a scoring perspective. Failure results in a scoring beat-down. Both of these special bids allow for wild variances in the score, which can be handy for teams who are falling behind.
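The deal schedule described above (13 cards down to 1 and back up to 13, or every other hand for the lunch-hour variant) can be sketched in a few lines. This is my own illustration of the rule as stated, assuming nothing about BAB beyond the text; `deal_schedule` is a name I made up, not an official term.

```python
# Sketch of the BAB deal schedule: hand sizes shrink from 13 down to 1,
# then grow back up to 13. The lunch-hour variant skips every other hand
# (step=2). Function name is illustrative, not from the game's rules.

def deal_schedule(step: int = 1, max_hand: int = 13) -> list[int]:
    """Hand sizes for one full game, descending then mirroring back up."""
    down = list(range(max_hand, 0, -step))
    # Mirror back up without repeating the smallest hand at the turn.
    return down + down[-2::-1]

print(deal_schedule(2))  # [13, 11, 9, 7, 5, 3, 1, 3, 5, 7, 9, 11, 13]
print(len(deal_schedule()))  # 25 hands in the full game
```

The every-other-hand variant halves the hand count, which is why it fits inside a lunch hour.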
Scoring, game play, and other information are in the complete rules. Now, let me explain why I picked this game as a recurring weekly lunch meeting.
In BAB, you talk shit. I’ve landed BAB in three different teams now and in each case, the amount of trash talking that showed up once players became comfortable with the game was impressive. This is a function of my personality, but it’s also a byproduct of any healthy competition amongst bright people. It’s also a sign of a healthy team. I’ll explain.
Trash talking is improvisational critical thinking — it’s the art of building comedy in the moment with only the immediate materials provided. As I’m looking for candidates for my next BAB game, I’m looking for two things: who will be able to talk trash and who needs to receive it?
Dan Hesse, the CEO of Sprint, is back making commercials for his company. In the latest version, he is doing more of what he did in previous ads - selling on price. The top guy in the company, the big boss, numero uno, looks straight into the camera and tells you Sprint's latest calling plan is better than the competition's. That's what the most senior person in the company wants us to know about his company - they're cheap.
On the opposite side of the spectrum is Phil Knight, the charismatic founder and former CEO of Nike. Knight was the keynote speaker at a conference and, like the CEO of Sprint, he too made a case for why you should choose Nike over the competition. But Knight took a different approach. He didn’t say what Nike does or how they are better. And he certainly didn't attempt to differentiate the company based on price. Instead, he told a story that explains Why Nike exists.
Looking across the audience, Knight asked those who run to stand up. And a good percentage of the room stood up. Then he asked those who run three or more times a week to keep standing; everyone else was asked to sit down.
Looking out at the people left standing, Knight said, "we are for you."
"When you get up at 5 o’clock in the morning to go for a run," he went on, "even if it’s cold and wet out, you go. And when you get to mile 4, we’re the one standing under the lamp post, out there in the cold and wet with you, cheering you on. We’re the inner athlete. We’re the inner champion.”
Without a single mention of their latest technologies or which athletes wear their products, Knight makes a vastly more compelling case for Why we want Nike in our lives. Nike may or may not be better, but we are drawn to them because they have a cause. They know, and we know, Why they do what they do. The same cannot be said for Sprint and so many other companies.
Phil Knight knows Why Nike exists and he tells us. It is the same purpose, cause or belief that inspires his employees as well as his customers. “Just Do It” is more than a tag line, it’s a motto. It’s a cheer. It’s a rallying cry. Are Sprint employees inspired to be cheap?
The mistake Mr. Hesse and so many other marketers make is that they tell us what the company does and how they think they are better, but there is not a single mention of Why the company exists in the first place. And it’s the Why that matters most in a purchase decision. People are not attracted to what you do; they are drawn to Why you do it. And Why is what truly differentiates one company from another.