Robert McKee: Story: Substance, Structure, Style and The Principles of Screenwriting
James Bonnet: Stealing Fire from the Gods: A Dynamic New Story Model for Writers and Filmmakers
Steven Katz: Film Directing Shot by Shot: Visualizing from Concept to Screen
Judith Weston: Directing Actors: Creating Memorable Performances for Film & Television
Dov S-S Simens: From Reel to Deal: Everything You Need to Create a Successful Independent Film
Another wonderfully astute, highly enlightening post from one of my favorites: David McRaney at YOU ARE NOT SO SMART (I just bought his book on Kindle, in fact, so the man must be doing something right!). Here he tackles the notion of ego depletion (see also: Decision Fatigue) and how it affects us (often without our even realizing it) as we go about our day in this ever-increasingly-complex modern world... Writers, in particular, seem to be especially prone to ego depletion--when you sit down to edit a piece or, God help you, face the blank page and begin anew, the number of decisions you're making, oh-so-rapidly, all in a row, is so extraordinary (terrifying, really) that it's no wonder even the best, most seasoned writers, the pros at the very top of their game, burn out after 4 to 5 hours or so (I've noticed, over the years, that I can write for about 5 hours in a row--total concentration--before I suddenly, and almost without warning, hit a kind of "wall", much like hitting muscle failure at the gym, after which point I can keep going, but the writing all turns to mush--time to take a long break then, or wrap for the day completely...). What's fascinating to consider here, however, is the idea that ego depletion, just like muscle exhaustion, can be tracked, measured, quantified, and, under certain circumstances, even blunted or postponed significantly (by taking in more sugar, for instance), to which I say: if that means I can extend my willpower and, thus, my decision-making ability, total concentration time, and total effective WRITING time, simply by taking more pre-scheduled breaks and/or getting a piece of fruit in, well, I'm ALL for that... Check it out--
The Misconception: Willpower is just a metaphor.
The Truth: Willpower is a finite resource.
In 2005, a team of psychologists made a group of college students feel like scum.
The researchers invited the undergraduates into their lab and asked the students to just hang out for a while and get to know each other. The setting was designed to simulate a casual meet-and-greet atmosphere, you know, like a reception or an office Christmas party – the sort of thing that never really feels all that casual.
The students divided into same-sex clusters of about six people each and chatted for 20 minutes using conversation starters provided by the researchers. They asked things like “Where are you from?” and “What is your major?” and “If you could travel anywhere in the world, where would you go?” Researchers asked the students beforehand to make an effort to learn each other’s names during the hang-out period, which was important, because the next task was to move into a room, sit alone, and write down the names of two people from the fake party with whom the subjects would most like to be partnered for the next part of the study. The researchers noted the responses and asked the students to wait to be called. Unbeknownst to the subjects, their choices were tossed aside while they waited.
The researchers – Roy F. Baumeister, C. Nathan DeWall, Natalie J. Ciarocco and Jean M. Twenge of Florida State, Florida Atlantic, and San Diego State universities – then asked the young men and women to proceed to the next stage of the activity in which the subjects would learn, based on their social skills at the party, what sort of impression they had made on their new acquaintances. This is where it got funky.
The scientists individually told the members of one group of randomly selected people, “everyone chose you as someone they’d like to work with.” To keep each person in the wanted group isolated, the researchers also told each person the groups were already too big and he or she would have to work alone. Students in the wanted group proceeded to the next task with a spring in their step, their hearts filled with moonbeams and fireworks. The scientists individually told each member of another group of randomly selected people, “I hate to tell you this, but no one chose you as someone they wanted to work with.” Believing absolutely no one wanted to hang out with them, people in this group then learned they would have to work by themselves. Punched in the soul, their self-esteem dripping with inky sludge, the people in the unwanted group proceeded to the main task.
The task, the whole point of going through all of this as far as the students knew, was to sit in front of a bowl containing 35 mini chocolate-chip cookies and judge those cookies on taste, smell, and texture. The subjects learned they could eat as many as they wanted while filling out a form commonly used in corporate taste tests. The researchers left them alone with the cookies for 10 minutes.
This was the actual experiment – measuring cookie consumption based on social acceptance. How many cookies would the wanted people eat, and how would their behavior differ from the unwanted? Well, if you’ve had much contact with human beings, and especially if you’ve ever felt the icy embrace of being left out of the party or getting picked last in kickball, your hypothesis is probably the same as the one put forth by the psychologists. They predicted the rejects would gorge themselves, and so they did. On average the rejects ate twice as many cookies as the popular people. To an outside observer, nothing was different – same setting, same work, similar students sitting alone in front of scrumptious cookies. In their heads though, they were on different planets. For those on the sunny planet with the double-rainbow sky, the cookies were easy to resist. Those on the rocky, lifeless world where the forgotten go to fade away found it more difficult to stay their hands when their desire to reach into the bowl surfaced.
Why did the rejected group feel motivated to keep mushing cookies into their sad faces? Why is it, as explained by the scientists in this study, that social exclusion impairs self-regulation? The answer has to do with something psychologists now call ego depletion, and you would be surprised to learn how many things can cause it, how often you feel it, and how much in life depends on it. Before we get into all of that, let’s briefly discuss the ego.
>> To read the rest of the article at YOU ARE NOT SO SMART click here
Thursday, September 27, 2012 at 09:34 AM in Business, Life, Philosophy, Screenwriting, Work, Writing | Permalink
As a few of us writer / actor / director types here in New York City debate the (now, it would seem) inevitable move to Los Angeles, aside from the obvious "Abandon all hope, ye who enter here..." sign which should surely be posted (just for us New Yorkers) right outside the taxi stand at LAX, perhaps a few sage words of wisdom might be in order. Check out this post my man Darby Parker just kicked my way, written by one Derek Sivers (of CD Baby fame), re: "Advice on Moving to Los Angeles". In particular I like--
Americans are already quite individualist, but Los Angeles is the most individualist part of America. Because so many people are employed by the entertainment industry, many are self-employed freelancers. They’re very focused on themselves. People talk about themselves a lot because they feel they have to, for survival, for self-promotion. Just as you can’t fault anyone in the world for doing something for survival, try not to fault them for being so self-promotional. Learn to lovingly listen like you’d listen to an 8-year-old who excitedly tells you about their train set for an hour.
...
Every culture values different things. In some places, it’s your bloodline. In others, your university. In others, it’s where you live. In LA, it’s who you know. Since the entertainment industry is all about short-term projects, everyone survives by their next project, and these projects always come from a connection. So everyone is collecting contacts. (Again: it’s survival.) Friendships are pragmatic and often short. Don’t fault them for talking about who they know, the same way you wouldn’t fault someone from India asking about your family. Introducing people to each other, people who could potentially work together, is the most valuable thing you can do, as it raises your value and theirs. LA people want (NEED!) to have powerful well-connected friends, to survive and thrive.
>> To read the rest of the article at DEREK SIVERS click here
Monday, September 26, 2011 at 03:00 PM in Life, Philosophy | Permalink
Three men doing time in Israeli prisons recently appeared before a parole board consisting of a judge, a criminologist and a social worker. The three prisoners had completed at least two-thirds of their sentences, but the parole board granted freedom to only one of them. Guess which one:
Case 1 (heard at 8:50 a.m.): An Arab Israeli serving a 30-month sentence for fraud.
Case 2 (heard at 3:10 p.m.): A Jewish Israeli serving a 16-month sentence for assault.
Case 3 (heard at 4:25 p.m.): An Arab Israeli serving a 30-month sentence for fraud.
There was a pattern to the parole board’s decisions, but it wasn’t related to the men’s ethnic backgrounds, crimes or sentences. It was all about timing, as researchers discovered by analyzing more than 1,100 decisions over the course of a year. Judges, who would hear the prisoners’ appeals and then get advice from the other members of the board, approved parole in about a third of the cases, but the probability of being paroled fluctuated wildly throughout the day. Prisoners who appeared early in the morning received parole about 70 percent of the time, while those who appeared late in the day were paroled less than 10 percent of the time.
The odds favored the prisoner who appeared at 8:50 a.m. — and he did in fact receive parole. But even though the other Arab Israeli prisoner was serving the same sentence for the same crime — fraud — the odds were against him when he appeared (on a different day) at 4:25 in the afternoon. He was denied parole, as was the Jewish Israeli prisoner at 3:10 p.m., whose sentence was shorter than that of the man who was released. They were just asking for parole at the wrong time of day.
There was nothing malicious or even unusual about the judges’ behavior, which was reported earlier this year by Jonathan Levav of Stanford and Shai Danziger of Ben-Gurion University. The judges’ erratic judgment was due to the occupational hazard of being, as George W. Bush once put it, “the decider.” The mental work of ruling on case after case, whatever the individual merits, wore them down. This sort of decision fatigue can make quarterbacks prone to dubious choices late in the game and C.F.O.’s prone to disastrous dalliances late in the evening. It routinely warps the judgment of everyone, executive and nonexecutive, rich and poor — in fact, it can take a special toll on the poor. Yet few people are even aware of it, and researchers are only beginning to understand why it happens and how to counteract it.
Decision fatigue helps explain why ordinarily sensible people get angry at colleagues and families, splurge on clothes, buy junk food at the supermarket and can’t resist the dealer’s offer to rustproof their new car. No matter how rational and high-minded you try to be, you can’t make decision after decision without paying a biological price. It’s different from ordinary physical fatigue — you’re not consciously aware of being tired — but you’re low on mental energy. The more choices you make throughout the day, the harder each one becomes for your brain, and eventually it looks for shortcuts, usually in either of two very different ways. One shortcut is to become reckless: to act impulsively instead of expending the energy to first think through the consequences. (Sure, tweet that photo! What could go wrong?) The other shortcut is the ultimate energy saver: do nothing. Instead of agonizing over decisions, avoid any choice. Ducking a decision often creates bigger problems in the long run, but for the moment, it eases the mental strain. You start to resist any change, any potentially risky move — like releasing a prisoner who might commit a crime. So the fatigued judge on a parole board takes the easy way out, and the prisoner keeps doing time.
Decision fatigue is the newest discovery involving a phenomenon called ego depletion, a term coined by the social psychologist Roy F. Baumeister in homage to a Freudian hypothesis. Freud speculated that the self, or ego, depended on mental activities involving the transfer of energy. He was vague about the details, though, and quite wrong about some of them (like his idea that artists “sublimate” sexual energy into their work, which would imply that adultery should be especially rare at artists’ colonies). Freud’s energy model of the self was generally ignored until the end of the century, when Baumeister began studying mental discipline in a series of experiments, first at Case Western and then at Florida State University.
If utopia is supposed to be the ideal and perfect place, where everyone lives in harmony, then why do so many of them turn out to suck? To get an answer, let's go to the source: Thomas More, whose 1516 travelogue Utopia gave us the word, a pun meaning "no place" and "perfect place." More's Utopia describes an island where everyone is happy and smiling and living in divinely inspired synchronization. Told with verve and a sly wit, Utopia is one of the foundational texts of contemporary science fiction as well as utopian thought.
But More wasn't just a writer of fantastic tales. He was also a politician and one-time Undersheriff of London. As such, More was not only an enthusiastic upholder of a radically unequal and oppressive social order, but also an advocate for burning 16th-century heretics. Live by the sword, die by the sword: in 1535 Henry VIII beheaded More and anyone else who didn't support his accession to Supreme Head of the Church of England. The violence of More's historical period is never far from the surface of More's island Utopia, where a single act of adultery is punishable by slavery and serial adulterers are punished with death. If More's narrator had looked past the happy smiling faces of Utopia, what fear and violence might he have seen?
Yet utopia—a word that has come to represent a hope that the future could surpass the present—persists. "As long as necessity is socially dreamed," Guy Debord says in his 1973 film The Society of the Spectacle, "dreaming will remain a social necessity." Debord meant that in conditions of inequality and injustice, people will always imagine a better place. What constitutes "better" is, however, a matter of much dispute. We dream our fears as well as hopes, reflecting all the agonies and contradictions of the waking world; in dreams, demons rise from our darkest places. This is the dangerous element in utopian aspiration, the monster behind the smiling face. Utopias can embody the highest hopes of humankind and frameworks for continuous evolution, but they can also reflect our worst fears and sickest appetites—not to mention a mania for power and control that is latent in every person. "What a strange scene you describe and what strange prisoners," says Glaucon, Socrates' disciple, in Plato's Republic, the template for the stupid utopia. "They are just like us," answers the master.
Plato's Republic
Told in the form of a dialogue between Socrates and Glaucon, one of the earliest and most primitive utopias is all about limits, discipline, and hierarchy. The number of inhabitants of the Republic, pronounced Socrates, should be limited to 5,040 in order to maximize conformity and control. In this most "just" of cities, women and children are property, for how could they be otherwise? "For men born and educated like our citizens," Socrates says, "the only way of arriving at the right conclusion about the possession and use of women and children is to follow the path on which we originally started, when we said that the men were to be the guardians and watchdogs of the herd." Quite naturally, the State would be ruled over by men most like Socrates himself, philosopher-kings, "the best of our citizens." This guardian class would live communally and apart from the herd: "having wives and children in common, they must live in common houses and meet at common meals."
Exactly how stupid is Plato's Republic, and who am I to call one of history's greatest philosophers "stupid"? Is Plato's time simply too different from our own for us to pass judgment? I don't think so, for The Republic lives on in the rhetoric of contemporary political movements of both right and left—every elitist and technocratic fantasy of our time has grown from the seed of The Republic. Plato would not have understood the term "dehumanization" as we understand it—he'd never, of course, seen a factory floor or a gas chamber—but when his ideas have been enacted in places like the Soviet Union, Mussolini's Italy, or modern state-capitalist China, they have proven brutally dehumanizing, his apparat of "guardians" thoroughly corrupted by power.
Though protected from criticism by the moldy gauze of antiquity, Plato and his teacher/mouthpiece Socrates are not so different from us; they had their fears and prejudices just as we have ours. We still live in the fabled cave Socrates describes in The Republic, watching the shadows on the walls and thinking them reality. He thought that only the philosopher could throw off his shackles in the sensible world and leave the cave for the heaven of Ideas; the philosopher alone could wield this pure knowledge in ruling the Republic. It didn't occur to Plato/Socrates that the World of Forms beyond the cave might only be another shadow or even a hallucination. Plato could not escape the trap into which any utopian can fall: he didn't believe enough in his own fallibility.
The City on the Hill
More than just a source of gold, in the Christian imagination the New World represented the triumph of the natural ideal over a decadent European culture. Naked and innocent "Indians," living in communitarian grace, appeared immediately in the writings of Conquistadors and would serve to bolster a utopian image in Europe of the New World. (This chimera persists, mutated, in New Age idealizations of indigenous culture.) When the natives wouldn't conform to that image, in due course it became necessary to destroy their villages so that their souls might be saved. "They are not fit to command or lead," said one exemplary Catholic of those he deemed his racial inferiors, "but to be commanded and led."
Up North, their Calvinist counterparts arrived and set about creating a Protestant utopia of everlasting hierarchy. "We must consider that we shall be a city upon a hill," preached Massachusetts governor John Winthrop. "God Almighty in His most holy and wise providence, hath soe disposed of the condition of mankind, as in all times some must be rich, some poore, some high and eminent in power and dignitie; others mean and in submission." The New World was to be More's Utopia, at last made real. In Salem sixty years later, witches would burn. A Native American apocalypse was not far behind, followed by Filipinos, Salvadorans, Vietnamese, Iraqis, and anyone else who ran afoul of this utopian marriage of theocratic and imperial aspiration.
>>To read the rest of the article at STRANGE HORIZONS click here
Sunday, July 03, 2011 at 04:42 PM in Life, Philosophy | Permalink
The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents... Some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the light into the peace and safety of a new Dark Age.
-- H.P. Lovecraft, "The Call of Cthulhu," 1926
Monday, May 30, 2011 at 06:32 PM in Life, Philosophy, Technology | Permalink
Perhaps we could endeavor to teach our future the following:
- How to focus intently on a problem until it's solved.
- The benefit of postponing short-term satisfaction in exchange for long-term success.
- How to read critically.
- The power of being able to lead groups of peers without receiving clear delegated authority.
- An understanding of the extraordinary power of the scientific method, in just about any situation or endeavor.
- How to persuasively present ideas in multiple forms, especially in writing and before a group.
- Project management. Self-management and the management of ideas, projects and people.
- Personal finance. Understanding the truth about money and debt and leverage.
- An insatiable desire (and the ability) to learn more. Forever.
- Most of all, the self-reliance that comes from understanding that relentless hard work can be applied to solve problems worth solving.
>> To read the rest of the article at SETH'S BLOG click here
Saturday, May 07, 2011 at 12:55 PM in Life, Philosophy, Work | Permalink
The Misconception: You procrastinate because you are lazy and can’t manage your time well.
The Truth: Procrastination is fueled by weakness in the face of impulse and a failure to think about thinking.
Netflix reveals something about your own behavior you should have noticed by now, something which keeps getting between you and the things you want to accomplish.
If you have Netflix, especially if you stream it to your TV, you tend to gradually accumulate a cache of hundreds of films you think you’ll watch one day. This is a bigger deal than you think.
Take a look at your queue. Why are there so damn many documentaries and dramatic epics collecting virtual dust in there? By now you could draw the cover art to “Dead Man Walking” from memory. Why do you keep passing over it?
Psychologists actually know the answer to this question, to why you keep adding movies you will never watch to your growing collection of future rentals, and it's the same reason you believe you will eventually do what's best for yourself in all the other parts of your life, but rarely do.
A study conducted in 1999 by Read, Loewenstein and Kalyanaraman had people pick three movies out of a selection of 24. Some were lowbrow like “Sleepless in Seattle” or “Mrs. Doubtfire.” Some were highbrow like “Schindler’s List” or “The Piano.” In other words, it was a choice between movies which promised to be fun and forgettable or would be memorable but require more effort to absorb.
After picking, the subjects had to watch one movie right away. They then had to watch another in two days and a third two days after that.
Most people picked “Schindler’s List” as one of their three. They knew it was a great movie because all their friends said it was. All the reviews were glowing, and it earned dozens of the highest awards. Most didn’t, however, choose to watch it on the first day.
Instead, people tended to pick lowbrow movies on the first day. Only 44 percent went for the heavier stuff first. The majority tended to pick comedies like “The Mask” or action flicks like “Speed” when they knew they had to watch it forthwith.
Planning ahead, people picked highbrow movies 63 percent of the time for their second movie and 71 percent of the time for their third.
When they ran the experiment again but told subjects they had to watch all three selections back-to-back, “Schindler’s List” was 13 times less likely to be chosen at all.
The researchers had a hunch people would go for the junk food first, but plan healthy meals in the future.
Many studies over the years have shown you tend to have time-inconsistent preferences. When asked if you would rather have fruit or cake one week from now, you will usually say fruit. A week later when the slice of German chocolate cake and the apple are offered, you are statistically more likely to go for the cake.
This is why your Netflix queue is full of great films you keep passing over for “Family Guy.” With Netflix, the choice of what to watch right now and what to watch later is like candy bars versus carrot sticks. When you are planning ahead, your better angels point to the nourishing choices, but in the moment you go for what tastes good.
As behavioral economist Katherine Milkman has pointed out, this is why grocery stores put candy right next to the checkout.
This is sometimes called present bias – being unable to grasp that what you want will change over time, and that what you want now isn’t the same thing you will want later. Present bias explains why you buy lettuce and bananas only to throw them out later when you forget to eat them. This is why when you are a kid you wonder why adults don’t own more toys.
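A quick aside for the numbers-minded: behavioral economists usually formalize this kind of present bias with what's called quasi-hyperbolic, or "beta-delta," discounting. The article doesn't get into the math, so treat the sketch below as my own illustration--the beta, delta, and reward values are made-up assumptions, not figures from any of the studies mentioned. The gist: a reward you can have right now gets full weight, while anything delayed, even by a week, is knocked down by an extra factor beta < 1.

```python
# Quasi-hyperbolic ("beta-delta") discounting: a standard textbook model of
# present bias. All numbers here are illustrative assumptions, not data from
# the studies discussed above.

def value(reward, delay_weeks, beta=0.6, delta=0.99):
    """Perceived value *today* of `reward` arriving `delay_weeks` from now.
    beta < 1 penalizes every non-immediate reward (the present-bias kick);
    delta is the ordinary week-by-week discount factor."""
    if delay_weeks == 0:
        return reward
    return beta * (delta ** delay_weeks) * reward

# Planning far ahead, the larger-later reward looks better...
print(value(100, 52) < value(110, 53))   # True: wait the extra week

# ...but once "sooner" becomes "right now," the preference flips.
print(value(100, 0) > value(110, 1))     # True: grab the smaller reward now
```

Because beta hits every delayed reward equally, it cancels out when you compare two future options (fruit next week vs. cake next week)--but it savages "later" whenever "now" is on the table, which is exactly the fruit-and-cake flip the studies keep finding.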
>> To read the rest of the article at YOU ARE NOT SO SMART click here
Thursday, October 28, 2010 at 06:32 AM in Business, Life, Philosophy, Screenwriting, Work, Writing | Permalink
Earlier this year, just 2,300 of 32,000 applicants to Stanford University were accepted — a rate of 7.2%, the lowest in the school's history.
[Photo caption: Sumo stable in Tokyo, Japan: you don’t need to be a superstar to use the Superstar Effect.]

The students who survived this screening are phenomenally accomplished. A quarter had SAT math scores higher than 780, and over 90% had high school G.P.A.'s above 3.75, which works out, more or less, to straight A's over four years of schooling. And these weren't easy A's: the average applicant to a top-tier university takes an overwhelming volume of demanding AP or IB-level courses. (Not surprising, considering that the Stanford admissions department ranks the "rigor of secondary school record" as "very important" in its decision.)
If you eliminate recruited athletes and the children of the rich and famous from this pool — categories that receive special consideration — these numbers become even starker. In short, for the average, middle-class American high school senior, applying to Stanford is like playing the lottery.
Which is why Michael Silverman proves baffling.
When Michael, a student from Paradise Valley, Arizona, applied to Stanford, his G.P.A. put him in the bottom 10% of accepted students. His SAT scores fell similarly short. "Standardized testing isn't my strong point," he told me. Perhaps more surprising, Michael avoided the crushing course load that diminishes the will of so many college hopefuls, instead taking only a single AP course during the dreaded junior year. He kept his extracurricular schedule equally clean — joining no clubs or sports and dedicating his attention to no more than one outside project at any given time.
Michael's rejection of the no pain, no gain ethos surrounding American college admissions is perhaps best summarized by his habit of ending each school day with a 1–2 hour hike to the summit of nearby Camelback Mountain. While his peers worked slavishly at their killer schedules, Michael took in the view, using his ritual as a time to "chill out and relax."
Despite this heretical behavior, Michael was still accepted at Stanford. To understand why, I will turn your attention to a little-known economics theory that changes the way we think about impressiveness. To get there, however, we'll start at an unlikely location: the competitive world of professional opera singers.
The Opera Singer and the Valedictorian
Juan Diego Florez cemented his reputation as a top operatic tenor during a 2008 performance of Gaetano Donizetti's La Fille du Régiment. Among professional singers, Donizetti's masterpiece is known as "the Mount Everest of opera," a reputation due, almost entirely, to a devilishly tricky aria, "Ah! Mes amis, quel jour de fête," that arrives early in the first act. The aria requires the tenor to hit nine high C's in a row — a supremely difficult feat. To avoid embarrassment, most performers resort to the far easier natural C.
Not Florez.
In his 2008 performance of Donizetti, at the Metropolitan Opera House, Florez hit all nine notes. The acclaim was so overwhelming that he was summoned back to the stage for an encore, overturning the Met's long-standing ban on the practice.
As a top opera singer, Florez, we can assume, does well for himself financially (likely on the order of 5-digit paydays per performance), but not lavishly well. Put another way: he's well-off but not wealthy.
Then there are the superstars.
In 1972, a young tenor by the name of Luciano Pavarotti also made a name for himself performing Donizetti at the Met. Like Florez, he too hit the high C's. But there was something extra in Pavarotti's voice. The audience at the Met in 1972 did more than demand an encore from Pavarotti: they weren't content until he had returned to the stage seventeen times! In writing about Florez's 2008 performance, the New York Times noted: "If truth be told, it's not as hard as it sounds for a tenor with a light lyric voice like Mr. Florez to toss off those high C's…[I]n the early 1970's, when Luciano Pavarotti…let those high Cs ring out, that was truly astonishing."
In other words, both Florez and Pavarotti are exceptional tenors, but Pavarotti was slightly better — the best among an elite class. The impact of this small difference, however, was huge. Whereas we estimated that Florez was well-off but not wealthy, when Pavarotti died in 2007, sources estimated his estate to be worth $275 to $475 million.
In a 1981 paper published in the American Economic Review, the economist Sherwin Rosen worked through the mathematics that explains why superstars, like Pavarotti, reap so many more rewards than peers who are only slightly less talented. He called the phenomenon “the Superstar Effect.”
Though the details of Rosen's formulas are complex, the intuition is simple: Imagine a million opera fans who each have $10 to spend on an opera album. They're trying to decide whether to buy an album by Florez or Pavarotti. Rosen's theory predicts that the bulk of the consumers will purchase the Pavarotti album, thinking, roughly: "although both singers are great, Pavarotti is the best, and if I can only get one album I might as well get the best one available." The result is that the vast majority of the $10 million goes to Pavarotti, even though his talent advantage over Florez is small.
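To make that intuition concrete, here's a toy simulation--my own sketch, to be clear, not Rosen's actual 1981 formulas, and the talent numbers, noise level, and album price are all made-up assumptions. Each fan buys exactly one album, choosing whichever singer seems best through a slightly noisy read of their talent:

```python
import random

# Toy illustration of the Superstar Effect (not Rosen's actual 1981 model):
# a million fans with $10 each, one album purchase apiece. The two singers'
# talent differs by just 1%; each fan perceives talent with a little noise.
random.seed(0)

TALENT = {"Pavarotti": 100.0, "Florez": 99.0}   # illustrative numbers
N_FANS, ALBUM_PRICE = 1_000_000, 10

revenue = {name: 0 for name in TALENT}
for _ in range(N_FANS):
    # Each fan buys the one album by whichever singer *seems* best to them.
    perceived = {name: t + random.gauss(0, 0.5) for name, t in TALENT.items()}
    revenue[max(perceived, key=perceived.get)] += ALBUM_PRICE

for name, dollars in revenue.items():
    share = 100 * dollars / (N_FANS * ALBUM_PRICE)
    print(f"{name}: ${dollars:,} ({share:.0f}% of the market)")
```

Run it and Pavarotti walks away with roughly 90% of the $10 million despite a mere 1% talent edge (with no perception noise at all, the split is 100/0). That's the nonlinearity Rosen identified: rewards track rank, not raw talent.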
Once identified, The Superstar Effect turned up in a variety of unexpected settings, from the sales of books to CEO salaries. It was found to apply even in settings that have nothing to do with financial transactions. In a particularly compelling example, a researcher named Paul Attewell, publishing in the journal Sociology of Education in 2001, studied the Superstar Effect for high school valedictorians.
Attewell imagined two students, both with 700s on their various SAT tests. The first student was the valedictorian and the second student was ranked number five in the class. Rationally speaking, these two students are nearly identical — the difference in G.P.A. between the number one and number five rank is vanishingly small. But using statistics from Dartmouth College, Attewell showed that the valedictorian has a 75% chance of acceptance at this Ivy League institution while the nearly identical fifth-ranked student has only a 25% chance.
In other words, in many fields, it pays disproportionately well to be not just very good, but the best.
Hacking the Superstar Effect
Taking a step back, we likely agree that it's an interesting finding that being the best has a hidden advantage. If reaping this advantage, however, requires becoming class valedictorian or honing a brilliant singing voice — both staggeringly difficult feats — it doesn't seem all that applicable.
This is where Michael Silverman reenters the picture.
The details of his story reveal a crucial addendum that makes the power of the Superstar Effect available to most people. I call this addendum The Superstar Corollary, and it's here I turn your attention next.
I discovered The Superstar Corollary in an unlikely setting: the extracurricular lives of high school students. I was researching a book on students, like Michael, who get accepted to outstanding colleges while still living low-stress and interesting lives. During this research, I kept noticing the same trait in these teen-aged lifehackers: they had accomplishments that triggered The Superstar Effect but which, on closer examination, turned out not to require a rare natural talent or years and years of grinding work.
>> To read the rest of the article at TIM FERRISS click here
The French word frisson describes something English has no better word for: a brief intense reaction, usually a feeling of excitement, recognition, or terror. It's often accompanied by a physical shudder, but not so much when you're web surfing.
You know how it happens. You're clicking here or clicking there, and suddenly you have the OMG moment. In recent days, for example, I felt frissons when learning that Gary Coleman had died, that most of the spilled oil was underwater, that Joe McGinniss had moved in next to the Palins, that a group of priests' mistresses had started their own Facebook group, and that Bill Nye the Science Guy says "to prevent Computer Vision Syndrome, every 20 minutes, spend 20 seconds looking 20 feet away."
Oh, there were many more. A frisson can be quite a delight. The problem is, I seem to be spending way too much time these days in search of them. In an ideal world, I would sit down at my computer, do my work, and that would be that. In this world, I get entangled in surfing and an hour disappears.
Twitter is an enabler for this behavior. It provides a quiet, subtle pressure to tweet frissons, and be tweeted in return. A good tweet can involve a funny comment, a snarky one, or one so poetic I read it and marvel. It can contain breaking news. It can be a small autobiographical revelation. I enjoy this. Deprived of speech, I chatter all day on Twitter, and have virtual relationships with the carefully chosen Tweeters I follow. Some are great writers. Some are deep thinkers. Some keep me updated on American Idol. Some persist in updating the scores of sporting events. I hate that, except in a situation like the Blackhawks' winning season. I care about the Blackhawks, but not enough to watch. All I require is the frisson.
This is not in praise of Twitter. It has to do with the possibility that my brain--and yours too, since you are here--has been rewired by the internet. There's an article by Nicholas Carr in the new issue of Wired magazine about a UCLA professor who used an MRI scan to observe the brain activity of six volunteers. Three were web veterans, three were not. He found that veteran Web users had developed "distinctive neural pathways." He asked his newbies to surf the web for six days, and then he repeated the experiment: "The new scans revealed that their brain activity had changed dramatically; it now resembled that of the veteran surfers." The article suggests this possibility: "When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. Even as the Internet grants us easy access to vast amounts of information, it is turning us into shallower thinkers, literally changing the structure of our brain."
In other words, instead of seeking substance, we're distractedly scurrying hither and yon, seeking frisson.
>> To read the rest of the article at ROGER EBERT click here
Wednesday, June 02, 2010 at 11:10 AM in Art, Internet, Life, Philosophy | Permalink