Documents Show Media Plotting To Kill Stories About Rev. Jeremiah Wright
By Jonathan Strong
http://dailycaller.com/2010/07/20/documents-show-media-plotting-to-kill-stories-about-rev-jeremiah-wright/
It was the moment of greatest peril for then-Sen. Barack Obama’s political career. In the heat of the presidential campaign, videos surfaced of Obama’s pastor, the Rev. Jeremiah Wright, angrily denouncing whites, the U.S. government and America itself. Obama had once bragged of his closeness to Wright. Now the black nationalist preacher’s rhetoric was threatening to torpedo Obama’s campaign.
The crisis reached a howling pitch in mid-April 2008, at an ABC News debate moderated by Charlie Gibson and George Stephanopoulos. Gibson asked Obama why it had taken him so long – nearly a year since Wright’s remarks became public – to dissociate himself from them. Stephanopoulos asked, “Do you think Reverend Wright loves America as much as you do?”
Watching this all at home were members of Journolist, a listserv comprised of several hundred liberal journalists, as well as like-minded professors and activists. The tough questioning from the ABC anchors left many of them outraged. “George [Stephanopoulos],” fumed Richard Kim of the Nation, is “being a disgusting little rat snake.”
Others went further. According to records obtained by The Daily Caller, at several points during the 2008 presidential campaign a group of liberal journalists took radical steps to protect their favored candidate. Employees of news organizations including Time, Politico, the Huffington Post, the Baltimore Sun, the Guardian, Salon and the New Republic participated in outpourings of anger over how Obama had been treated in the media, and in some cases plotted to fix the damage.
In one instance, Spencer Ackerman of the Washington Independent urged his colleagues to deflect attention from Obama’s relationship with Wright by changing the subject. Pick one of Obama’s conservative critics, Ackerman wrote, “Fred Barnes, Karl Rove, who cares — and call them racists.”
Michael Tomasky, a writer for the Guardian, also tried to rally his fellow members of Journolist: “Listen folks–in my opinion, we all have to do what we can to kill ABC and this idiocy in whatever venues we have. This isn’t about defending Obama. This is about how the [mainstream media] kills any chance of discourse that actually serves the people.”
“Richard Kim got this right above: ‘a horrible glimpse of general election press strategy.’ He’s dead on,” Tomasky continued. “We need to throw chairs now, try as hard as we can to get the call next time. Otherwise the questions in October will be exactly like this. This is just a disease.”
(In an interview Monday, Tomasky defended his position, calling the ABC debate an example of shoddy journalism.)
Thomas Schaller, a columnist for the Baltimore Sun as well as a political science professor, upped the ante from there. In a post with the subject header, “why don’t we use the power of this list to do something about the debate?” Schaller proposed coordinating a “smart statement expressing disgust” at the questions Gibson and Stephanopoulos had posed to Obama.
“It would create quite a stir, I bet, and be a warning against future behavior of the sort,” Schaller wrote.
Tomasky approved. “YES. A thousand times yes,” he exclaimed.
The members began collaborating on their open letter. Jonathan Stein of Mother Jones rejected an early draft, saying, “I’d say too short. In my opinion, it doesn’t go far enough in highlighting the inanity of some of [Gibson's] and [Stephanopoulos’s] questions. And it doesn’t point out their factual inaccuracies …Our friends at Media Matters probably have tons of experience with this sort of thing, if we want their input.”
Jared Bernstein, who would go on to be Vice President Joe Biden’s top economist when Obama took office, helped, too. The letter should be “Short, punchy and solely focused on vapidity of gotcha,” Bernstein wrote.
In the midst of this collaborative enterprise, Holly Yeager, now of the Columbia Journalism Review, dropped into the conversation to say “be sure to read” a column in that day’s Washington Post that attacked the debate.
Columnist Joe Conason weighed in with suggestions. So did Slate contributor David Greenberg, and David Roberts of the website Grist. Todd Gitlin, a professor of journalism at Columbia University, helped too.
Journolist members signed the statement and released it April 18, calling the debate “a revolting descent into tabloid journalism and a gross disservice to Americans concerned about the great issues facing the nation and the world.”
The letter caused a brief splash and won the attention of the New York Times. But only a week later, Obama – and the journalists who were helping him – were on the defensive once again.
Jeremiah Wright was back in the news after making a series of media appearances. At the National Press Club, Wright claimed Obama had only repudiated his beliefs for “political reasons.” Wright also reiterated his charge that the U.S. federal government had created AIDS as a means of committing genocide against African Americans.
It was another crisis, and members of Journolist again rose to help Obama.
Chris Hayes of the Nation posted on April 29, 2008, urging his colleagues to ignore Wright. Hayes directed his message to “particularly those in the ostensible mainstream media” who were members of the list.
The Wright controversy, Hayes argued, was not about Wright at all. Instead, “It has everything to do with the attempts of the right to maintain control of the country.”
Hayes castigated his fellow liberals for criticizing Wright. “All this hand wringing about just how awful and odious Rev. Wright remarks are just keeps the hustle going.”
“Our country disappears people. It tortures people. It has the blood of as many as one million Iraqi civilians — men, women, children, the infirmed — on its hands. You’ll forgive me if I just can’t quite dredge up the requisite amount of outrage over Barack Obama’s pastor,” Hayes wrote.
Hayes urged his colleagues – especially the straight news reporters who were charged with covering the campaign in a neutral way – to bury the Wright scandal. “I’m not saying we should all rush en masse to defend Wright. If you don’t think he’s worthy of defense, don’t defend him! What I’m saying is that there is no earthly reason to use our various platforms to discuss what about Wright we find objectionable,” Hayes said.
(Reached by phone Monday, Hayes argued his words then fell on deaf ears. “I can say ‘hey I don’t think you guys should cover this,’ but no one listened to me.”)
Katha Pollitt – Hayes’s colleague at the Nation – didn’t disagree on principle, though she did sound weary of the propaganda. “I hear you. but I am really tired of defending the indefensible. The people who attacked Clinton on Monica were prissy and ridiculous, but let me tell you it was no fun, as a feminist and a woman, waving aside as politically irrelevant and part of the vast rightwing conspiracy Paula, Monica, Kathleen, Juanita,” Pollitt said.
“Part of me doesn’t like this shit either,” agreed Spencer Ackerman, then of the Washington Independent. “But what I like less is being governed by racists and warmongers and criminals.”
Ackerman went on:
"I do not endorse a Popular Front, nor do I think you need to. It’s not necessary to jump to Wright-qua-Wright’s defense. What is necessary is to raise the cost on the right of going after the left. In other words, find a rightwinger’s [sic] and smash it through a plate-glass window. Take a snapshot of the bleeding mess and send it out in a Christmas card to let the right know that it needs to live in a state of constant fear. Obviously I mean this rhetorically."
"And I think this threads the needle. If the right forces us all to either defend Wright or tear him down, no matter what we choose, we lose the game they’ve put upon us. Instead, take one of them — Fred Barnes, Karl Rove, who cares — and call them racists. Ask: why do they have such a deep-seated problem with a black politician who unites the country? What lurks behind those problems? This makes *them* sputter with rage, which in turn leads to overreaction and self-destruction."
Ackerman did allow there were some Republicans who weren’t racists. “We’ll know who doesn’t deserve this treatment — Ross Douthat, for instance — but the others need to get it.” He also said he had begun to implement his plan. “I previewed it a bit on my blog last week after Commentary wildly distorted a comment Joe Cirincione made to make him appear like (what else) an antisemite. So I said: why is it that so many on the right have such a problem with the first viable prospective African-American president?”
Several members of the list disagreed with Ackerman – but only on strategic grounds.
“Spencer, you’re wrong,” wrote Mark Schmitt, now an editor at the American Prospect. “Calling Fred Barnes a racist doesn’t further the argument, and not just because Juan Williams is his new black friend, but because that makes it all about character. The goal is to get to the point where you can contrast some _thing_ — Obama’s substantive agenda — with this crap.”
(In an interview Monday, Schmitt declined to say whether he thought Ackerman’s plan was wrong. “That is not a question I’m going to answer,” he said.)
Kevin Drum, then of Washington Monthly, also disagreed with Ackerman’s strategy. “I think it’s worth keeping in mind that Obama is trying (or says he’s trying) to run a campaign that avoids precisely the kind of thing Spencer is talking about, and turning this into a gutter brawl would probably hurt the Obama brand pretty strongly. After all, why vote for him if it turns out he’s not going change the way politics works?”
But it was Ackerman who had the last word. “Kevin, I’m not saying OBAMA should do this. I’m saying WE should do this.”
Liberal journalists suggest government shut down Fox News
By Jonathan Strong
http://dailycaller.com/2010/07/21/liberal-journalists-suggest-government-shut-down-fox-news/
If you were in the presence of a man having a heart attack, how would you respond? As he clutched his chest in desperation and pain, would you call 911? Would you try to save him from dying? Of course you would.
But if that man was Rush Limbaugh, and you were Sarah Spitz, a producer for National Public Radio, that isn’t what you’d do at all.
In a post to the list-serv Journolist, an online meeting place for liberal journalists, Spitz wrote that she would “Laugh loudly like a maniac and watch his eyes bug out” as Limbaugh writhed in torment.
In boasting that she would gleefully watch a man die in front of her eyes, Spitz seemed to shock even herself. “I never knew I had this much hate in me,” she wrote. “But he deserves it.”
Spitz’s hatred for Limbaugh seems intemperate, even imbalanced. On Journolist, where conservatives are regarded not as opponents but as enemies, it barely raised an eyebrow.
In the summer of 2009, agitated citizens from across the country flocked to town hall meetings to berate lawmakers who had declared support for President Obama’s health care bill. For most people, the protests seemed like an exercise in participatory democracy, rowdy as some of them became.
On Journolist, the question was whether the protestors were garden-variety fascists or actual Nazis.
“You know, at the risk of violating Godwin’s law, is anyone starting to see parallels here between the teabaggers and their tactics and the rise of the Brownshirts?” asked Bloomberg’s Ryan Donmoyer. “Esp. Now that it’s getting violent? Reminds me of the Beer Hall fracases of the 1920s.”
Richard Yeselson, a researcher for an organized labor group who also writes for liberal magazines, agreed. “They want a deficit driven militarist/heterosexist/herrenvolk state,” Yeselson wrote. “This is core of the Bush/Cheney base transmorgrified into an even more explicitly racialized/anti-cosmopolitan constituency. Why? Um, because the president is a black guy named Barack Hussein Obama. But it’s all the same old nuts in the same old bins with some new labels: the gun nuts, the anti tax nuts, the religious nuts, the homophobes, the anti-feminists, the anti-abortion lunatics, the racist/confederate crackpots, the anti-immigration whackos (who feel Bush betrayed them) the pathological government haters (which subsumes some of the other categories, like the gun nuts and the anti-tax nuts).”
“I’m not saying these guys are capital F-fascists,” added blogger Lindsay Beyerstein, “but they don’t want limited government. Their desired end looks more like a corporate state than a rugged individualist paradise. The rank and file wants a state that will reach into the intimate of citizens when it comes to sex, reproductive freedom, censorship, and rampant incarceration in the name of law and order.”
On Journolist, there was rarely such a thing as an honorable political disagreement between the left and right, though there were many disagreements on the left. In the view of many who’ve posted to the list-serv, conservatives aren’t simply wrong, they are evil. And while journalists are trained never to presume motive, Journolist members tend to assume that the other side is acting out of the darkest and most dishonorable motives.
When the writer Victor Davis Hanson wrote an article about immigration for National Review, for example, blogger Ed Kilgore didn’t even bother to grapple with Hanson’s arguments. Instead Kilgore dismissed Hanson’s piece out of hand as “the kind of Old White Guy cultural reaction that is at the heart of the Tea Party Movement. It’s very close in spirit to the classic 1970s racist tome, The Camp of the Saints, where White Guys struggle to make up their minds whether to go out and murder brown people or just give up.”
The very existence of Fox News, meanwhile, sends Journolisters into paroxysms of rage. When Howell Raines charged that the network had a conservative bias, the members of Journolist discussed whether the federal government should shut the channel down.
“I am genuinely scared” of Fox, wrote Guardian columnist Daniel Davies, because it “shows you that a genuinely shameless and unethical media organisation *cannot* be controlled by any form of peer pressure or self-regulation, and nor can it be successfully cold-shouldered or ostracised. In order to have even a semblance of control, you need a tough legal framework.” Davies, a Brit, frequently argued the United States needed stricter libel laws.
“I agree,” said Michael Scherer of Time Magazine. Roger “Ailes understands that his job is to build a tribal identity, not a news organization. You can’t hurt Fox by saying it gets it wrong, if Ailes just uses the criticism to deepen the tribal identity.”
Jonathan Zasloff, a law professor at UCLA, suggested that the federal government simply yank Fox off the air. “I hate to open this can of worms,” he wrote, “but is there any reason why the FCC couldn’t simply pull their broadcasting permit once it expires?”
And so a debate ensued. Time’s Scherer, who had seemed to express support for increased regulation of Fox, suddenly appeared to have qualms: “Do you really want the political parties/white house picking which media operations are news operations and which are a less respectable hybrid of news and political advocacy?”
But Zasloff stuck to his position. “I think that they are doing that anyway; they leak to whom they want to for political purposes,” he wrote. “If this means that some White House reporters don’t get a press pass for the press secretary’s daily briefing and that this means that they actually have to, you know, do some reporting and analysis instead of repeating press releases, then I’ll take that risk.”
Scherer seemed alarmed. “So we would have press briefings in which only media organizations that are deemed by the briefer to be acceptable are invited to attend?”
John Judis, a senior editor at the New Republic, came down on Zasloff’s side, the side of censorship. “Pre-Fox,” he wrote, “I’d say Scherer’s questions made sense as a question of principle. Now it is only tactical.”
The Creativity Crisis
For the first time, research shows that American creativity is declining. What went wrong—and how we can fix it.
by Po Bronson and Ashley Merryman
http://www.newsweek.com/2010/07/10/the-creativity-crisis.html
Back in 1958, Ted Schwarzrock was an 8-year-old third grader when he became one of the “Torrance kids,” a group of nearly 400 Minneapolis children who completed a series of creativity tasks newly designed by professor E. Paul Torrance. Schwarzrock still vividly remembers the moment when a psychologist handed him a fire truck and asked, “How could you improve this toy to make it better and more fun to play with?” He recalls the psychologist being excited by his answers. In fact, the psychologist’s session notes indicate Schwarzrock rattled off 25 improvements, such as adding a removable ladder and springs to the wheels. That wasn’t the only time he impressed the scholars, who judged Schwarzrock to have “unusual visual perspective” and “an ability to synthesize diverse elements into meaningful products.”
The accepted definition of creativity is production of something original and useful, and that’s what’s reflected in the tests. There is never one right answer. To be creative requires divergent thinking (generating many unique ideas) and then convergent thinking (combining those ideas into the best result).
In the 50 years since Schwarzrock and the others took their tests, scholars—first led by Torrance, now his colleague, Garnet Millar—have been tracking the children, recording every patent earned, every business founded, every research paper published, and every grant awarded. They tallied the books, dances, radio shows, art exhibitions, software programs, advertising campaigns, hardware innovations, music compositions, public policies (written or implemented), leadership positions, invited lectures, and buildings designed.
Nobody would argue that Torrance’s tasks, which have become the gold standard in creativity assessment, measure creativity perfectly. What’s shocking is how incredibly well Torrance’s creativity index predicted those kids’ creative accomplishments as adults. Those who came up with more good ideas on Torrance’s tasks grew up to be entrepreneurs, inventors, college presidents, authors, doctors, diplomats, and software developers. Jonathan Plucker of Indiana University recently reanalyzed Torrance’s data. The correlation to lifetime creative accomplishment was more than three times stronger for childhood creativity than childhood IQ.
Like intelligence tests, Torrance’s test—a 90-minute series of discrete tasks, administered by a psychologist—has been taken by millions worldwide in 50 languages. Yet there is one crucial difference between IQ and CQ scores. With intelligence, there is a phenomenon called the Flynn effect—each generation, scores go up about 10 points. Enriched environments are making kids smarter. With creativity, a reverse trend has just been identified and is being reported for the first time here: American creativity scores are falling.
Kyung Hee Kim at the College of William & Mary discovered this in May, after analyzing almost 300,000 Torrance scores of children and adults. Kim found creativity scores had been steadily rising, just like IQ scores, until 1990. Since then, creativity scores have consistently inched downward. “It’s very clear, and the decrease is very significant,” Kim says. It is the scores of younger children in America—from kindergarten through sixth grade—for whom the decline is “most serious.”
The potential consequences are sweeping. The necessity of human ingenuity is undisputed. A recent IBM poll of 1,500 CEOs identified creativity as the No. 1 “leadership competency” of the future. Yet it’s not just about sustaining our nation’s economic growth. All around us are matters of national and international importance that are crying out for creative solutions, from saving the Gulf of Mexico to bringing peace to Afghanistan to delivering health care. Such solutions emerge from a healthy marketplace of ideas, sustained by a populace constantly contributing original ideas and receptive to the ideas of others.
It’s too early to determine conclusively why U.S. creativity scores are declining. One likely culprit is the number of hours kids now spend in front of the TV and playing videogames rather than engaging in creative activities. Another is the lack of creativity development in our schools. In effect, it’s left to the luck of the draw who becomes creative: there’s no concerted effort to nurture the creativity of all children.
Around the world, though, other countries are making creativity development a national priority. In 2008 British secondary-school curricula—from science to foreign language—were revamped to emphasize idea generation, and pilot programs have begun using Torrance’s test to assess their progress. The European Union designated 2009 as the European Year of Creativity and Innovation, holding conferences on the neuroscience of creativity, financing teacher training, and instituting problem-based learning programs—curricula driven by real-world inquiry—for both children and adults. In China there has been widespread education reform to extinguish the drill-and-kill teaching style. Instead, Chinese schools are adopting a problem-based learning approach.
Plucker recently toured a number of such schools in Shanghai and Beijing. He was amazed by a boy who, for a class science project, rigged a tracking device for his moped with parts from a cell phone. When faculty of a major Chinese university asked Plucker to identify trends in American education, he described our focus on standardized curriculum, rote memorization, and nationalized testing. “After my answer was translated, they just started laughing out loud,” Plucker says. “They said, ‘You’re racing toward our old model. But we’re racing toward your model, as fast as we can.’ ”
Overwhelmed by curriculum standards, American teachers warn there’s no room in the day for a creativity class. Kids are fortunate if they get an art class once or twice a week. But to scientists, this is a non sequitur, born of what University of Georgia’s Mark Runco calls “art bias.” The age-old belief that the arts have a special claim to creativity is unfounded. When scholars gave creativity tasks to both engineering majors and music majors, their scores fell on an identical spectrum, with the same high averages and standard deviations. Inside their brains, the same thing was happening—ideas were being generated and evaluated on the fly.
Researchers say creativity should be taken out of the art room and put into homeroom. The argument that we can’t teach creativity because kids already have too much to learn is a false trade-off. Creativity isn’t about freedom from concrete facts. Rather, fact-finding and deep research are vital stages in the creative process. Scholars argue that current curriculum standards can still be met, if taught in a different way.
To understand exactly what should be done requires first understanding the new story emerging from neuroscience. The lore of pop psychology is that creativity occurs on the right side of the brain. But we now know that if you tried to be creative using only the right side of your brain, it’d be like living with ideas perpetually at the tip of your tongue, just beyond reach.
When you try to solve a problem, you begin by concentrating on obvious facts and familiar solutions, to see if the answer lies there. This is a mostly left-brain stage of attack. If the answer doesn’t come, the right and left hemispheres of the brain activate together. Neural networks on the right side scan remote memories that could be vaguely relevant. A wide range of distant information that is normally tuned out becomes available to the left hemisphere, which searches for unseen patterns, alternative meanings, and high-level abstractions.
Having glimpsed such a connection, the left brain must quickly lock in on it before it escapes. The attention system must radically reverse gears, going from defocused attention to extremely focused attention. In a flash, the brain pulls together these disparate shreds of thought and binds them into a new single idea that enters consciousness. This is the “aha!” moment of insight, often followed by a spark of pleasure as the brain recognizes the novelty of what it’s come up with.
Now the brain must evaluate the idea it just generated. Is it worth pursuing? Creativity requires constant shifting, blender pulses of both divergent thinking and convergent thinking, to combine new information with old and forgotten ideas. Highly creative people are very good at marshaling their brains into bilateral mode, and the more creative they are, the more they dual-activate.
Is this learnable? Well, think of it like basketball. Being tall helps if you want to be a pro basketball player, but the rest of us can still get quite good at the sport through practice. In the same way, there are certain innate features of the brain that make some people naturally prone to divergent thinking. But convergent thinking and focused attention are necessary, too, and those require different neural gifts. Crucially, rapidly shifting between these modes is a top-down function under your mental control. University of New Mexico neuroscientist Rex Jung has concluded that those who diligently practice creative activities learn to recruit their brains’ creative networks quicker and better. A lifetime of consistent habits gradually changes the neurological pattern.
A fine example of this emerged in January of this year, with release of a study by University of Western Ontario neuroscientist Daniel Ansari and Harvard’s Aaron Berkowitz, who studies music cognition. They put Dartmouth music majors and nonmusicians in an fMRI scanner, giving participants a one-handed fiber-optic keyboard to play melodies on. Sometimes melodies were rehearsed; other times they were creatively improvised. During improvisation, the highly trained music majors used their brains in a way the nonmusicians could not: they deactivated their right-temporoparietal junction. Normally, the r-TPJ reads incoming stimuli, sorting the stream for relevance. By turning that off, the musicians blocked out all distraction. They hit an extra gear of concentration, allowing them to work with the notes and create music spontaneously.
Charles Limb of Johns Hopkins has found a similar pattern with jazz musicians, and Austrian researchers observed it with professional dancers visualizing an improvised dance. Ansari and Berkowitz now believe the same is true for orators, comedians, and athletes improvising in games.
The good news is that creativity training that aligns with the new science works surprisingly well. The University of Oklahoma, the University of Georgia, and Taiwan’s National Chengchi University each independently conducted a large-scale analysis of such programs. All three teams of scholars concluded that creativity training can have a strong effect. “Creativity can be taught,” says James C. Kaufman, professor at California State University, San Bernardino.
What’s common about successful programs is they alternate maximum divergent thinking with bouts of intense convergent thinking, through several stages. Real improvement doesn’t happen in a weekend workshop. But when applied to the everyday process of work or school, brain function improves.
So what does this mean for America’s standards-obsessed schools? The key is in how kids work through the vast catalog of information. Consider the National Inventors Hall of Fame School, a new public middle school in Akron, Ohio. Mindful of Ohio’s curriculum requirements, the school’s teachers came up with a project for the fifth graders: figure out how to reduce the noise in the library. Its windows faced a public space and, even when closed, let through too much noise. The students had four weeks to design proposals.
Working in small teams, the fifth graders first engaged in what creativity theorist Donald Treffinger describes as fact-finding. How does sound travel through materials? What materials reduce noise the most? Then, problem-finding—anticipating all potential pitfalls so their designs are more likely to work. Next, idea-finding: generate as many ideas as possible. Drapes, plants, or large kites hung from the ceiling would all baffle sound. Or, instead of reducing the sound, maybe mask it by playing the sound of a gentle waterfall? A proposal for double-paned glass evolved into an idea to fill the space between panes with water. Next, solution-finding: which ideas were the most effective, cheapest, and aesthetically pleasing? Fiberglass absorbed sound the best but wouldn’t be safe. Would an aquarium with fish be easier than water-filled panes?
Then teams developed a plan of action. They built scale models and chose fabric samples. They realized they’d need to persuade a janitor to care for the plants and fish during vacation. Teams persuaded others to support them—sometimes so well, teams decided to combine projects. Finally, they presented designs to teachers, parents, and Jim West, inventor of the electret microphone.
Along the way, kids demonstrated the very definition of creativity: alternating between divergent and convergent thinking, they arrived at original and useful ideas. And they’d unwittingly mastered Ohio’s required fifth-grade curriculum—from understanding sound waves to per-unit cost calculations to the art of persuasive writing. “You never see our kids saying, ‘I’ll never use this so I don’t need to learn it,’ ” says school administrator Maryann Wolowiec. “Instead, kids ask, ‘Do we have to leave school now?’ ” Two weeks ago, when the school received its results on the state’s achievement test, principal Traci Buckner was moved to tears. The raw scores indicate that, in its first year, the school has already become one of the top three schools in Akron, despite having open enrollment by lottery and 42 percent of its students living in poverty.
With as much as three fourths of each day spent in project-based learning, principal Buckner and her team actually work through required curricula, carefully figuring out how kids can learn it through the steps of Treffinger’s Creative Problem-Solving method and other creativity pedagogies. “The creative problem-solving program has the highest success in increasing children’s creativity,” observed William & Mary’s Kim.
The home-game version of this means no longer encouraging kids to spring straight ahead to the right answer. When UGA’s Runco was driving through California one day with his family, his son asked why Sacramento was the state’s capital—why not San Francisco or Los Angeles? Runco turned the question back on him, encouraging him to come up with as many explanations as he could think of.
Preschool children, on average, ask their parents about 100 questions a day. Why, why, why—sometimes parents just wish it’d stop. Tragically, it does stop. By middle school they’ve pretty much stopped asking. It’s no coincidence that this same time is when student motivation and engagement plummet. They didn’t stop asking questions because they lost interest: it’s the other way around. They lost interest because they stopped asking questions.
Having studied the childhoods of highly creative people for decades, Claremont Graduate University’s Mihaly Csikszentmihalyi and University of Northern Iowa’s Gary G. Gute found highly creative adults tended to grow up in families embodying opposites. Parents encouraged uniqueness, yet provided stability. They were highly responsive to kids’ needs, yet challenged kids to develop skills. This resulted in a sort of adaptability: in times of anxiety, clear rules could reduce chaos—yet when kids were bored, they could seek change, too. In the space between anxiety and boredom was where creativity flourished.
It’s also true that highly creative adults frequently grew up with hardship. Hardship by itself doesn’t lead to creativity, but it does force kids to become more flexible—and flexibility helps with creativity.
In early childhood, distinct types of free play are associated with high creativity. Preschoolers who spend more time in role-play (acting out characters) have higher measures of creativity: voicing someone else’s point of view helps develop their ability to analyze situations from different perspectives. When playing alone, highly creative first graders may act out strong negative emotions: they’ll be angry, hostile, anguished. The hypothesis is that play is a safe harbor to work through forbidden thoughts and emotions.
In middle childhood, kids sometimes create paracosms—fantasies of entire alternative worlds. Kids revisit their paracosms repeatedly, sometimes for months, and even create languages spoken there. This type of play peaks at age 9 or 10, and it’s a very strong sign of future creativity. A Michigan State University study of MacArthur “genius award” winners found a remarkably high rate of paracosm creation in their childhoods.
From fourth grade on, creativity no longer occurs in a vacuum; researching and studying become an integral part of coming up with useful solutions. But this transition isn't easy. As school stuffs more complex information into their heads, kids get overloaded, and creativity suffers. When creative children have a supportive teacher—someone tolerant of unconventional answers, occasional disruptions, or detours of curiosity—they tend to excel. When they don't, they tend to underperform, dropping out of high school or leaving college unfinished at high rates.
They’re quitting because they’re discouraged and bored, not because they’re dark, depressed, anxious, or neurotic. It’s a myth that creative people have these traits. (Those traits actually shut down creativity; they make people less open to experience and less interested in novelty.) Rather, creative people, for the most part, exhibit active moods and positive affect. They’re not particularly happy—contentment is a kind of complacency creative people rarely have. But they’re engaged, motivated, and open to the world.
The new view is that creativity is part of normal brain function. Some scholars go further, arguing that lack of creativity—not having loads of it—is the real risk factor. In his research, Runco asks college students, “Think of all the things that could interfere with graduating from college.” Then he instructs them to pick one of those items and to come up with as many solutions for that problem as possible. This is a classic divergent-convergent creativity challenge. A subset of respondents, like the proverbial Murphy, quickly list every imaginable way things can go wrong. But they demonstrate a complete lack of flexibility in finding creative solutions. It’s this inability to conceive of alternative approaches that leads to despair. Runco’s two questions predict suicide ideation—even when controlling for preexisting levels of depression and anxiety.
In Runco’s subsequent research, those who do better in both problem-finding and problem-solving have better relationships. They are more able to handle stress and overcome the bumps life throws in their way. A similar study of 1,500 middle schoolers found that those high in creative self-efficacy had more confidence about their future and ability to succeed. They were sure that their ability to come up with alternatives would aid them, no matter what problems would arise.
When he was 30 years old, Ted Schwarzrock was looking for an alternative. He was hardly on track to become the prototype of Torrance's longitudinal study. He wasn't artistic when young, and his family didn't recognize his creativity or nurture it. The son of a dentist and a speech pathologist, he had been pushed into medical school, where he felt stifled and had frequent run-ins with professors and bosses. But eventually, he found a way to combine his creativity and medical expertise: inventing new medical technologies.
Today, Schwarzrock is independently wealthy—he founded and sold three medical-products companies and was a partner in three more. His innovations in health care have been wide ranging, from a portable respiratory oxygen device to skin-absorbing anti-inflammatories to insights into how bacteria become antibiotic-resistant. His latest project could bring down the cost of spine-surgery implants 50 percent. “As a child, I never had an identity as a ‘creative person,’ ” Schwarzrock recalls. “But now that I know, it helps explain a lot of what I felt and went through.”
Creativity has always been prized in American society, but it’s never really been understood. While our creativity scores decline unchecked, the current national strategy for creativity consists of little more than praying for a Greek muse to drop by our houses. The problems we face now, and in the future, simply demand that we do more than just hope for inspiration to strike. Fortunately, the science can help: we know the steps to lead that elusive muse right to our doors.
Syria Bans Full Islamic Face Veils At Universities
By Albert Aji and Elizabeth A. Kennedy
http://www.washingtonpost.com/wp-dyn/content/article/2010/07/19/AR2010071902268.html?wprss=rss_world/wires
Syria has forbidden the country's students and teachers from wearing the niqab - the full Islamic veil that reveals only a woman's eyes - taking aim at a garment many see as political.
The ban shows a rare point of agreement between Syria's secular, authoritarian government and the democracies of Europe: Both view the niqab as a potentially destabilizing threat.
"We have given directives to all universities to ban niqab-wearing women from registering," a government official in Damascus told The Associated Press on Monday.
The order affects both public and private universities and aims to protect Syria's secular identity, said the official, who spoke on condition of anonymity because he was not authorized to speak publicly about the issue. Hundreds of primary school teachers who were wearing the niqab at government-run schools were transferred last month to administrative jobs, he added.
The ban, issued Sunday by the Education Ministry, does not affect the hijab, or headscarf, which is far more common in Syria than the niqab's billowing black robes.
Syria is the latest in a string of nations from Europe to the Middle East to weigh in on the veil, perhaps the most visible symbol of conservative Islam. Veils have spread in other secular-leaning Arab countries, such as Egypt, Jordan and Lebanon, with Jordan's government trying to discourage them by playing up reports of robbers who wear veils as masks.
Turkey bans Muslim headscarves in universities, with many saying attempts to allow them in schools amount to an attack on modern Turkey's secular laws.
The issue has been debated across Europe, where France, Spain, Belgium and the Netherlands are considering banning the niqab on the grounds it is degrading to women.
Last week, France's lower house of parliament overwhelmingly approved a ban on both the niqab and the burqa, which covers even a woman's eyes, in an effort to define and protect French values - a move that angered many in the country's large Muslim community.
The measure goes before the Senate in September; its biggest hurdle could come later, when France's constitutional watchdog scrutinizes it. An earlier, controversial 2004 French law prohibited Muslim headscarves and other "ostentatious" religious symbols in the classrooms of French primary and secondary public schools.
Opponents say such bans violate freedom of religion and personal choice, and will stigmatize all Muslims.
In Damascus, a 19-year-old university student who would give only her first name, Duaa, said she hopes to continue wearing her niqab to classes when the next term begins in the fall, despite the ban.
Otherwise, she said, she will not be able to study.
"The niqab is a religious obligation," said the woman, who would not give her surname because she was uncomfortable speaking out against the ban. "I cannot go without it."
Nadia, a 44-year-old science teacher in Damascus who was reassigned last month because of her veil, said: "Wearing my niqab is a personal decision."
"It reflects my freedom," she said, also declining to give her full name.
In European countries, particularly France, the debate has turned on questions of how to integrate immigrants and balance a minority's rights with secular opinion that the garb is an affront to women.
But in the Middle East - particularly Syria and Egypt, where there have been efforts to ban the niqab in the dorms of public universities - experts say the issue underscores the gulf between the secular elite and largely impoverished lower classes who find solace in religion.
Some observers say the bans also stem in part from fear of dissent.
The niqab is not widespread in Syria, although it has become more common in recent years, a development that has not gone unnoticed by the authoritarian government.
"We are witnessing a rapid income gap growing in Syria - there is a wealthy ostentatious class of people who are making money and wearing European clothes," said Joshua Landis, an American professor and Syria expert who runs a blog called Syria Comment.
The lower classes are feeling the squeeze, he said.
"It's almost inevitable that there's going to be backlash. The worry is that it's going to find its expression in greater Islamic radicalism," Landis said.
Four decades of secular rule under the Baath Party have largely muted sectarian differences in Syria, although the state is quick to quash any dissent. In the 1980s, Syria crushed a bloody campaign by Sunni militants to topple the regime of then-President Hafez Assad.
The veil is linked to Salafism, a movement that models itself on early Islam with a doctrine that is similar to Saudi Arabia's. In the broad spectrum of Islamic thought, Salafism is on the extreme conservative end.
In Gaza, radical Muslim groups encourage women to cover their faces and even conceal the shape of their shoulders by using layers of drapes.
It's a mistake to view the niqab as a "personal freedom," Bassam Qadhi, a Syrian women's rights activist, told local media recently.
"It is rather a declaration of extremism," Qadhi said.
My Biggest Mistake in the White House
Failing to refute charges that Bush lied us into war has hurt our country.
By Karl Rove
http://online.wsj.com/article/SB10001424052748704518904575365793062101552.html?mod=WSJ_hp_mostpop_read
Seven years ago today, in a speech on the Iraq war, Sen. Ted Kennedy fired the first shot in an all-out assault on President George W. Bush's integrity. "All the evidence points to the conclusion," Kennedy said, that the Bush administration "put a spin on the intelligence and a spin on the truth." Later that day Senate Minority Leader Tom Daschle told reporters Mr. Bush needed "to be forthcoming" about the absence of weapons of mass destruction (WMD).
Thus began a shameful episode in our political life whose poisonous fruits are still with us.
The next morning, Democratic presidential candidates John Kerry and John Edwards joined in. Sen. Kerry said, "It is time for a president who will face the truth and tell the truth." Mr. Edwards chimed in, "The administration has a problem with the truth."
The battering would continue, and it was a monument to hypocrisy and cynicism. All these Democrats had said, like Mr. Bush did, that Saddam Hussein possessed WMD. Of the 110 House and Senate Democrats who voted in October 2002 to authorize the use of force against his regime, 67 said in congressional debate that Saddam had these weapons. This didn't keep Democrats from later alleging something they knew was false—that the president had lied America into war.
Senate Intelligence Chairman Bob Graham organized a bipartisan letter in December 2001 warning Mr. Bush that Saddam's "biological, chemical and nuclear weapons programs . . . may be back to pre-Gulf War status," and enhanced by "longer-range missiles that will threaten the United States and our allies." Yet two years later, he called for Mr. Bush's impeachment for having said Saddam had WMD.
On July 9, 2004, Mr. Graham's fellow Democrat on Senate Intelligence, Jay Rockefeller, charged that the Bush administration "at all levels . . . used bad information to bolster the case for war." But in his remarks on Oct. 10, 2002, supporting the war resolution, he said that "Saddam's existing biological and chemical weapons capabilities pose real threats to America."
Even Kennedy, who opposed the war resolution, nonetheless said the month before the vote that Saddam's "pursuit of lethal weapons of mass destruction cannot be tolerated." But he warned if force were employed, the Iraqi dictator "may decide he has nothing to lose by using weapons of mass destruction himself or by sharing them with terrorists."
Then there was Al Gore, who charged on June 24, 2004, that Mr. Bush spent "prodigious amounts of energy convincing people of lies" and accused him of treason, bellowing that Mr. Bush "betrayed his country." Yet just a month before the war resolution debate, the former vice president said, "We know that [Saddam] has stored away secret supplies of biological and chemical weapons throughout his country."
Top Democrats led their party in making the "Bush lied, people died" charge because they wanted to defeat him in 2004. That didn't happen. Several bipartisan commissions would later catalogue the serious errors in the intelligence on which Mr. Bush and Democrats relied. But these commissions, particularly the Silberman-Robb report of March 31, 2005, found that the "Bush lied" charge was false. Still, the attacks hurt: When they began, less than a third of Americans believed the charge. Two years later, polls showed that just over half did.
The damage extended beyond Mr. Bush's presidency. The attacks on Mr. Bush poisoned America's political discourse. Saying the commander-in-chief intentionally lied America into war is about the most serious accusation that can be leveled at a president. The charge was false—and it opened the way for politicians in both parties to move the debate from differences over issues into ad hominem attacks.
At the time, we in the Bush White House discussed responding but decided not to relitigate the past. That was wrong and my mistake: I should have insisted to the president that this was a dagger aimed at his administration's heart. What Democrats started seven years ago left us less united as a nation to confront foreign challenges and overcome America's enemies.
We know President Bush did not intentionally mislead the nation. Saddam Hussein was deposed and eventually hanged for his crimes. Iraq is a democracy and an ally instead of an enemy of America. Al Qaeda suffered tremendous blows in the "land between the two rivers." But Democrats lost more than the election in 2004. In telling lie after lie, week after week, many lost their honor and blackened their reputations.
Mr. Rove is the former senior adviser and deputy chief of staff to President George W. Bush.