NYT Review: ‘Let It Fall: Los Angeles 1982-1992,’ a Wrenching John Ridley Film

Teeming with acts both heroic and reprehensible, John Ridley’s wrenchingly humane documentary, “Let It Fall: Los Angeles 1982-1992,” reveals the Los Angeles riots as the almost inevitable culmination of a decade of heightening racial tensions.

To that end, this multiethnic oral history allows politicians and the police, victims and survivors who witnessed the events of April 29 and 30, 1992, to expose a relentless accretion of official decisions and public resentments. We’re almost 90 minutes in before the uprising begins, but not a second feels unnecessary: Rather, some sections are so powerfully elucidated by the movie’s commitment to context and nuance that even too-familiar tragedies — like the agonizing beatings of Rodney King and Reginald Denny — arrive freighted with fresh insight.

“It was truly a war like no other,” says Jung Hui Lee, a Korean-American shopkeeper whose 18-year-old son, Edward Jae Song Lee, was mistaken for a looter and shot while trying to protect another business. His death, like others in the film, is indicative of a chaos that defies a simple black-and-white explanation.

Drawing from multiple visual sources, Mr. Ridley follows the rise of gang membership and drug use, and the troubling evolution of the policing tactic known as “pain compliance” (an escalating series of actions designed to subdue a suspect). Neither blame nor absolution entices him as, again and again (eased by Colin Rich’s masterly editing), he coaxes forth the perfect, shining moment that speaks volumes.

“He was only 20,” one woman says, distractedly, of the boyfriend who died from a police chokehold, as if she were fully registering his youth for the first time. Similarly, a telling encounter with a biracial juror plays like a spotlight on the complexities of tribal loyalty. But Mr. Ridley (whose terrific anthology series, “American Crime,” is currently in its third season on ABC) saves his most devastating device for last, sideswiping us with the realization that someone who has already touched our hearts might also, at one time, have turned our stomachs.

Obama’s Female Staffers Came Up With a Genius Strategy to Make Sure Their Voices Were Heard

Under Obama, women are in the room where it happens. Photo: Pete Souza/The White House

When President Obama first took office, the White House wasn’t exactly the friendliest place for female staffers. Most of Obama’s senior staffers — such as former chief of staff Rahm Emanuel and former economic adviser Lawrence Summers — were men who’d worked on his campaign and subsequently filled the administration’s top jobs.

“If you didn’t come in from the campaign, it was a tough circle to break into,” Anita Dunn, who served as White House communications director until November 2009, told the Washington Post. “Given the makeup of the campaign, there were just more men than women.”

Susan Rice, who’s currently the national security adviser, said she (and other women) had to shoulder their way into important conversations: “It’s not pleasant to have to appeal to a man to say, ‘Include me in that meeting.’”

And even when they’d made it into the room, female staffers were sometimes overlooked. So they banded together (shine theory!) and came up with a system to make sure they were heard:

Female staffers adopted a meeting strategy they called “amplification”: When a woman made a key point, other women would repeat it, giving credit to its author. This forced the men in the room to recognize the contribution — and denied them the chance to claim the idea as their own.

“We just started doing it, and made a purpose of doing it. It was an everyday thing,” said one former Obama aide who requested anonymity to speak frankly. Obama noticed, she and others said, and began calling more often on women and junior aides.

As the Post points out, things have gotten much better for female staffers in Obama’s second term. There’s an even gender split among his top aides, and half of all White House departments are headed by women. “I think having a critical mass makes a difference,” White House senior adviser Valerie Jarrett said. “It’s fair to say that there was a lot of testosterone flowing in those early days. Now we have a little more estrogen that provides a counterbalance.”

The end of privatization in the prison system

Interview with Dr. Randy Stoecker, Professor of Community and Environmental Sociology, University of Wisconsin, Madison

Theodore Roszak: Towards an Eco-Psychology

Pondering Trigger Warnings…

This is an extremely timely article as we begin the Fall semester of 2015. A lot of this resonates with me and some of the struggles I have in the classroom. How do we teach about the perils of society without confronting students with the trauma that exists within it? How do we publicly problem-solve and push for policy changes if we don’t talk about injustices? At the same time, I walk across campus and religious right groups are carrying inflammatory signs calling female students “sluts” and yelling “you’re a faggot and you’re going to hell.” “Pro-life” groups set up displays every semester in the middle of campus of Photoshopped, mutilated fetuses and hand out pamphlets featuring the carnage of abortion. Why is it wrong for me to teach but okay for them to preach? Ah, freedom of speech, says the administration! I feel compelled to carry on and guide students to think critically and collect evidence to formulate a coherent, informed argument. Isn’t that what we are charged to do?

The Coddling of the American Mind

The Atlantic (theatlantic.com) by Greg Lukianoff and Jonathan Haidt

Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.

Two terms have risen quickly from obscurity into common campus parlance. Microaggressions are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless. For example, by some campus guidelines, it is a microaggression to ask an Asian American or Latino American “Where were you born?,” because this implies that he or she is not a real American. Trigger warnings are alerts that professors are expected to issue if something in a course might cause a strong emotional response. For example, some students have called for warnings that Chinua Achebe’s Things Fall Apart describes racial violence and that F. Scott Fitzgerald’s The Great Gatsby portrays misogyny and physical abuse, so that students who have been previously victimized by racism or domestic violence can choose to avoid these works, which they believe might “trigger” a recurrence of past trauma.

Some recent campus actions border on the surreal. In April, at Brandeis University, the Asian American student association sought to raise awareness of microaggressions against Asians through an installation on the steps of an academic hall. The installation gave examples of microaggressions such as “Aren’t you supposed to be good at math?” and “I’m colorblind! I don’t see race.” But a backlash arose among other Asian American students, who felt that the display itself was a microaggression. The association removed the installation, and its president wrote an e-mail to the entire student body apologizing to anyone who was “triggered or hurt by the content of the microaggressions.”

This new climate is slowly being institutionalized, and is affecting what can be said in the classroom, even as a basis for discussion. During the 2014–15 school year, for instance, the deans and department chairs at the 10 University of California system schools were presented by administrators at faculty leader-training sessions with examples of microaggressions. The list of offensive statements included: “America is the land of opportunity” and “I believe the most qualified person should get the job.”

The press has typically described these developments as a resurgence of political correctness. That’s partly right, although there are important differences between what’s happening now and what happened in the 1980s and ’90s. That movement sought to restrict speech (specifically hate speech aimed at marginalized groups), but it also challenged the literary, philosophical, and historical canon, seeking to widen it by including more-diverse perspectives. The current movement is largely about emotional well-being. More than the last, it presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally. You might call this impulse vindictive protectiveness. It is creating a culture in which everyone must think twice before speaking up, lest they face charges of insensitivity, aggression, or worse.

We have been studying this development for a while now, with rising alarm. (Greg Lukianoff is a constitutional lawyer and the president and CEO of the Foundation for Individual Rights in Education, which defends free speech and academic freedom on campus, and has advocated for students and faculty involved in many of the incidents this article describes; Jonathan Haidt is a social psychologist who studies the American culture wars. The stories of how we each came to this subject can be read here.) The dangers that these trends pose to scholarship and to the quality of American universities are significant; we could write a whole essay detailing them. But in this essay we focus on a different question: What are the effects of this new protectiveness on the students themselves? Does it benefit the people it is supposed to help? What exactly are students learning when they spend four years or more in a community that polices unintentional slights, places warning labels on works of classic literature, and in many other ways conveys the sense that words can be forms of violence that require strict control by campus authorities, who are expected to act as both protectors and prosecutors?

There’s a saying common in education circles: Don’t teach students what to think; teach them how to think. The idea goes back at least as far as Socrates. Today, what we call the Socratic method is a way of teaching that fosters critical thinking, in part by encouraging students to question their own unexamined beliefs, as well as the received wisdom of those around them. Such questioning sometimes leads to discomfort, and even to anger, on the way to understanding.

But vindictive protectiveness teaches students to think in a very different way. It prepares them poorly for professional life, which often demands intellectual engagement with people and ideas one might find uncongenial or wrong. The harm may be more immediate, too. A campus culture devoted to policing speech and punishing speakers is likely to engender patterns of thought that are surprisingly similar to those long identified by cognitive behavioral therapists as causes of depression and anxiety. The new protectiveness may be teaching students to think pathologically.

How Did We Get Here?

It’s difficult to know exactly why vindictive protectiveness has burst forth so powerfully in the past few years. The phenomenon may be related to recent changes in the interpretation of federal antidiscrimination statutes (about which more later). But the answer probably involves generational shifts as well. Childhood itself has changed greatly during the past generation. Many Baby Boomers and Gen Xers can remember riding their bicycles around their hometowns, unchaperoned by adults, by the time they were 8 or 9 years old. In the hours after school, kids were expected to occupy themselves, getting into minor scrapes and learning from their experiences. But “free range” childhood became less common in the 1980s. The surge in crime from the ’60s through the early ’90s made Baby Boomer parents more protective than their own parents had been. Stories of abducted children appeared more frequently in the news, and in 1984, images of them began showing up on milk cartons. In response, many parents pulled in the reins and worked harder to keep their children safe.

The flight to safety also happened at school. Dangerous play structures were removed from playgrounds; peanut butter was banned from student lunches. After the 1999 Columbine massacre in Colorado, many schools cracked down on bullying, implementing “zero tolerance” policies. In a variety of ways, children born after 1980—the Millennials—got a consistent message from adults: life is dangerous, but adults will do everything in their power to protect you from harm, not just from strangers but from one another as well.

These same children grew up in a culture that was (and still is) becoming more politically polarized. Republicans and Democrats have never particularly liked each other, but survey data going back to the 1970s show that on average, their mutual dislike used to be surprisingly mild. Negative feelings have grown steadily stronger, however, particularly since the early 2000s. Political scientists call this process “affective partisan polarization,” and it is a very serious problem for any democracy. As each side increasingly demonizes the other, compromise becomes more difficult. A recent study shows that implicit or unconscious biases are now at least as strong across political parties as they are across races.

So it’s not hard to imagine why students arriving on campus today might be more desirous of protection and more hostile toward ideological opponents than in generations past. This hostility, and the self-righteousness fueled by strong partisan emotions, can be expected to add force to any moral crusade. A principle of moral psychology is that “morality binds and blinds.” Part of what we do when we make moral judgments is express allegiance to a team. But that can interfere with our ability to think critically. Acknowledging that the other side’s viewpoint has any merit is risky—your teammates may see you as a traitor.

Social media makes it extraordinarily easy to join crusades, express solidarity and outrage, and shun traitors. Facebook was founded in 2004, and since 2006 it has allowed children as young as 13 to join. This means that the first wave of students who spent all their teen years using Facebook reached college in 2011, and graduated from college only this year.

These first true “social-media natives” may be different from members of previous generations in how they go about sharing their moral judgments and supporting one another in moral campaigns and conflicts. We find much to like about these trends; young people today are engaged with one another, with news stories, and with prosocial endeavors to a greater degree than when the dominant technology was television. But social media has also fundamentally shifted the balance of power in relationships between students and faculty; the latter increasingly fear what students might do to their reputations and careers by stirring up online mobs against them.

We do not mean to imply simple causation, but rates of mental illness in young adults have been rising, both on campus and off, in recent decades. Some portion of the increase is surely due to better diagnosis and greater willingness to seek help, but most experts seem to agree that some portion of the trend is real. Nearly all of the campus mental-health directors surveyed in 2013 by the American College Counseling Association reported that the number of students with severe psychological problems was rising at their schools. The rate of emotional distress reported by students themselves is also high, and rising. In a 2014 survey by the American College Health Association, 54 percent of college students surveyed said that they had “felt overwhelming anxiety” in the past 12 months, up from 49 percent in the same survey just five years earlier. Students seem to be reporting more emotional crises; many seem fragile, and this has surely changed the way university faculty and administrators interact with them. The question is whether some of those changes might be doing more harm than good.

The Thinking Cure

For millennia, philosophers have understood that we don’t see life as it is; we see a version distorted by our hopes, fears, and other attachments. The Buddha said, “Our life is the creation of our mind.” Marcus Aurelius said, “Life itself is but what you deem it.” The quest for wisdom in many traditions begins with this insight. Early Buddhists and the Stoics, for example, developed practices for reducing attachments, thinking more clearly, and finding release from the emotional torments of normal mental life.

Cognitive behavioral therapy is a modern embodiment of this ancient wisdom. It is the most extensively studied nonpharmaceutical treatment of mental illness, and is used widely to treat depression, anxiety disorders, eating disorders, and addiction. It can even be of help to schizophrenics. No other form of psychotherapy has been shown to work for a broader range of problems. Studies have generally found that it is as effective as antidepressant drugs (such as Prozac) in the treatment of anxiety and depression. The therapy is relatively quick and easy to learn; after a few months of training, many patients can do it on their own. Unlike drugs, cognitive behavioral therapy keeps working long after treatment is stopped, because it teaches thinking skills that people can continue to use.

The goal is to minimize distorted thinking and see the world more accurately. You start by learning the names of the dozen or so most common cognitive distortions (such as overgeneralizing, discounting positives, and emotional reasoning; see the list at the bottom of this article). Each time you notice yourself falling prey to one of them, you name it, describe the facts of the situation, consider alternative interpretations, and then choose an interpretation of events more in line with those facts. Your emotions follow your new interpretation. In time, this process becomes automatic. When people improve their mental hygiene in this way—when they free themselves from the repetitive irrational thoughts that had previously filled so much of their consciousness—they become less depressed, anxious, and angry.

The parallel to formal education is clear: cognitive behavioral therapy teaches good critical-thinking skills, the sort that educators have striven for so long to impart. By almost any definition, critical thinking requires grounding one’s beliefs in evidence rather than in emotion or desire, and learning how to search for and evaluate evidence that might contradict one’s initial hypothesis. But does campus life today foster critical thinking? Or does it coax students to think in more-distorted ways?

Let’s look at recent trends in higher education in light of the distortions that cognitive behavioral therapy identifies. We will draw the names and descriptions of these distortions from David D. Burns’s popular book Feeling Good, as well as from the second edition of Treatment Plans and Interventions for Depression and Anxiety Disorders, by Robert L. Leahy, Stephen J. F. Holland, and Lata K. McGinn.

Higher Education’s Embrace of “Emotional Reasoning”

Burns defines emotional reasoning as assuming “that your negative emotions necessarily reflect the way things really are: ‘I feel it, therefore it must be true.’ ” Leahy, Holland, and McGinn define it as letting “your feelings guide your interpretation of reality.” But, of course, subjective feelings are not always trustworthy guides; unrestrained, they can cause people to lash out at others who have done nothing wrong. Therapy often involves talking yourself down from the idea that each of your emotional responses represents something true or important.

Emotional reasoning dominates many campus debates and discussions. A claim that someone’s words are “offensive” is not just an expression of one’s own subjective feeling of offendedness. It is, rather, a public charge that the speaker has done something objectively wrong. It is a demand that the speaker apologize or be punished by some authority for committing an offense.

There have always been some people who believe they have a right not to be offended. Yet throughout American history—from the Victorian era to the free-speech activism of the 1960s and ’70s—radicals have pushed boundaries and mocked prevailing sensibilities. Sometime in the 1980s, however, college campuses began to focus on preventing offensive speech, especially speech that might be hurtful to women or minority groups. The sentiment underpinning this goal was laudable, but it quickly produced some absurd results.

Among the most famous early examples was the so-called water-buffalo incident at the University of Pennsylvania. In 1993, the university charged an Israeli-born student with racial harassment after he yelled “Shut up, you water buffalo!” to a crowd of black sorority women that was making noise at night outside his dorm-room window. Many scholars and pundits at the time could not see how the term water buffalo (a rough translation of a Hebrew insult for a thoughtless or rowdy person) was a racial slur against African Americans, and as a result, the case became international news.

Claims of a right not to be offended have continued to arise since then, and universities have continued to privilege them. In a particularly egregious 2008 case, for instance, Indiana University–Purdue University at Indianapolis found a white student guilty of racial harassment for reading a book titled Notre Dame vs. the Klan. The book honored student opposition to the Ku Klux Klan when it marched on Notre Dame in 1924. Nonetheless, the picture of a Klan rally on the book’s cover offended at least one of the student’s co-workers (he was a janitor as well as a student), and that was enough for a guilty finding by the university’s Affirmative Action Office.

These examples may seem extreme, but the reasoning behind them has become more commonplace on campus in recent years. Last year, at the University of St. Thomas, in Minnesota, an event called Hump Day, which would have allowed people to pet a camel, was abruptly canceled. Students had created a Facebook group where they protested the event for animal cruelty, for being a waste of money, and for being insensitive to people from the Middle East. The inspiration for the camel had almost certainly come from a popular TV commercial in which a camel saunters around an office on a Wednesday, celebrating “hump day”; it was devoid of any reference to Middle Eastern peoples. Nevertheless, the group organizing the event announced on its Facebook page that the event would be canceled because the “program [was] dividing people and would make for an uncomfortable and possibly unsafe environment.”

Because there is a broad ban in academic circles on “blaming the victim,” it is generally considered unacceptable to question the reasonableness (let alone the sincerity) of someone’s emotional state, particularly if those emotions are linked to one’s group identity. The thin argument “I’m offended” becomes an unbeatable trump card. This leads to what Jonathan Rauch, a contributing editor at this magazine, calls the “offendedness sweepstakes,” in which opposing parties use claims of offense as cudgels. In the process, the bar for what we consider unacceptable speech is lowered further and further.

Since 2013, new pressure from the federal government has reinforced this trend. Federal antidiscrimination statutes regulate on-campus harassment and unequal treatment based on sex, race, religion, and national origin. Until recently, the Department of Education’s Office for Civil Rights acknowledged that speech must be “objectively offensive” before it could be deemed actionable as sexual harassment—it would have to pass the “reasonable person” test. To be prohibited, the office wrote in 2003, allegedly harassing speech would have to go “beyond the mere expression of views, words, symbols or thoughts that some person finds offensive.”

But in 2013, the Departments of Justice and Education greatly broadened the definition of sexual harassment to include verbal conduct that is simply “unwelcome.” Out of fear of federal investigations, universities are now applying that standard—defining unwelcome speech as harassment—not just to sex, but to race, religion, and veteran status as well. Everyone is supposed to rely upon his or her own subjective feelings to decide whether a comment by a professor or a fellow student is unwelcome, and therefore grounds for a harassment claim. Emotional reasoning is now accepted as evidence.

If our universities are teaching students that their emotions can be used effectively as weapons—or at least as evidence in administrative proceedings—then they are teaching students to nurture a kind of hypersensitivity that will lead them into countless drawn-out conflicts in college and beyond. Schools may be training students in thinking styles that will damage their careers and friendships, along with their mental health.

Fortune-Telling and Trigger Warnings

Burns defines fortune-telling as “anticipat[ing] that things will turn out badly” and feeling “convinced that your prediction is an already-established fact.” Leahy, Holland, and McGinn define it as “predict[ing] the future negatively” or seeing potential danger in an everyday situation. The recent spread of demands for trigger warnings on reading assignments with provocative content is an example of fortune-telling.

The idea that words (or smells or any sensory input) can trigger searing memories of past trauma—and intense fear that it may be repeated—has been around at least since World War I, when psychiatrists began treating soldiers for what is now called post-traumatic stress disorder. But explicit trigger warnings are believed to have originated much more recently, on message boards in the early days of the Internet. Trigger warnings became particularly prevalent in self-help and feminist forums, where they allowed readers who had suffered from traumatic events like sexual assault to avoid graphic content that might trigger flashbacks or panic attacks. Search-engine trends indicate that the phrase broke into mainstream use online around 2011, spiked in 2014, and reached an all-time high in 2015. The use of trigger warnings on campus appears to have followed a similar trajectory; seemingly overnight, students at universities across the country have begun demanding that their professors issue warnings before covering material that might evoke a negative emotional response.

In 2013, a task force composed of administrators, students, recent alumni, and one faculty member at Oberlin College, in Ohio, released an online resource guide for faculty (subsequently retracted in the face of faculty pushback) that included a list of topics warranting trigger warnings. These topics included classism and privilege, among many others. The task force recommended that materials that might trigger negative reactions among students be avoided altogether unless they “contribute directly” to course goals, and suggested that works that were “too important to avoid” be made optional.

It’s hard to imagine how novels illustrating classism and privilege could provoke or reactivate the kind of terror that is typically implicated in PTSD. Rather, trigger warnings are sometimes demanded for a long list of ideas and attitudes that some students find politically offensive, in the name of preventing other students from being harmed. This is an example of what psychologists call “motivated reasoning”—we spontaneously generate arguments for conclusions we want to support. Once you find something hateful, it is easy to argue that exposure to the hateful thing could traumatize some other people. You believe that you know how others will react, and that their reaction could be devastating. Preventing that devastation becomes a moral obligation for the whole community. Books for which students have called publicly for trigger warnings within the past couple of years include Virginia Woolf’s Mrs. Dalloway (at Rutgers, for “suicidal inclinations”) and Ovid’s Metamorphoses (at Columbia, for sexual assault).

Jeannie Suk’s New Yorker essay described the difficulties of teaching rape law in the age of trigger warnings. Some students, she wrote, have pressured their professors to avoid teaching the subject in order to protect themselves and their classmates from potential distress. Suk compares this to trying to teach “a medical student who is training to be a surgeon but who fears that he’ll become distressed if he sees or handles blood.”

However, there is a deeper problem with trigger warnings. According to the most-basic tenets of psychology, the very idea of helping people with anxiety disorders avoid the things they fear is misguided. A person who is trapped in an elevator during a power outage may panic and think she is going to die. That frightening experience can change neural connections in her amygdala, leading to an elevator phobia. If you want this woman to retain her fear for life, you should help her avoid elevators.

But if you want to help her return to normalcy, you should take your cues from Ivan Pavlov and guide her through a process known as exposure therapy. You might start by asking the woman to merely look at an elevator from a distance—standing in a building lobby, perhaps—until her apprehension begins to subside. If nothing bad happens while she’s standing in the lobby—if the fear is not “reinforced”—then she will begin to learn a new association: elevators are not dangerous. (This reduction in fear during exposure is called habituation.) Then, on subsequent days, you might ask her to get closer, and on later days to push the call button, and eventually to step in and go up one floor. This is how the amygdala can get rewired again to associate a previously feared situation with safety or normalcy.

Students who call for trigger warnings may be correct that some of their peers are harboring memories of trauma that could be reactivated by course readings. But they are wrong to try to prevent such reactivations. Students with PTSD should of course get treatment, but they should not try to avoid normal life, with its many opportunities for habituation. Classroom discussions are safe places to be exposed to incidental reminders of trauma (such as the word violate). A discussion of violence is unlikely to be followed by actual violence, so it is a good way to help students change the associations that are causing them discomfort. And they’d better get their habituation done in college, because the world beyond college will be far less willing to accommodate requests for trigger warnings and opt-outs.

The expansive use of trigger warnings may also foster unhealthy mental habits in the vastly larger group of students who do not suffer from PTSD or other anxiety disorders. People acquire their fears not just from their own past experiences, but from social learning as well. If everyone around you acts as though something is dangerous—elevators, certain neighborhoods, novels depicting racism—then you are at risk of acquiring that fear too. The psychiatrist Sarah Roff pointed this out last year in an online article for The Chronicle of Higher Education. “One of my biggest concerns about trigger warnings,” Roff wrote, “is that they will apply not just to those who have experienced trauma, but to all students, creating an atmosphere in which they are encouraged to believe that there is something dangerous or damaging about discussing difficult aspects of our history.”

In an article published last year by Inside Higher Ed, seven humanities professors wrote that the trigger-warning movement was “already having a chilling effect on [their] teaching and pedagogy.” They reported their colleagues’ receiving “phone calls from deans and other administrators investigating student complaints that they have included ‘triggering’ material in their courses, with or without warnings.” A trigger warning, they wrote, “serves as a guarantee that students will not experience unexpected discomfort and implies that if they do, a contract has been broken.” When students come to expect trigger warnings for any material that makes them uncomfortable, the easiest way for faculty to stay out of trouble is to avoid material that might upset the most sensitive student in the class.

Magnification, Labeling, and Microaggressions

Burns defines magnification as “exaggerat[ing] the importance of things,” and Leahy, Holland, and McGinn define labeling as “assign[ing] global negative traits to yourself and others.” The recent collegiate trend of uncovering allegedly racist, sexist, classist, or otherwise discriminatory microaggressions doesn’t incidentally teach students to focus on small or accidental slights. Its purpose is to get students to focus on them and then relabel the people who have made such remarks as aggressors.

The term microaggression originated in the 1970s and referred to subtle, often unconscious racist affronts. The definition has expanded in recent years to include anything that can be perceived as discriminatory on virtually any basis. For example, in 2013, a student group at UCLA staged a sit-in during a class taught by Val Rust, an education professor. The group read a letter aloud expressing their concerns about the campus’s hostility toward students of color. Although Rust was not explicitly named, the group quite clearly criticized his teaching as microaggressive. In the course of correcting his students’ grammar and spelling, Rust had noted that a student had wrongly capitalized the first letter of the word indigenous. Lowercasing the capital I was an insult to the student and her ideology, the group claimed.

Even joking about microaggressions can be seen as an aggression, warranting punishment. Last fall, Omar Mahmood, a student at the University of Michigan, wrote a satirical column for a conservative student publication, The Michigan Review, poking fun at what he saw as a campus tendency to perceive microaggressions in just about anything. Mahmood was also employed at the campus newspaper, The Michigan Daily. The Daily’s editors said that the way Mahmood had “satirically mocked the experiences of fellow Daily contributors and minority communities on campus … created a conflict of interest.” The Daily terminated Mahmood after he described the incident to two Web sites, The College Fix and The Daily Caller. A group of women later vandalized Mahmood’s doorway with eggs, hot dogs, gum, and notes with messages such as “Everyone hates you, you violent prick.” When speech comes to be seen as a form of violence, vindictive protectiveness can justify a hostile, and perhaps even violent, response.

In March, the student government at Ithaca College, in upstate New York, went so far as to propose the creation of an anonymous microaggression-reporting system. Student sponsors envisioned some form of disciplinary action against “oppressors” engaged in belittling speech. One of the sponsors of the program said that while “not … every instance will require trial or some kind of harsh punishment,” she wanted the program to be “record-keeping but with impact.”

Surely people make subtle or thinly veiled racist or sexist remarks on college campuses, and it is right for students to raise questions and initiate discussions about such cases. But the increased focus on microaggressions coupled with the endorsement of emotional reasoning is a formula for a constant state of outrage, even toward well-meaning speakers trying to engage in genuine discussion.

What are we doing to our students if we encourage them to develop extra-thin skin in the years just before they leave the cocoon of adult protection and enter the workforce? Would they not be better prepared to flourish if we taught them to question their own emotional reactions, and to give people the benefit of the doubt?

Teaching Students to Catastrophize and Have Zero Tolerance

Burns defines catastrophizing as a kind of magnification that turns “commonplace negative events into nightmarish monsters.” Leahy, Holland, and McGinn define it as believing “that what has happened or will happen” is “so awful and unbearable that you won’t be able to stand it.” Requests for trigger warnings involve catastrophizing, but this way of thinking colors other areas of campus thought as well.

Catastrophizing rhetoric about physical danger is employed by campus administrators more commonly than you might think—sometimes, it seems, with cynical ends in mind. For instance, last year administrators at Bergen Community College, in New Jersey, suspended Francis Schmidt, a professor, after he posted a picture of his daughter on his Google+ account. The photo showed her in a yoga pose, wearing a T-shirt that read I will take what is mine with fire & blood, a quote from the HBO show Game of Thrones. Schmidt had filed a grievance against the school about two months earlier after being passed over for a sabbatical. The quote was interpreted as a threat by a campus administrator, who received a notification after Schmidt posted the picture; it had been sent, automatically, to a whole group of contacts. According to Schmidt, a Bergen security official present at a subsequent meeting between administrators and Schmidt thought the word fire could refer to AK-47s.

Then there is the eight-year legal saga at Valdosta State University, in Georgia, where a student was expelled for protesting the construction of a parking garage by posting an allegedly “threatening” collage on Facebook. The collage described the proposed structure as a “memorial” parking garage—a joke referring to a claim by the university president that the garage would be part of his legacy. The president interpreted the collage as a threat against his life.

It should be no surprise that students are exhibiting similar sensitivity. At the University of Central Florida in 2013, for example, Hyung-il Jung, an accounting instructor, was suspended after a student reported that Jung had made a threatening comment during a review session. Jung explained to the Orlando Sentinel that the material he was reviewing was difficult, and he’d noticed the pained look on students’ faces, so he made a joke. “It looks like you guys are being slowly suffocated by these questions,” he recalled saying. “Am I on a killing spree or what?”

After the student reported Jung’s comment, a group of nearly 20 others e-mailed the UCF administration explaining that the comment had clearly been made in jest. Nevertheless, UCF suspended Jung from all university duties and demanded that he obtain written certification from a mental-health professional that he was “not a threat to [himself] or to the university community” before he would be allowed to return to campus.

All of these actions teach a common lesson: smart people do, in fact, overreact to innocuous speech, make mountains out of molehills, and seek punishment for anyone whose words make anyone else feel uncomfortable.

Mental Filtering and Disinvitation Season

As Burns defines it, mental filtering is “pick[ing] out a negative detail in any situation and dwell[ing] on it exclusively, thus perceiving that the whole situation is negative.” Leahy, Holland, and McGinn refer to this as “negative filtering,” which they define as “focus[ing] almost exclusively on the negatives and seldom notic[ing] the positives.” When applied to campus life, mental filtering allows for simpleminded demonization.

Students and faculty members in large numbers modeled this cognitive distortion during 2014’s “disinvitation season.” That’s the time of year—usually early spring—when commencement speakers are announced and when students and professors demand that some of those speakers be disinvited because of things they have said or done. According to data compiled by the Foundation for Individual Rights in Education, since 2000, at least 240 campaigns have been launched at U.S. universities to prevent public figures from appearing at campus events; most of them have occurred since 2009.

Consider two of the most prominent disinvitation targets of 2014: former U.S. Secretary of State Condoleezza Rice and the International Monetary Fund’s managing director, Christine Lagarde. Rice was the first black female secretary of state; Lagarde was the first woman to become finance minister of a G8 country and the first female head of the IMF. Both speakers could have been seen as highly successful role models for female students, and Rice for minority students as well. But the critics, in effect, discounted any possibility of something positive coming from those speeches.

Members of an academic community should of course be free to raise questions about Rice’s role in the Iraq War or to look skeptically at the IMF’s policies. But should dislike of part of a person’s record disqualify her altogether from sharing her perspectives?

If campus culture conveys the idea that visitors must be pure, with résumés that never offend generally left-leaning campus sensibilities, then higher education will have taken a further step toward intellectual homogeneity and the creation of an environment in which students rarely encounter diverse viewpoints. And universities will have reinforced the belief that it’s okay to filter out the positive. If students graduate believing that they can learn nothing from people they dislike or from those with whom they disagree, we will have done them a great intellectual disservice.

What Can We Do Now?

Attempts to shield students from words, ideas, and people that might cause them emotional discomfort are bad for the students. They are bad for the workplace, which will be mired in unending litigation if student expectations of safety are carried forward. And they are bad for American democracy, which is already paralyzed by worsening partisanship. When the ideas, values, and speech of the other side are seen not just as wrong but as willfully aggressive toward innocent victims, it is hard to imagine the kind of mutual respect, negotiation, and compromise that are needed to make politics a positive-sum game.

Rather than trying to protect students from words and ideas that they will inevitably encounter, colleges should do all they can to equip students to thrive in a world full of words and ideas that they cannot control. One of the great truths taught by Buddhism (and Stoicism, Hinduism, and many other traditions) is that you can never achieve happiness by making the world conform to your desires. But you can master your desires and habits of thought. This, of course, is the goal of cognitive behavioral therapy. With this in mind, here are some steps that might help reverse the tide of bad thinking on campus.

The biggest single step in the right direction does not involve faculty or university administrators, but rather the federal government, which should release universities from their fear of unreasonable investigation and sanctions by the Department of Education. Congress should define peer-on-peer harassment according to the Supreme Court’s definition in the 1999 case Davis v. Monroe County Board of Education. The Davis standard holds that a single comment or thoughtless remark by a student does not equal harassment; harassment requires a pattern of objectively offensive behavior by one student that interferes with another student’s access to education. Establishing the Davis standard would help eliminate universities’ impulse to police their students’ speech so carefully.

Universities themselves should try to raise consciousness about the need to balance freedom of speech with the need to make all students feel welcome. Talking openly about such conflicting but important values is just the sort of challenging exercise that any diverse but tolerant community must learn to do. Restrictive speech codes should be abandoned.

Universities should also officially and strongly discourage trigger warnings. They should endorse the American Association of University Professors’ report on these warnings, which notes, “The presumption that students need to be protected rather than challenged in a classroom is at once infantilizing and anti-intellectual.” Professors should be free to use trigger warnings if they choose to do so, but by explicitly discouraging the practice, universities would help fortify the faculty against student requests for such warnings.

Finally, universities should rethink the skills and values they most want to impart to their incoming students. At present, many freshman-orientation programs try to raise student sensitivity to a nearly impossible level. Teaching students to avoid giving unintentional offense is a worthy goal, especially when the students come from many different cultural backgrounds. But students should also be taught how to live in a world full of potential offenses. Why not teach incoming students how to practice cognitive behavioral therapy? Given high and rising rates of mental illness, this simple step would be among the most humane and supportive things a university could do. The cost and time commitment could be kept low: a few group training sessions could be supplemented by Web sites or apps. But the outcome could pay dividends in many ways. For example, a shared vocabulary about reasoning, common distortions, and the appropriate use of evidence to draw conclusions would facilitate critical thinking and real debate. It would also tone down the perpetual state of outrage that seems to engulf some colleges these days, allowing students’ minds to open more widely to new ideas and new people. A greater commitment to formal, public debate on campus—and to the assembly of a more politically diverse faculty—would further serve that goal.

Thomas Jefferson, upon founding the University of Virginia, said:

This institution will be based on the illimitable freedom of the human mind. For here we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it.

We believe that this is still—and will always be—the best attitude for American universities. Faculty, administrators, students, and the federal government all have a role to play in restoring universities to their historic mission.

Posted in News, Political, WKU Related | Comments Off on Pondering Trigger Warnings…

WKU Pathways to Sustainability Conference

WKU to host free Sustainability Fest


People interested in “slow” rather than fast food or in how to create a “food forest” – both current sustainability efforts – can attend a free festival in Bowling Green next week.

The “Pathways to Sustainability Festival,” hosted by Western Kentucky University, is scheduled for Friday, April 17, and Saturday, April 18. Events will be at Downing Student Union, Room 3020; Corsair Distillery, 400 E. Main Ave.; and the new Baker Community Garden, 150 Guinn Court.

John All, WKU associate professor of geography, said that the festival – a precursor to Earth Day 2015 on April 22 – is a chance to check out new sustainability initiatives locally and across the nation.

For example, a “food forest” is created when like items are planted together in a confined area, such as a cherry tree, blackberry bushes, lettuce and potatoes, All explained. The key is putting together items that can survive together.

All said people are becoming more aware of the need to preserve resources, but they still aren’t familiar with the successful strategies available.

“We have begun to recognize the need, but the how is lacking. On Friday, we will look at the theoretical, and then on Saturday, we will get our hands in the dirt,” All said.

Organizers signed up guest speakers well known in sustainability circles, such as Jeff Poppen, also known as “the Barefoot Farmer,” a local food and agricultural biodynamics expert who lives in Tennessee; Bernie Ellis, who has been working with development of Tennessee’s medical marijuana legislation; and local artist Andee Rudloff.

There will be presentations and a roundtable discussion from 10 a.m. to 3 p.m. Friday at DSU. Speakers include Christian Ryan, WKU sustainability coordinator; All, who has an extensive background in environmental planning; Rhondell Miller of HOTEL INC; and Laura Goodwin of Slow Food Bowling Green.

The slow food movement in America is an effort to teach people how to appreciate a more relaxed pace of dining, All said.

Slow food works in concert with the locally grown food movement. WKU recently joined a locally grown food project where local growers provide fresh food to the university and the Fresh Food Company dining facility on campus.

“There are a lot of new initiatives in local food, community gardens and farmers markets in town,” All said. “This will be hands-on learning.”

From 6:30 to 9 p.m. Friday evening, Ellis will speak at Corsair Distillery, followed by distillery tours and music. On Saturday, Poppen will be part of a full day of hands-on workshops from 9 a.m. to 4 p.m. at the Baker Community Garden.

WKU Office of Sustainability is one of the sponsors. Other sponsors include Baker Arboretum, Slow Food Bowling Green, Corsair Distillery, the WKU Institute for Citizenship and Social Responsibility, and the WKU Master of Arts in Social Responsibility and Sustainable Communities. The WKU M.A. degree is an interdisciplinary program of study that provides students with tools to lead communities toward social justice and sustainability, according to the WKU website. It is designed especially for students inclined toward the humanities, social sciences and related fields, the website noted.

The Barefoot Farmer has quite a following, All said.

For the past 15 years Poppen has appeared on Nashville PBS’ television program “Volunteer Gardener,” and, for over 20 years, he has written a gardening column for the Macon County Chronicle, according to his website.

Poppen is the author of two books, “The Best of the Barefoot Farmer” Vol. 1 and Vol. 2. He runs a Community Supported Agriculture program with the food he grows using about 8 acres of his farmland at Long Hungry Creek Farm in rural Tennessee and has about 40 head of cattle.

All said Poppen has subsisted on his own farm-grown food for about 40 years.

This week, committees in the House and Senate of the Tennessee Legislature delayed until 2016 a bill to legalize marijuana for limited medicinal purposes. Ellis and others have worked for at least the last two years to convince lawmakers there to pass the legislation.

All said the festival has commitments from at least 50 people; however, up to 100 could attend.

“We want to show people what they can do for themselves,” All said.

Pathways to Sustainability Schedule

Friday, April 17th Western Kentucky University
Downing Student Union, President’s Room 3020

10:00am Laura Goodwin Welcome & Slow Food
10:15am Rhondell Miller Community Gardens & Community Development
10:45am Christian Ryan Building Sustainability & Resiliency at WKU
11:15am Lunch at FRESH provided by the Anthropocene Research Group
11:30am Round Table Discussion:
“Responding to the Anthropocene: Moving from sustainability to resilience.”
Lunch will be provided at Fresh Foods in Downing Student Union for those participating in the World Café Model round table discussion. The World Café Model will foster conversations in a relaxed, informal, and creative atmosphere aimed at producing ideas and knowledge about resilience that can be put into practice.

Facilitators:

Wolfgang Brauner
John All
Fred Siewers
Christian Ryan
Molly Beth Kerby
Gayle Mallinger

1:00pm Albert Meier The Science of Permaculture
1:30pm John All Intentional Communities
2:00pm Blake Layne Mead Production
2:30pm Taylor Hutchison Simple Computer Tools for Farming

Friday, April 17th Corsair Distillery, 400 E. Main Street, BG, KY 42101
6:00pm Gather at Corsair Distillery
6:30pm-7:35pm Bernie Ellis
7:45pm-8:45pm Tours of the Distillery and Tastings
7:45pm-9:00pm Music by Dead Broke Barons

Saturday, April 18th Baker Community Garden
9:00am Welcome & Introductions
9:10am Andee Rudloff Community Art Project
9:30am Mark Whitley Tiny House Talk
10:00am Jeff Poppen History of Farming and Biodynamics
11:30am-12:30pm Lunch on your own and Art Project Fun
12:30pm Dr. Martin Stone Grafting Demo
1:00pm WKU Beekeepers
1:30pm Jeff Poppen Soils, Minerals, Preparations, Q & A
3:30pm David Garvin Trees for Life Ceremony

4:00pm-?? Drum Circle


The Deeper Message of the Flow Hive

Very lovely

Living with bees is not about hardware, hives and management techniques any more – it is ultimately about the survival of life on earth.

In the last few days, the media worldwide have become positively besotted with a new invention that has a powerful lure: it makes removal of honey from a hive so easy that, in the words of a press release, ‘there is … the potential for remotely activated or automatic honey extraction’.  There is also the implication that it helps bees, by allowing the beekeeper to ‘harvest in a bee-friendly way’. That’s what we all want, is it not: to be bee-friendly and less disruptive?

Let us pause for a moment: does taking honey need to be disruptive? Responsible beekeepers have long found that sharing genuinely surplus honey is one of the many ways in which they can sensitively interact with the bees in their care.  It need not be in any way disruptive, either for the bees or the beekeeper. Of course, we are not referring here to large scale commercial beekeepers, whose harvesting techniques can be brutal. The Flow Hive will not appeal to such operators, being too expensive and complex.  In other words, the bee-friendly sales pitch is aimed at would-be beekeepers who want honey but no hassle.

So much for the beekeeper; what about the bees?  In truth, the thinking behind the Flow Hive entirely ignores the bees’ perspective. The essence of the gadget lies in plastic combs that can be cracked open by plastic cams, all contained within the hive. The honey then flows out of the hive.  Now there arises a further question: do bees naturally make combs from plastic and what does it mean to the organism as a whole, the ‘Bee’, to introduce such artefacts into a beehive?

While all the individual bees are essential to the whole organism, so is the comb and all the functions the comb serves.  Let’s put that another way: the comb is an integral and essential part of the wholeness of the hive. Placing artificial combs, made of plastic, into the heart of the Bee is akin to placing an artificial heart or liver inside a human being.  One would do so only in case of dire medical need. One would certainly not do so simply for the convenience of another.  But that is what the Flow Hive does. Something utterly alien to the Bee is placed into its very heart.  Why?

Let us hear the voices of others around the world as they confront this question:

Conceptually, the idea that a beehive is like a beer keg you can tap is troublesome. A beehive is a living thing, not a machine for our exploitation. I’m a natural beekeeper and feel that honey harvests must be done with caution and respect. To us, beekeeping is, at the risk of sounding a little melodramatic, a sacred vocation. We are in relationship with our backyard hive, and feel our role is to support them, and to very occasionally accept the gift of excess honey. For new beekeepers, and for people who are not beekeepers, beekeeping is all about the honey. … But in our minds, the honey matters very little. What we get we consider precious, and use for medicine more than sweetening.    Erik Knutzen, USA

The so-called Flow Hive adds another level of estrangement to the [replacement of] natural beeswax comb. Now it is an entire prosthesis being implanted, replacing original tissue and organ-like structures of this being. And on top, this powerful implant/prosthesis can be operated blindly, without having to enter any relationship with the Bee any more: the notion of a living being is redefined through the interface of the prosthesis. … In the end, we can take this as an encouragement to look deeper and open our life to a perspective which is aware of the entanglement of all there is. We have resources to evolve … and there is no other option but to learn and awaken to a new way of living on this planet – because we want to survive.    Michael Joshin Thiele, Gaiabees

We have meddled with bees far too much and it’s time we stopped. The new “Flow hive” is deplorable.  This takes the art of bee meddling to a new level and it shows a massive sense of disconnection to bees from the part of the inventors! Bees are highly intelligent and have the most incredible sense of the world around them. Let’s learn their language without interfering with them.    Jenny Cullinan, South Africa

But comb is far more than a Tupperware container for somebody else’s lunch; it is the tissue and frame of the hive, and as such it serves multiple functions. Cells have wall thicknesses of just 0.07mm, and are made from over 300 different chemical components. Wax removes toxins from the honey. The resonant frequency (230-270 Hz) of the comb is matched to the bees’ vibration sensors and acts as an information highway between bees on opposite sides of the comb. Bees manage the temperature of the cell rims to optimize transmission of these messages. Wax holds history and memory via chemical signals put into it by the bees. Its smell and condition aid the bee in managing the hive. It assists in ripening and conditioning the honey and is the first line of defence against pathogens. Honey bees are able to recognize the smallest differences in wax composition for good reason. Wax is not polypropylene. … Over the last 100 years bee health has declined with every new beekeeping innovation. The principal reason is that most innovations focus on exploitation for honey harvesting and/or suppressing the preference of the bee. In this respect the Flow Hive is no different. Honey is a blessing and a curse for the bee.    Jonathan Powell, England

Bees want to build their own wax comb. It’s part of the bee superorganism. The wax is literally built from their bodies. The comb is the bee’s home, their communication system (which doesn’t work nearly as well if it’s made from plastic rather than wax, studies have shown), and functions as a central organ. The comb is the bee’s womb – it’s where they raise their brood. And given a choice, bees do not want a pre-built plastic womb, home or larder, any more than we would. It’s the birthright of bees to build comb. But that’s not all. The other concern we have with this device is that it encourages + celebrates beekeeper-centric beekeeping, and implies that bee stewardship is totally easy. It’s all about the punchline. Is it good for the bees? Who cares. We’ve got flowing honey. Actually, this conversation is not just about the Flow Hive. What we’re really talking about here is the wider, industrialised profit-driven approach to beekeeping (as exemplified by the Langstroth hive design), which places production above ethics + long-term bee health.    Kirsten Bradley, Australia

I teach beekeeping without veil or gloves. If you enter a hive without the overwhelming force of a bee suit, you actually have to care about how the bees are feeling today, and you have to be interested in any subtle messages they may give you about your actions. If my goal was making money or saving time, this would be terribly inconvenient. But my goal is to work with the bees, to see that they are alright. If they have extra honey and I remove it without a sting, even though I was completely vulnerable, I feel differently about everything afterwards. The bees and I were somehow working together as equals, both vulnerable, both benefiting from the relationship.     Jack Mills, USA

In these considerations of the Flow Hive’s effect on the bees, we find a true sense of the wholeness of the Bee and the ineffable oneness of Nature.  Is such collective, deeper, understanding the emergent message of the recent media frenzy?  If so, we may all take heart.

swarm - a being of love and abundance

When we step into the world of Apis mellifera, we are entering a multidimensional landscape of being. The life gesture of the honeybee is so unique and different from other life forms that a rational mind alone cannot provide adequate understanding of its nature. Rudolf Steiner described the world of bees as a “world enigma”. This points to the need for an understanding beyond rationality. It is an invitation into another mode of awareness.    Michael Joshin Thiele, Gaiabees


The Whitest Oscars Since 1998: Why the ‘Selma’ Snubs Matter

Headlines from the Daily Beast….
For the first time in almost two decades, no person of color received an acting nomination—the depressing icing on a white-washed, male-dominated Oscars cake. This sucks.

The annual eye-roll criticism of the Oscars is that because of its depressing demographic makeup—94 percent white, 76 percent male, with an average age of 63—the nominations for the most prestigious and important awards in entertainment reflect the movie tastes not of a complicated, modern, and diverse culture, but of a bunch of old white guys.

That might explain how the best reviewed Best Picture nominee of the year, Selma (Rotten Tomatoes score: 99), a beautifully passionate film about a key time in America’s civil rights movement, gets only two nominations while the middlingly reviewed and critically polarizing American Sniper (Rotten Tomatoes score: 73) ekes out six surprise nods, including Best Actor and Best Adapted Screenplay. Of course, then, it should come as no surprise that Selma is a film from an upstart visionary young black woman named Ava DuVernay while American Sniper comes from Hollywood’s favorite old white guy. Congratulations, Clint Eastwood!

After making major strides in diversity last year, with a (sometimes excessive) celebration of 12 Years a Slave, the first Best Director win by a Hispanic man, and a nod for Captain Phillips star Barkhad Abdi, the biggest talking point of this year’s roster is how it is the Whitest Oscars Ever—or at least in nearly two decades.

The depressing and undeserving snub of Selma’s Ava DuVernay in Best Director and David Oyelowo in Best Actor combined with the surprising surge of support for the bland, macho, and, yes, white American Sniper is just the cherry on top of an unfortunately white-washed list, with no people of color receiving an acting nomination.

We’re not going to say the Academy is racist. (But you can.)

There is, naturally, something to be said for differences in taste. It’s certainly possible that a large body of cinema experts simply thought the elements of American Sniper were more powerful than those of a film that The New York Times called “a triumph of efficient, emphatic cinematic storytelling”—a film that is not just stirring and expertly made but which is quite possibly the most relevant and culturally necessary movie of 2015, timed to the 50th anniversary of the Selma marches and the Voting Rights Act and released on the heels of racial turmoil in Ferguson, Cleveland, and New York.

Yes, it’s quite possible that voters weren’t impressed by the way DuVernay managed to capture the nobility and tumult of the Selma marches without making her film heavy-handed or schmaltzy or too grand. And there is a fair case to be made that the Academy thought Bradley Cooper’s performance in American Sniper was more powerful than David Oyelowo’s in Selma, or that Carmen Ejogo wasn’t as impactful in her film as Keira Knightley, Emma Stone, or Laura Dern were in theirs. (Though that case would be wrong.)

But when Selma was as good as Selma was this year, and when there had seemed to be so much progress made in diversity in the Academy, there really is no excuse for the fact that this is the whitest Oscars since 1998, the last time not a single person of color was nominated for an acting Oscar.

Even before Oyelowo and DuVernay were snubbed Thursday morning, critics and pundits were prepared to explain their depressing absences in their respective races. Explanations ranged from Best Actor being too crowded a category to Selma not sending its For Your Consideration screeners out to the voting guilds early enough—an explanation that seems like it could be true based on the film’s poor performance at the guild awards, too.
The most infuriating explanation was that attacks against Selma’s historical accuracy hurt it. In recent weeks, the film has suffered a bit of backlash from a segment of predominantly white liberals loyal to the legacy of President Lyndon Johnson who were none too pleased with the film’s portrayal of his reluctance to help Dr. Martin Luther King and his seeming indifference to the voting rights of black Americans.

DuVernay, sharpening her skills as a former publicist, responded perfectly to the controversy, saying, “For the film to be, I think, reduced—reduced is really what all this is—to one talking point of a small contingent of people who don’t like one thing, I think is unfortunate.”

But the backlash against Selma for its apparent historical inaccuracies, and the idea that they’re the reason DuVernay and Oyelowo were snubbed, is hypocritical bullshit especially when the likes of The Imitation Game and especially American Sniper—two films with accuracy issues of their own—were showered with nominations Thursday morning. As IndieWire editor Sam Adams tweeted, “The Academy: Historical accuracy is important, unless your movie is about a white man killing Arabs.”

But it’s important to not just focus on DuVernay and Oyelowo, or the fact that Selma did not reap more nominations. No, there’s a plethora of depressing facts to talk about. For example, there were no female directors, screenwriters, or cinematographers nominated at all.

Gillian Flynn, who was at one point pegged to win for her Adapted Screenplay of Gone Girl, wasn’t even nominated. In fact, Gone Girl, which many predicted to be a major Oscars player, including in Best Picture, only managed one nomination for star Rosamund Pike. But perhaps that fact shouldn’t be a surprise, either, when you look at the eight nominees in that category: eight very masculine films with male leads and featuring nearly all-male casts. No female-driven films were major Oscar players this year—with the exception of Wild in the acting races—and only one Best Actress contender is in a Best Picture nominee. Yay, old white guys!

“The basic message to the industry from the Academy today was: don’t invest in women in any power roles,” tweeted awards guru Sasha Stone.

So here we have the whitest Oscars in nearly two decades. We have an infuriating male-dominated slate of nominees. And let’s not even talk about the Academy’s treatment of yellow people Thursday morning. (What did those little figurines in The LEGO Movie do to deserve that film’s snub?)

It’s a situation that’s all the more confused considering that the Academy president is a black woman (with major teleprompter issues…Dick Poop!), though that should of course never serve as a mandate or even provide an expectation that the body she governs reward diverse talent. There are people, too, who will say that the Oscars are just a silly Hollywood dog-and-pony show, and we shouldn’t care what a bunch of old white guys think about movies.

But the truth is that the Oscars do and should matter, because film matters. Film should challenge us and be used as a catalyst for, at best, provoking forward thinking or, at the very least, whisking us off to a different world, one that makes us contemplate our own. Selma proves that. Heck, even American Sniper proves that. (None of this is to say that American Sniper is a film without merits, importance, or worth.)

It would be nice if the Academy, the governing body of this medium, would prove that, too.

http://www.thedailybeast.com/articles/2015/01/15/the-whitest-oscars-since-1998-why-the-selma-snubs-matter.html

Posted in News, Political | Comments Off on The Whitest Oscars Since 1998: Why the ‘Selma’ Snubs Matter

Shedding the Superwoman Myth

This is an interesting take on “where feminism went wrong” in the Chronicle of Higher Ed. I agreed with her points and found myself cheering as I read…then realized we were the same age with a similar history and understanding of feminism 🙂 As I read through the comments, I became acutely aware of the tension between second- and “third”-wave feminists. It’s hard for me to conceptualize a “third wave” of feminism without resolution of the second wave’s issues, the issues outlined in this article. It also made me take note that perhaps women’s history and feminist history are not the same thing. Here is a link…judge for yourself and look at the comments!

http://chronicle.com/article/Where-Feminism-Went-Wrong/141293/?cid=wb&utm_source=wb&utm_medium=en

September 2, 2013

By Debora L. Spar

In 2005, I was teaching a first-year class at Harvard Business School. As usual, slightly under a third of my students were women. As always, I was the only female professor.

So one evening, my female students asked me and one of my female colleagues to join them for cocktails. They ordered a lovely spread of hors d’oeuvres and white wine. They presented each of us with an elegant lavender plant. And then, like women meeting for cocktails often do, they—well, we, actually—proceeded to complain. About how tough it was to be so constantly in the minority. About how the guys sucked up all the air around the school. About the folks in career services who told them never to wear anything but a good black pantsuit to an interview.

Over the course of the conversation, though, things began to turn. The women stopped talking about their present lives and started to focus on their futures, futures that had little to do with conferences or pantsuits and everything to do with babies, and families, and men. Most of the women were frankly intending to work “for a year or two” and then move into motherhood. These were some of the smartest and most determined young women in the country. They had Ivy League degrees, for the most part, and were in the midst of paying more than $100,000 for an M.B.A. And yet they were already deeply concerned about how they would juggle their lives, and surprisingly pessimistic about their chances of doing so.

Can women pursue their dreams without losing their sanity?

Like many women of my so-called postfeminist generation, I was raised to believe that women were finally poised to be equal with men. That after centuries of oppression, exploitation, and other bad things, women could now behave more or less the way men do. Women of my generation, growing up in the 1970s and 1980s, no longer felt we had to burn our bras in protest. Instead, with a curt nod to the bra burners who had gone before us, we could saunter directly to Victoria’s Secret, buying the satin push-ups that would take us seamlessly from boardroom to bedroom and beyond.

Today, most major corporations—along with hospitals, law firms, universities, and banks—have entire units devoted to helping women (and minorities) succeed. There are diversity officers and work/family offices and gender-sensitivity training courses in all tiers of American society. The problem with these efforts is that they just don’t work.

Or, more precisely, even the most well-intentioned programs to attract women or mentor women or retain women still don’t deal with the basic issues that most women face. And that’s because the challenges that confront women now are more subtle than those of the past, harder to recognize and thus to remove. They are challenges that stem from breast pumps and Manolo pumps, from men whose eyes linger on a woman’s rear end and men who rush that same rear end too quickly out the door.

Ever since the publication of The Feminine Mystique, American women have been haunted by the problem of more. Spurred by Betty Friedan’s plaintive query, “Is this all?”—inspired by feminism’s struggle for expanded rights and access, seduced by Astronaut Barbie—we have stumbled into an era of towering expectations. Little girls want to be princesses. Big girls want to be superwomen. Old women want and fully expect to look young. We want more sex, more love, more jobs, more-perfect babies. The only thing we want less of, it seems, is wrinkles.

None of this, of course, can be blamed on feminism or feminists. Or, as one former radical gently reminded me recently, “We weren’t fighting so that you could have Botox.” Yet it was feminism that lit the spark of my generation’s dreams—feminism that, ironically and unintentionally, raised the bar for women so high that mere mortals are condemned to fall below it. In its original incarnation, feminism had nothing to do with perfection. In fact, the central aim of many of its most powerful proponents was to liberate women from the unreasonable, impossible standards that had long been thrust upon them.

As feminist ideals trickled and then flowed into mainstream culture, though, they became far more fanciful, more exuberant, more trivial—something easier to sell to the millions of girls and women entranced by feminism’s appeal. It is easy, in retrospect, to say that women growing up in that world should have seen through the fantasy to the underlying struggle, that they—we—should have realized the myths of Charlie (both the angels and the perfume) and fought from the outset for the real rights of women. But most of us didn’t, not because we were foolish, necessarily, but because it’s hard, coming of age, to embrace the struggles of your parents’ generation. And so we embraced the myth instead, planning, like Atalanta, to run as fast as the wind and choose the lives we wanted.

Meanwhile, none of society’s earlier expectations of women disappeared. The result is a force field of highly unrealistic expectations. A woman cannot work a 60-hour week in a high-stress job and be the same kind of parent she would have been without that job and all the stress. And she cannot save the world and look forever like a 17-year-old model.

No man can do that, either; no human can. Yet women are repeatedly berating themselves for failing at this kind of balancing act, and (quietly, invidiously) berating others when something inevitably slips. Think of the schadenfreude that erupts every time a high-profile woman hits a bump in either her career or her family life. Poor Condoleezza Rice, left without a boyfriend. Sloppy Hillary, whose hair is wrong again. Bad Marissa Mayer, who dared announce her impending pregnancy the same week she was named CEO of Yahoo. She could not pull it off (snicker, snicker). She paid for her success. She. Could. Not. Do. It. All.

Because they can’t possibly be all things at once, women are retreating to the only place they can, the only realm they have any chance of actually controlling. Themselves.

Rather than focusing on the external goals that might once have united them, women are micromanaging the corners of their lives and, to a somewhat lesser extent, those of their children. Think about it: How many stories will you find in women’s magazines about the pursuit of anything other than bodily or familial perfection?

To be sure, this turn to the personal is not restricted to women. It follows a trajectory that can be traced back to Woodstock, or, more precisely, to the jagged route that befell the members of the Me Generation. Along the way, the struggle for individual liberties was transformed into the mantle of individualism.

Just as Reagan and Thatcher led the fight to privatize markets, so, too, have women raised since the 1960s led the charge to privatize feminism. It’s not that we’re against feminism’s ideals. Indeed, younger women are (not surprisingly) far more likely to be in the work force than were their mothers. Younger women are wholeheartedly devoted to birth control and to sexual freedom. They account for a majority of this country’s college students and a growing chunk of its professional class. Sixty-six percent of mothers with children younger than 17 work outside the home.

Yet because these women are grappling with so many expectations—because they are struggling more than they care to admit with the sea of choices that now confronts them—most of them are devoting whatever energies they have to controlling whatever is closest to them. Their kids’ homework, for example. Their firm’s diversity program. Their weight.

My generation made a mistake. We took the struggles and the victories of feminism and interpreted them somehow as a pathway to personal perfection. We privatized feminism and focused only on our dreams and our own inevitable frustrations. Feminism was supposed to be about granting women power and equality, and then about harnessing that power for positive change. Younger generations of women have largely turned away from those external, social goals.

So what, then, do we do?

Two generations after Roe v. Wade, two generations after Title IX and sexual liberation, we are still circling around the same maddening questions. Can women really have it all? Is there another way, a real way, for women to balance their personal and professional lives? Can the lofty aspirations of the early feminists—for equality, opportunity, choice—be meshed with their daughters’ stubborn yearning for more-traditional pleasures, like white weddings and monogamy? And can women pursue their dreams—all their dreams—without losing their sanity?

Yes, I would argue, they can. But not along completely gender-blind lines. We need a revised and somewhat reluctant feminism, one that desperately wishes we no longer needed a women’s movement but acknowledges that we still do. A feminism based at least in part on difference.

Women need to realize that having it all means giving something up—choosing which piece of the perfect picture to relinquish, or rework, or delay.

Women, in other words, are not perfect. And they are not identical to men. They are physical and social beings, marked by flaws, programmed to reproduce, destined to age, and generally inclined to love. Any approach to women’s issues must start from the reality of women’s lives rather than from an idealized or ideological view of who they should be and what they should want. This does not in any way mean that women should lower their sights or accept anything less than total equality with men. But it does suggest that women’s paths to success may be different and more complicated than men’s, and that it is better to recognize these complications than to wish them away.

To begin with, we need to recognize that biology matters. Women are not in any way physically inferior to men, but they are distinctly and physically different. They have wombs and breasts and ovaries, physiological attributes that—for better or for worse—tend to affect the course of their lives. Feminism, for many good reasons, has tended to downplay these physical differences.

But a new look at feminism would suggest integrating biology more explicitly, and acknowledging the not-so-subtle ways in which women’s physiology can shape their destinies. Two areas are paramount: sexuality and reproduction. Although, of course, both women and men are involved in both sex and reproduction, the effects fall differently on women.

Let’s start with sex. Most women—not all, but most—approach sexual relations differently than men do. They are more interested in romantic entanglements than casual affairs, and more inclined to seek solace in relationships. Biologically, these preferences make sense, since it is women who benefit reproductively from relationships that extend beyond the moment of conception. Sociologically, though, they set the stage for an awful lot of workplace complications. If men and women are working together, some subset of them are liable to get involved in sexual encounters. For men—in general—the focus of those encounters is likely to be purely sexual. For women—in general, again—there is more of an emphasis on, or at least desire for, a relationship. Right from the outset, then, this imbalance puts women at a disadvantage.

To deal with those admittedly awkward possibilities, most organizations have enforced strict relationship policies over the past few decades. At some level, these restrictions make sense. But they don’t help women. In fact, by rigidly drawing attention to the perils of sexual attraction, they can drive men away from the kind of relationships that would help women advance—the kind of relationships that senior men regularly have with junior men. I will always recall a conversation with a senior executive who openly joked that he would never take a woman on a consulting trip. “My wife would kill me!” he said. “And so,” I muttered under my breath, “your wife is happy, but you’ll never promote a deserving woman.”

The way out of this mess is complicated but relatively clear. Organizations must be vigilant in promoting policies against sexual harassment. At the same time, though, they should be less puritanical about the possibilities of sex and sexual attraction. Because if all attraction constitutes harassment, and all relationships are marked by fear, then women will constantly be at a disadvantage.

The other physical difference that separates men from women is the act of reproduction. Having babies shapes women’s lives in ways that have barely been touched by the otherwise significant social changes of the past 50 years. Before women have children, they can compete fairly evenly across most segments of life. They can play sports and be educated and gain access to nearly every job or profession. These are the victories that feminism has wrought.

After women have children, however, the lines of their lives begin to depart from men’s. Even if they are lucky enough to have decent maternity leaves and good child care, women quickly find themselves pumping breast milk at the office and lumbering under the effects of too many sleep-deprived nights. They dodge meetings to make doctor’s appointments and suffer an onslaught of guilt every time they leave a crying child to attend a conference. These aspects of mothering defy government regulation and corporate policy; these are the pulls that feminism forgot. And they are not going away.

To deal with such tensions successfully, therefore, women (and men) need to be far more explicit about recognizing the specific dilemmas of motherhood. Yes, companies can and should strive to create generous maternity leaves and family-friendly workplaces. Yes, governments should aim to provide more accessible and affordable child care.

But at the end of the day, women who juggle children and jobs will still face a serious set of tensions that simply don’t confront either men (except in very rare cases) or women who remain childless. Women cannot avoid those tensions entirely, but they can make choices. They can choose, for instance, between high-paying jobs in far-off cities and lower-paying ones that might leave them closer to family and friends willing to help with the predictable crises of child rearing. They can choose careers with more or less flexibility, and husbands with more or less interest in shouldering child-care responsibilities. The point is that women need to make these choices and realize their impact rather than simply hope for the best.

Which brings me to the second category of things we can do to deal with the proverbial “women’s problem.” We can begin to redefine the meaning of choice.

For decades now, ever since the passage of Roe v. Wade, the word “choice” has been linked inextricably to the goal of giving women control over their bodies and reproductive rights. Those are vitally important concerns. But choice itself is a much bigger concept and needs to be understood by women in all its complexity. Today, women in the United States enjoy options that would have confounded their ancestors. They can get married, or not; have children, or not; pursue a profession, or not. They can choose the shape of their noses, the level of their education, the religion of their partner—even, if they want, the musical talents of their child’s egg donor. The problem, though, is that this multitude of choices can feel overwhelming.

The problem is not hard to fix. In theory, at least, it demands little more than a change in attitude, a societal ratcheting down of the great expectations that now engulf women. Women need to realize that having it all means giving something up—choosing which piece of the perfect picture to relinquish, or rework, or delay.

If women are ever to solve the “women’s problem,” they also need to acknowledge that they can’t, and shouldn’t, do it alone. Men must help.

Both genders need to be more forthright in discussing the obstacles that women face. All too often, women are scared of raising the topic of gender with men, thinking it will brand them as radicals or troublemakers, while men are terrified of saying or doing anything that might classify them as politically incorrect. The result is that no one says anything productive at all.

Finally, it is crucial to remember whence we came. Feminism was never supposed to be a 12-step program toward personal perfection. It’s time now to go back, to channel the passion of our political foremothers and put it again to good use. We need to focus less of our energies on our own kids’ SAT scores and more on fighting for better public schools; less time on competitive cupcake-baking and more on supporting those few brave women willing to run for office. We need fewer individual good works and more collective efforts.

Feminism already taught us how to organize, how to agitate, how to petition for things like equal pay and better incentives for child care. We—the women born after feminism’s rise, the women who may have discarded or disdained it—ought to get back on that wheel and figure out how to make it work. Moreover, and with the benefits of 50 years behind us, we can also move to what might be considered a softer and gentler form of feminism, one less invested in proving women’s equality (since that battle has more or less been won) and less upset with men.

Which brings me to my last point. The feminism that I recall was supposed to be joyous. It was about expanding women’s choices, not constraining them. About making women’s lives richer and more fulfilling. About freeing their sexuality and the range of their loves.

There was pain and sweat along the way, but the end point was idyllic, liberating women—liberating them—from the pains of the past and the present. Somewhere, though, the joy fell out of that equation, along with the satisfaction that true choice should bring. If women want to work in high-powered jobs, they should. If they want to work part time, or from home, or not at all if they can afford it, that’s perfectly all right, too. If they don’t want to be neurosurgeons or look like Barbie or hook up every weekend because it doesn’t give them pleasure, they should consciously and explicitly hold back, choosing not to indulge other people’s preferences. If they like to bake elaborate organic cupcakes, they should. And if they don’t, they should send Ring Dings to the bake sale and try not to feel guilty.

We need to struggle. We need to organize. And we need to dance with joy.

Debora L. Spar is president of Barnard College. This essay is adapted from her new book, Wonder Women: Sex, Power, and the Quest for Perfection (Sarah Crichton Books/Farrar, Straus and Giroux).

Posted in Uncategorized | 9 Comments