Showing posts with label Current Events.

Monday, July 1, 2013

B(l)ack in the Kitchen: Food Network

by Lisa Guerrero
 

The conversation surrounding Paula Deen and her use of the “N word” has simultaneously erased the accusations of job discrimination and harassment while ignoring the larger issues of race at Food Network. In fact, Deen’s ultimate firing by the Food Network has allowed the network to position itself as anti-racist, as America’s moral conscience. Refusing to allow prejudice to stain its airwaves, the Food Network has situated itself as a progressive force of accountability and justice.

Deen, however, is reflective of its brand—one that normalizes and operationalizes whiteness while reimagining the world of food as racially transcendent. Revelations regarding Deen burst that illusion. With this in mind, we are sharing an excerpt from Lisa Guerrero’s brilliant chapter in our recent book, African Americans on Television: Race-ing for Ratings.



. . .

In all of its programming, even within programs where race is undeniably apparent, either because of the celebrity or the cuisine, food is presented as a race-neutral cultural object. Unfortunately, in a race-based society, as the United States is, “race-neutral” invariably gets translated as “white.” Food Network trades in the notion of the “racelessness” of food to create a commodified sense of neoliberal inclusion and equality, wherein the focus is placed on individuals and not on systems. Food is portrayed across the network as a “universal language”; but as discussed above, it is definitely constructed as a specifically class-based language, as well as a language constructed in specifically racialized terms. To be fair, Food Network is no different from most other cable television networks, where whiteness is predominant and becomes easily normalized and rendered invisible to most viewers.
Ironically, the relatability that Food Network carefully crafts around its personalities is almost completely belied by the “everyday” lifestyles many of the network celebrities are shown to have as they are strategically integrated into their respective shows, most notably with Ina Garten, Giada De Laurentiis, and Bobby Flay. While the wealth and whiteness displayed in these, and in much of Food Network’s other programming, are conspicuous, they are treated as commonplace, the effect of which is twofold: 1) it creates a socioracial standard when it comes to the act of food consumption; and 2) it suggestively endorses the idea of food as a racial and economic privilege.
Through its successful erasure of race and class, Food Network perpetuates certain understandings about the social landscape in which people think about food consumption and commodification as being generally equal amongst various populations, even as statistically and programmatically most people can see that food equality isn’t a reality. But Food Network is able to maintain this profitable food fantasy by constructing its food narratives in a very particular sociohistorical vacuum that allows audiences to distance themselves not only from certain tediums surrounding daily food habits, but also from the sociohistorical and socioeconomic systems of food production and preparation in the United States. The strategic use of blackness on the network is one of the primary ways in which this distancing is enabled.
The relative absence of blackness on Food Network, while not unlike the relative absence of blackness on network television generally speaking, succeeds in denying the significant place African Americans have, both historically and contemporaneously, in the creation of American food culture and foodways.  This erasure, while creating an amputated impression of American food backgrounds, does so in deliberate ways that are in keeping with long histories of using whiteness to signify notions of expertise, virtuosity, superiority, propriety, and polish.  In other words, in order to cement the network’s guiding narrative of elevating food to a craft, an art, an aspiration, it needs to simultaneously elevate whiteness, usually white maleness. 
 Not surprisingly, the programming on Food Network frames American food in very Eurocentric terms, tracing food origins and traditions to primarily Western, European nations, while periodically recognizing the “exotic” fare of Latin America or Asia.  There is little to no recognition of African cuisines within programming, despite the growing popularity of African food and restaurants among American consumers sparked by growing numbers of African immigrants to the United States, and probably represented most notably by the often tokenized celebrity chef, Marcus Samuelsson, who was born in Ethiopia and raised in Sweden.  Neither is there much linkage drawn between the specificity of African American soul food and the development of much of what is considered American “southern food.”  The erasure of these African and African-American cultural linkages to American food habits and histories effectively reimagines a significant portion of American food architecture as almost exclusively white, a reimagining not supported by history. 
Now certainly Food Network isn’t The History Channel, and viewers aren’t necessarily expecting to be provided with critically accurate or developed histories of food origins, routes, or social significances.  Nonetheless, its lack of wider, more representative narrative frames within its programming results in two things:  First, there is a barely perceptible, encompassing whitening of both the network itself, as well as the perspectives it creates about food relationships within American populations.  Secondly, when racial “diversity” and representation do occur, they have the effect of “tokenism” rather than inclusion.  Nowhere is this latter effect more apparent than in the network’s small club of Black cooking personalities.
The framing of Food Network and The Cooking Channel breaks down into simplistic terms as “The U.S.” and “The Global,” respectively. As such, The Cooking Channel does appear to embrace diversity in a larger, more transparent way than Food Network. However, the apparent differentials of framing are really only cosmetic. There are more people of color who appear regularly on The Cooking Channel, but only slightly more, and considering the overbearing whiteness of Food Network, it really wouldn’t take much to have “more” racial diversity. But this diversity is neutralized by emphasizing the notion of “the exotic.” The people of color on The Cooking Channel are, by and large, not of the United States, creating a comforting distance between U.S. audiences and any troublesome considerations about racism.
In scholarly terms, it wouldn’t be far off the mark to think about Food Network as “the colonial” and The Cooking Channel as “the postcolonial.” In other words, Food Network denies race and its systems by trying to devalue and/or erase race altogether, while The Cooking Channel denies race and its systems by putting race on display in almost exhibitional terms so that audiences don’t relate to it as a “real” thing. In both cases, whiteness is positioned as the fulcrum of food experiences and knowledges. And ultimately, blackness, especially American blackness, is relegated to becoming the specialty ingredient that gets used sparingly in the recipe of televisual food programming for fear that its flavor won’t be palatable to American consumers.

 
Postscript: 

As we’ve seen over the last few days, not only with the vociferous response by Deen supporters but also with SCOTUS gutting the Voting Rights Act, Texas scrambling to capitalize on that decision by pushing through a Voter ID bill, the dehumanizing tactics of the defense counsel in the George Zimmerman trial, and the countless racist microaggressions whose accounts bombard us daily, Paula Deen’s words and behaviors are, in themselves, unsurprising and relatively unremarkable; rather, they are indicative of the banality of American racism. As several scholars (including David J. Leonard) have articulately pointed out in response to the Deen controversy, and as I have tried to address in this piece in broader ways, while Deen should certainly be held responsible for the ways in which her actions contribute to the continuation of systemic and ideological racisms in the United States, the problem is much bigger than her use of racial epithets and her disturbing bucolic nostalgia for the racial order of the antebellum South.

Perhaps the biggest problem, of which Deen is but one very small symptom, is one that will, in all likelihood, strangle equality and freedom for all American citizens: the United States’ misguided belief in its own racial magnanimity, the delusion that we have remedied our racial illnesses and no longer need to be vigilant about the sickness and, in fact, can be prideful about the “past tense” of our racial struggles. This blind hubris (which Justice Ginsburg so aptly identified in her dissent from the Voting Rights Act decision) allows people like Paula Deen to sincerely dislocate their actions from the insidiousness of racism…since racism has been fixed (so it goes), then certainly what people do and to whom they do it can’t be considered racism.

Unfortunately, this racist psychosis, the inability to see racism even as you are enacting it, supporting it, contributing to it, benefitting from it, is one of many deleterious side-effects of our post-racial nation, and is sure to kill us quicker than a Paula Deen recipe.   

. . . 


Lisa A. Guerrero is Associate Professor of Comparative Ethnic Studies at Washington State University Pullman. She is the editor of Teaching Race in the 21st Century: College Professors Talk About Their Fears, Risks, and Rewards (Palgrave Macmillan, 2009) and co-editor, with David J. Leonard, of African Americans on Television: Race-ing for Ratings (Praeger).

Wednesday, June 26, 2013

Affirmative Action Survives to See Another Day—For Now…

The following is a piece from James A. Beckman, author of the forthcoming 2014 title Affirmative Action: Contemporary Perspectives and Associate Professor of Legal Studies at the University of Central Florida.

The dust has settled from yet another constitutional battle in the war over affirmative action in America. On Monday, June 24, 2013, the United States Supreme Court rendered the latest in a long line of decisions spanning more than three decades, again placing restrictions on (but not outright eliminating) the practice of affirmative action in Fisher v. University of Texas at Austin. Proponents of affirmative action can take solace in the fact that the concept of affirmative action still survives—at least until the next major challenge. In ruling in Fisher, the Court declined to overturn any of its landmark affirmative action cases—like Grutter v. Bollinger in 2003 and Regents of the University of California v. Bakke in 1978—and continued to allow universities to use race in admissions decisions so long as no other “workable race-neutral alternatives would produce the educational benefits of diversity.”

The Supreme Court in Fisher, by a 7-1 ruling, avoided the most extreme path of entirely dismantling affirmative action and instead opted for a “middle of the road” approach: it reversed the federal Fifth Circuit Court of Appeals (which had upheld the University of Texas affirmative action admissions plan as constitutional) for failing to apply the rigorous level of judicial review that the Supreme Court has previously mandated for courts reviewing race classification cases (as the Court said in Bakke in 1978 and Grutter in 2003), and remanded the case to the lower courts for further review.
 
Thus, while the Court reversed the lower federal court’s decision as not meeting its exacting standards under “strict scrutiny,” the majority again declined to strike down the general practice of affirmative action as per se unconstitutional and refused to characterize the practice as no longer being needed in society. Indeed, going into the Fisher case, proponents of affirmative action were acutely aware that a majority on the Court could have dismantled affirmative action outright, pronounced a complete prohibition on the use of race or ethnicity in admissions decisions (or related governmental actions), and declared America’s experiment with remedial race-conscious preferences to be at an end and no longer necessary in modern society.

There was nothing overly revolutionary or radical in the ruling, and the Court seems to reaffirm that diversity is a compelling governmental interest and that the Bakke and Grutter decisions are still good law (despite the concurring opinions of Justices Scalia and Thomas to the contrary). This alone should give some comfort to supporters of affirmative action—at least in the short term. Given that the Court has basically used the Fisher ruling to reaffirm its rules set out in Grutter, and specifically that “strict scrutiny” needs to be truly meaningful scrutiny and not (as the Court says) “strict in theory and feeble in fact,” the standard of review in future cases will certainly need to be more exacting, and states will need to show that “no workable race-neutral alternatives would produce the educational benefits of diversity.” While this is a more exacting standard of review moving forward, the Court clearly did not decide that UT’s program using race was unconstitutional. The decision also references and upholds the standards set forth in Bakke and Grutter—so Bakke and Grutter are still good law, and diversity in higher education can still be considered a permissible compelling governmental interest. The Court signaled that race-based affirmative action plans can still be considered constitutional if implemented properly (and if no workable race-neutral alternatives are available).

Thus, the ruling in Fisher was a narrow one, saving the broader battle over affirmative action (and a possible final end point) for another day. However, while holding that affirmative action survives, the Supreme Court made clear that reviewing courts have the obligation to make their own independent judgments about whether a university’s critical mass determination is a valid one. That is, strict scrutiny requires real and meaningful searching inquiries on the part of the court, not deference to the institution at issue. Further, as diversity increases on campus, it should become harder for institutions to consider race and to use affirmative action at all.

Thus, through the settling haze, the practice of affirmative action still stands, alive but battered. The practice has withstood the Court’s restrictions and caveats in such cases as Regents of the University of California v. Bakke in 1978, Adarand v. Pena in 1995, Gratz v. Bollinger in 2003, Grutter v. Bollinger in 2003, and now Fisher v. University of Texas at Austin in 2013. It is battered, bruised, and wobbling—like a punch-happy pugilist who is recoiling from one too many uppercuts to the jaw; but yet, still it stands. Weaker, more tempered, but still in the fight. While judicial concepts like “strict scrutiny” have been further defined and the level of review has been increased, proponents of affirmative action can take solace in the fact that the concept of affirmative action still survives—at least until the next major challenge.

One final note: the next major challenge may not be too far off in the distance. The Supreme Court has already granted review in the next affirmative action case, Schuette v. Coalition to Defend Affirmative Action by Any Means Necessary, which will be argued at the Supreme Court in the fall 2013 term. The case deals with the propriety and fate of state-law bans on the practice of affirmative action; specifically, it concerns the constitutionality of Michigan Proposal 2, which amended the Michigan state constitution to prohibit (as a matter of state law) public institutions within the state from utilizing racial preferences in admissions, employment, and contracting. In the petition to the Supreme Court requesting review, Michigan Attorney General Bill Schuette stressed that he was not asking the Court to constitutionally dismantle affirmative action itself (as was a possibility leading up to the Fisher ruling), but rather to decide whether state governments can do so on their own. Thus, according to Attorney General Schuette, “this case presents the different issue whether a state has the right to accept this Court’s invitation in Grutter to bring an end to all race-based preferences.” This “invitation” is clearly a reference to Justice O’Connor’s language in Grutter that affirmative action should not be a permanent program and should have a logical end point, and that that end point should come within a quarter century of the Grutter decision (i.e., by 2028). The stage is already set for this next battle over affirmative action. Stay tuned in the fall.

. . .

James A. Beckman (J.D., Ohio State; LL.M., Georgetown University) is Associate Professor of Legal Studies at the University of Central Florida, where he also serves as the inaugural chair of the Department of Legal Studies. He is the author of Comparative Legal Approaches to Homeland Security and Anti-terrorism (2007) and Affirmative Action Now: A Guide for Students, Families, and Counselors (2006); he is also the General Editor of Affirmative Action: An Encyclopedia (2004). Before his entrance into academia in 2000, he served as an attorney-advisor for the Bureau of Alcohol, Tobacco & Firearms (ATF) at its headquarters in Washington, DC. Among other awards, he was the recipient of the United States Department of Defense Meritorious Service Medal for his legal work as an active duty judge advocate from 1994–1998, and the Department of Justice Meritorious Service Award (1999) for legal work on behalf of the Department of Justice and ATF.

   



Monday, March 19, 2012

International Criminal Court's Historic First Conviction Puts Spotlight on Child Soldiers

On March 14, the ICC found Congolese warlord Thomas Lubanga guilty of the war crime of using children under the age of 15 as active participants in hostilities in the Democratic Republic of the Congo between September 2002 and August 2003. This is the ICC's first conviction in its 10-year history. In his forthcoming book, Child Soldiers: A Reference Handbook, Dr. David M. Rosen tackles the complex legal and social questions surrounding this controversial global issue.


For more than 40 years, humanitarian and human rights groups have sought to ban the recruitment and use of child soldiers. Their efforts have produced mixed results. On one hand, they have had a profound effect on the development of international law prohibiting the recruitment of children, but on the other hand, such laws seem to have had only limited efficacy in reducing the actual number of child soldiers participating in conflicts throughout the world. This huge gap between the aspirations of law and the practical reality of child recruitment is one of the greatest problems in ending the recruitment of child soldiers.


International efforts to end the use of child soldiers first bore fruit in 1977. That year, for the first time in history, there were changes made to the so-called “laws of war” that placed restrictions on recruiting children into armed forces and groups. Since that first victory, the issue of child soldiers has developed and expanded in scope. What began in 1977 as a relatively narrow concern with protecting children under 15 years old from serving as armed combatants has evolved into an international effort to sever a broad range of connections between the military and any person under the age of eighteen. The entire concept of the “child soldier” has evolved to encompass a greater number of children engaged in a wider variety of activities than first imagined. This raises some powerful questions. Are there actually more child soldiers in the world today than in the past? Certainly, child soldiers have been integrally involved in the military for a very long time. Andrew Jackson, 7th President of the United States, joined the armed forces of the American Revolution at age 13, and he was far from alone in doing so. How have changing definitions of child soldiers affected our perception of the actual number of child soldiers in the world? Are all children who are involved in the military coerced or abused? Is it always the case that children would be better off away from military involvement? And finally, when children are involved in military activity, should they be held responsible for their actions in the same way as adult soldiers?

- David M. Rosen is a professor of anthropology and law at Fairleigh Dickinson University.





Explore these new ABC-CLIO resources to learn more about the complex issues surrounding the use of child soldiers around the world:


* Child Soldiers
* Slavery in the Modern World
* Issues: Understanding Controversy and Society

Tuesday, January 31, 2012

What’s causing all this trouble with the Greek economy?

In the past couple of years, stock markets around the world have plummeted with each bit of news regarding Greece’s economic woes. The economic crisis that Greece faced in late 2009 started about 10 years earlier. The records that Greece presented prior to admission to the Eurozone in 2001, as well as afterward, did not show the true picture of its economic condition. The Eurozone requires that countries fulfill certain membership criteria, including a budget deficit that does not exceed 3% of gross domestic product (GDP) and a public debt limited to 60% of GDP. Greece met these criteria but did so by leaving out certain expenses. Also, a derivatives deal that Goldman Sachs first put in place in 2001 masked the true nature of Greece’s deficit.


When Greek Prime Minister Georgios Papandreou was elected in the fall of 2009, he discovered the extent of the deficit and announced it. Several debt-ratings agencies then downgraded Greece’s rating to the lowest level in the Eurozone.


In May 2010, Papandreou dropped another bomb. He reported that the Greek deficit was 13.6% of GDP for 2009, higher than the 12.9% originally reported. The members of the Eurozone became very concerned that Greece’s economic situation would implode, and they would suffer the fallout. Thus, in May 2010, the European Union (EU) countries and the International Monetary Fund (IMF) decided to get things under control by giving Greece a loan of 110 billion euros ($139.1 billion)*, but under the condition that Greece would institute reforms. A few months later another loan package of 130 billion euros ($165.1 billion) was approved, but as of January 2012, it had not been finalized.


People became angry about the reforms instituted by the Papandreou government, which included cutting the wages and benefits of government workers and retirees and raising taxes. Demonstrations and strikes exploded around the country.


Financial experts point out that the worldwide recession, as well as graft and nonpayment of taxes, has contributed to Greece’s deficit. Papandreou ran on the slogan “Change,” which included changing a corrupt political system. Gerry Hadden, of Public Radio International’s “The World,” on a program broadcast on May 11, 2010, asked vegetable vendor Fotini Stavrou whom she blamed for the crisis. Stavrou grabbed a potato and responded, “See this potato. If I stole it I would end up in jail. Yet our politicians steal millions and nothing happens to them.” She continued, “The two main political parties here robbed us blind, but it’s our fault. We voted for them.”


Thomopoulos’ recent title, The History of Greece, was published in December 2011.


Some economists blame part of Greece’s bleak economic picture on the use of bribes and patronage. Fakelakia (little envelopes) are part of doing business in Greece. Money is stuffed in a fakelaki and slipped to officials to help gain access to medical services, to avoid taxes, and for building permits or driver’s licenses.


Widespread tax evasion has also contributed to the deficit. In September 2011, the government used a new tactic in its approach to this problem. The finance minister named 6,000 firms that owed about 30 billion euros ($38.1 billion).


In the face of continuing economic problems, Prime Minister Papandreou stepped down in November 2011. A coalition government with Lucas Papademos as prime minister took over. The Papademos government faces rising unemployment (in September the unemployment rate was 17.5%) and a declining economy (probably more than a 5.5% reduction in the GDP for 2011).


Representatives of the EU, the IMF, and the European Central Bank (referred to collectively as the troika) are set to arrive in Greece in mid-January. The troika will determine whether or not Greece will receive the second installment of the loan. On March 20, 2012, 14.4 billion euros ($18.29 billion) are due on bonds. Without the second installment of the loan, Greece will not be able to pay what it owes.


Nicholas Paphitis of Associated Press in a January 4 article entitled “Greek PM Warns of Default without Loan Deal” reported: “Papademos said the troika has called for a re-examination of labor costs, to boost lagging competitiveness and fight high unemployment, and warned that, unless significant action is taken, the country will not receive its next vital installment.”


Paphitis continues: “Key details of the second bailout deal are still being negotiated — above all the provision under which private creditors such as banks and investment firms would take a 50 percent cut in the face value of the Greek bonds they hold.”


Greece is in a quagmire. If the second loan is not forthcoming, default may be the only option. But even if Greece declares bankruptcy, the country has the potential to emerge from the current crisis a stronger and healthier nation. Greece weathered the declaration of bankruptcy in 1932, as well as the devastation of the country during World War II and the subsequent civil war. Her people have stamina and grit. I don’t believe that they will allow their country to falter.


*The dollar figures above are based on the euro-to-dollar exchange rate for January 11, 2012: 1.27.
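For readers who want to check the conversions, each dollar figure is simply the euro amount multiplied by that 1.27 rate. The sketch below is purely illustrative; the variable names are ours, and the amounts are the ones cited in the article.

```python
# Illustrative sketch: convert euro amounts cited above to U.S. dollars
# using the January 11, 2012 rate from the footnote (1 euro = 1.27 dollars).
EUR_TO_USD = 1.27  # dollars per euro

amounts_billion_eur = {
    "second loan package (approved, not yet finalized)": 130.0,
    "bond payment due March 20, 2012": 14.4,
    "taxes owed by the 6,000 named firms": 30.0,
}

for label, eur in amounts_billion_eur.items():
    usd = eur * EUR_TO_USD
    print(f"{label}: {eur:g} billion euros ≈ ${usd:.2f} billion")
```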


About the Author
ELAINE THOMOPOULOS, PhD, is an independent scholar who has authored local history books and is editor of Greek-American Pioneer Women of Illinois. She has published articles about Greece and Greek Americans and is curator of the Greek Museum of Berrien County, Michigan. 

Wednesday, January 18, 2012

Censorship: The Web Goes Dark

With minimal effort even casual Web users can locate information that many governments, including at times the U.S. government, would prefer to restrict: hardcore pornography, plans for making explosives or illegal drugs, home addresses of government officials and celebrities, unauthorized copies of copyrighted songs and movies, encouragement of racist violence and terrorism, and almost any imaginable type of objectionable content. Children researching homework assignments online may stumble upon Holocaust-denial and pro-genocide Web sites, unaware that what they are reading is not fact but paranoid delusion; criminals can find co-conspirators and all the information needed to plan their crimes.


At times the presence of this harmful content leads to demands for censorship, although harmful content is only a tiny portion of total online content—and many would disagree on whether particular content is harmful. Censorship carries a negative connotation; in the United States, the power of the federal government to censor is severely restricted by the First Amendment to the U.S. Constitution, while the Fourteenth Amendment extends these restrictions to the states as well.


Even in pre-Internet times censorship was difficult. First, it was unpopular: the First Amendment is close to the core of the values lumped together under the heading "civil liberties"; censorship, in other words, is perceived as un-American. Second, the magnitude of the task was daunting even when information was published on paper. The Internet has done nothing to diminish the first problem, while expanding the second—the volume of information—by several orders of magnitude. In addition, the Internet provides new censorship challenges: encryption technology makes it easy to conceal content from government snoops; the lack of face-to-face contact provides children with access to the same information as adults; and the borderless nature of the Internet makes it easy for providers of content censored in one country to move that content to a server in another, more permissive country—while remaining accessible to web users in the first, censoring country.


This international character also makes it difficult for the government to control another type of information: content that infringes on the intellectual property of others. While the Digital Millennium Copyright Act provides measures to remove infringing content or shut down sites that promote illegal downloads, it applies only in the United States. U.S. authorities and copyright holders can do little about Web sites that are based in foreign countries. In 2011, two bills were introduced in Congress to address this problem: the Stop Online Piracy Act (SOPA) in the House and the Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act (PROTECT IP Act, or PIPA) in the Senate, both of which attempt to stop copyright infringement via foreign Web sites by cutting off funds and access from the United States. While applauded by many rightsholders, such as film and music industry associations, the bills have been denounced by civil liberties groups and Internet technology companies as an avenue of government censorship that threatens the Internet itself. The controversy grabbed national headlines in January 2012 as many popular Web sites staged a protest on January 18, with such sites as Wikipedia, Reddit, and BoingBoing going completely dark while others, including Craigslist and Google, featured prominent messages against the bills urging the public to contact Congress.


---------------------------------------------------------------------------------------------------------
(Partial excerpt) Cornwell, Nancy C. “Censorship: Overview.” Issues: Understanding Controversy and Society. ABC-CLIO, 2012. Web. 18 Jan. 2012.

Monday, January 9, 2012

Succession in North Korea: A Family Affair

When Kim Jong Il died of a heart ailment on December 17, 2011, the international community focused its attention on his son and successor Kim Jong Un. Little is known about Kim Jong Il's third son, who is reported to be 27 years old and was recently made a full general. Kim Jong Un was not publicly acknowledged to be the successor to his father until 2010, in sharp contrast to how his grandfather, Kim Il Sung, carefully managed his own succession over a long period of time. This excerpt from the second edition of Dr. Spencer C. Tucker's The Encyclopedia of the Korean War: A Political, Social, and Military History recounts the details of North Korea's previous leadership change.

**************



Kim Il Sung had early on chosen his son to be his political heir. He wanted to avoid the years of confusion and the ultimate repudiation that followed the deaths of his Soviet and Chinese contemporaries. Throughout the 1970s the ground was carefully laid for Kim's succession of his father. The North Korean top leadership, including Pak Song Chol, O Chin U, Kim Yong Nam, Yi Chong Ok, Chon Mun Sop, and So Chol, supported the succession. In preparation for this eventuality, North Korean authorities went to extraordinary lengths to glorify Kim and his accomplishments.

North Koreans began placing Kim's portraits along with those of his father in their homes, offices, and workplaces. In September 1973, Kim became a secretary of the Central Committee of the North Korean Workers' Party (NKWP) and, the following year, a member of its Politburo. By then, songs were being sung about him among party cadres, who carried special notebooks to record his instructions. And a slogan came into being: "Let's give our fealty from generation to generation."

 Despite his prominence in the NKWP, Kim's rise to power and selection as his father's successor were unacknowledged for several years, and his activities were masked under the mysterious "Tang Chungang" (Party Center), who was given credit for wise guidance and great deeds. This veil was lifted at the Sixth Congress of the NKWP in October 1980, when Kim was publicly named to the Presidium of the Politburo, the Secretariat of the Central Committee, and the Military Commission. In other words, he was openly designated as successor to his father.


Kim received the title of Dear Leader, close to that of Kim Il Sung's Great Leader. Both Kims were addressed and referred to in specific honorific terms not used for anyone else. As was done for his father's birthday of April 15, Kim's birthday of February 16 came to be celebrated as a national holiday. In December 1991 Kim was named supreme commander of the North Korean armed forces. By the time of his father's death in July 1994, Kim had come to be ranked second in the leadership, behind his father and ahead of his father's old comrade-in-arms, O Chin U, who died of cancer in early 1995.




About Dr. Spencer C. Tucker:
A Senior Fellow in Military History for ABC-CLIO since 2003, Dr. Spencer C. Tucker has been instrumental in establishing ABC-CLIO as the premier military history reference publisher in the country. Tucker's interest in military history began while he was a student at the Virginia Military Institute (VMI) and was enhanced by a Fulbright Fellowship in France and while serving as a captain in military intelligence in the Pentagon during the Vietnam War. Although he concentrated on Modern European History in his graduate studies, he became interested in all periods of military history. Spence taught at the university and college level for 36 years, 30 of these at Texas Christian University and the last six as holder of the John Biggs Chair of Military History at VMI. Spence is particularly excited to be the editor of ABC-CLIO's award-winning series of war encyclopedias, which includes the 2nd edition of The Encyclopedia of the Korean War: A Political, Social, and Military History.

Friday, October 21, 2011

Occupy Everything: The Streets Are Alive with the Sound of Anarchism

The headlines from recent weeks can be paraphrased like something out of an anarchist’s dream: “Leaderless Movement Confronts Powers-That-Be.” The Occupy movement has spread across the U.S. and around the world in rapid fashion, and while it would be an overstatement to proclaim that it is an exercise in anarchism through-and-through, there is no doubt that the basic framework of decentralized solidarity strongly recalls previous episodes of anarchy breaking out.

Indeed, the spontaneous, emergent, and self-determined nature of the Occupy demonstrations falls within the ambit of anarchism—from the “people’s assemblies” model of governance to the general refusal to engage in the “politics of demand” that often characterizes social movements. The anarchist anthropologist David Graeber, one of the initial organizers of the catalyzing Occupy Wall Street effort, expressly analyzed the phenomenon in a Washington Post interview in terms that anarchists will find quite familiar:

“It’s pre-figurative, so to speak. You’re creating a vision of the sort of society you want to have in miniature. And it’s a way of juxtaposing yourself against these powerful, undemocratic forces you’re protesting. If you make demands, you’re saying, in a way, that you’re asking the people in power and the existing institutions to do something different. And one reason people have been hesitant to do that is they see these institutions as the problem.”
While this bears a resemblance to precursor mobilizations in which an anarchist ethos has helped set the tone for widespread actions and organizing tactics—as prominently seen in the anti-globalization movement, for instance—this is also something completely new and different. We would be hard-pressed to identify another example of a movement that has caught hold so broadly and quickly, without a central charismatic figure or even a concrete set of unified demands. Instead, this movement taps into a deep well of accumulated resentment while at the same time retaining a celebratory feel, yielding a combined effect of “love and rage” that mirrors contemporary anarchist praxis.

The panoply of slogans in the movement tells the story. “The Beginning Is Near.” “Lost My Job But Found an Occupation.” “Yes We Camp.” “Don’t Feed the Greed.” “I Can’t Afford My Own Politician So I Made This Sign.” “Tear Down This Wall St.” “Born-Again American.” “The People: Too Big to Fail.” After seeing a photo of a young woman holding a sign that read “I Care About You,” author Naomi Klein visited Occupy Wall Street and declared it “the most important thing in the world” right now.

The power of the moment plainly excuses the resort to hyperbole. We have been waiting a long time for this resurgence of people power, here in the “belly of the beast.” While the world has been witnessing popular uprisings and throwing off tyrants, Americans have largely been insulated through our relative privilege, subsidized creature comforts, and a palpable cultural echo-chamber of self-aggrandizement.

No more. Occupy Wall Street has morphed into Occupy Main Street. It coheres as Occupy Together and decenters itself anew as Occupy Everything. Ultimately, it asks us to once again Occupy Earth—which sets a high bar toward changing the paradigm, since the consumptive one we’ve been living in has been steadily rendering the biosphere inhospitable if not outright uninhabitable.


Interestingly, the Occupy movement sprang up in full force right after I finished writing Anarchism Today. Yet the seeds of the movement’s cosmology are eminently present throughout the text, and the strands of anarchist organizing from the past and present that are described in the book read like a how-to manual for the cutting-edge movements in the streets today. The spirit of anarchy is alive and well, and is apparently coming soon to an everything near you…






About the Author:

Randall Amster holds a J.D. from Brooklyn Law School and a Ph.D. in Justice Studies from Arizona State University. He teaches Peace Studies and is the Graduate Chair of Humanities at Prescott College, and serves as the Executive Director of the Peace & Justice Studies Association. He publishes widely in areas including anarchism, ecology, nonviolence, war and peace, social movements, homelessness, immigration, and sustainable communities. Dr. Amster is a member of the editorial advisory boards for the Contemporary Justice Review and the Journal of Sustainability Education. In addition, he is a regular columnist for the Daily Courier and a frequent contributor to numerous online publications, and is also the founder and editor of the news and commentary website, New Clear Vision. His forthcoming book Anarchism Today will be published by Praeger/ABC-CLIO in March 2012.

Wednesday, September 7, 2011

Free Online Resources Reflect on the 10th Anniversary of the September 11 Attacks


ABC-CLIO brings you History and the Headlines, a series of free online resource collections that provide authoritative information and engaging activities to help students and patrons understand important events. Sign up for this free eNewsletter here.

TEN YEARS LATER: THE SEPTEMBER 11 ATTACKS
September 11, 2011, marks the 10th anniversary of the attacks on the World Trade Center in New York City and the Pentagon outside Washington, D.C., by members of the Al Qaeda terrorist group led by Osama bin Laden. A decade later, the harrowing events of that day remain a pivotal moment in American history, the effects of which are still being felt today.


Help your students explore the events, individuals, and issues surrounding September 11 and its aftermath with reliable reference content and primary sources that you have come to expect from ABC-CLIO. Content includes:
*An insightful Need to Know essay about the impact of September 11, 2001, on the American people, written by leading expert Frank Shanty 
*A thought-provoking Examine section containing discussion questions that promote critical thinking 
*Over 75 reference entries, images, and documents that boost understanding of this pivotal moment in history 


SIMPLY CLICK HERE TO GET STARTED! 

Tuesday, August 23, 2011

NATO's Role in the Libyan Revolution

On August 22, the Libyan rebels made a dramatic entrance into Tripoli, marking the culmination of their six-month-long struggle against the regime of Muammar Qadhafi. Back in February 2011, the rebellion began as a series of peaceful protests for change, part of the wave that had sprung up in Tunisia and rocked the Islamic world. Encountering the Qadhafi regime’s crackdown, the protests escalated into an uprising that spread across the country, with the forces opposing Qadhafi establishing a government based in Benghazi, named the National Transitional Council (NTC), and seeking the overthrow of the Qadhafi-led government. Qadhafi’s bloody crackdown was quickly condemned by the United Nations, which froze Libyan assets and, following further government attacks on its citizens, authorized member states to establish and enforce a no-fly zone over Libya. Despite the ensuing NATO air bombardment campaign, the Qadhafi regime proved to be resilient. In fact, the hastily organized and poorly led rebel forces were repeatedly rolled back by much more experienced and better armed government forces, especially the vaunted Khamis Brigade. Just two weeks ago, the revolt against Muammar Qadhafi and his regime appeared to have stalled. The rebel efforts to push west from Benghazi and Misrata were repelled, and the rebel leadership appeared to be turning on itself. And yet, the last week of August showed a remarkable turnaround.

Several factors contributed to this change. The sudden collapse of Qadhafi's forces in late August was preceded by steady attrition through months of air strikes and squeezed supply lines. The NATO air campaign, which conducted over 19,750 sorties, inflicted considerable damage on the military capability of the Qadhafi forces. The relentless bombardment of armor and artillery east of Zawiya greatly weakened government defenses and contributed to breaking down much of the resistance that could have halted the rebel advance.

But much more important work was done behind the scenes. Judging from available reports, foreign special forces—primarily from France, Great Britain, and the United States, but also from Qatar and Jordan—played a major role in training the inexperienced rebel forces, providing weapons, and serving as forward air controllers to guide air strikes. Over the last few weeks, the rebel groups appeared to be better armed and forged a closer and more effective working relationship with the NATO jets above them. While most of the world’s attention had focused on Brega and Misrata, the turning point of the campaign seems to have taken place in what had hitherto been considered a sideshow, the Nafusa highlands in the west, where the NATO trainers and regular deliveries of arms and equipment fused the disparate rebel elements into a fighting force. The offensive that began from this region in early August delivered a breakthrough in the stalemate as the rebels scored a major victory at Zawiya. It demonstrated better preparation and coordination among the rebel forces, while an amphibious assault on Tripoli clearly revealed the extent of planning that underlay rebel operations. One cannot but suspect considerable Western involvement in this planning. This detracts nothing from the efforts that the NTC has undertaken in its struggle against the Qadhafi regime, but it does underscore the decisive impact of NATO's decision to serve as the rebel air force.

--Alexander Mikaberidze is assistant professor of history at Louisiana State University, Shreveport, LA, and an award-winning author of eight books. His most recent published work is Conflict and Conquest in the Islamic World: A Historical Encyclopedia.

Conflict and Conquest in the Islamic World
This comprehensive reference work documents the extensive military history of the Islamic world between the 7th century and the present day.

Thursday, August 18, 2011

Why is gold so valuable?

The world of global finance is abuzz these days with dizzying attempts to determine whether the already historically high price of gold is poised for an uptrend or a downtrend. In the immediate aftermath of Standard & Poor’s downgrade of the U.S. credit rating from AAA to AA+, gold prices soared to even higher record levels, reaching $1,780 per ounce on August 9.

This more than 40% increase in the price of gold over a 12-month period is attributed to investors’ desire to seek a safe haven amid uncertainties in foreign exchange markets in the context of national debt crises in Europe and the United States and concerns about inflation. Yet tracing patterns of levels of gold as a store of value relative to a range of stock indexes, money supply trends, commodity prices, inflation or deflation, monetary policy, political instability, consumer behavior, and other indicators can produce varied interpretations from a likewise wide range of viewpoints. Underlying all scientific attempts to pinpoint the direction in which gold prices are heading is the most important, and most elusive, question as to precisely why gold is valuable. This question becomes even more difficult to answer in a present-day geopolitical and global financial environment that is profoundly transformational on complex levels, a circumstance that leaves scholars, commentators, and policymakers a bit stumped in their attempts at identifying contemporary understandings of value as rooted in sociocultural or economic structures.

The high values ascribed to gold today may be due to the simple fact that humankind has turned to gold for reassurance in uncertain times for centuries. In The Creation and Destruction of Value (2009), Princeton University professor of history and international affairs Harold James stresses the fact that “crises lead to a fundamental uncertainty about what things are worth” and cites a long list of historical occasions on which individuals and investors turned to various material assets, perhaps chief among them gold, as a reaction to economic and political upheaval.

Contemporary market research by such organizations as the World Gold Council confirms the fact that gold prices evolve according to diverse factors and in response to myriad conditions that affect nations across the globe differently. The interplay of these conditions is difficult to predict, yet would seem essential to any tangible prediction of gold price given the dynamic relationship between the price of gold, foreign exchange, money supply, and general trust in sovereign solvency.

If it is so difficult to identify specific, empirical reasons for gold’s value in contemporary society, one might wonder whether this question is even worth asking. It clearly is. History has shown that times of uncertainty often lead to periods of meaningful, productive scientific and philosophical inquiry into the institutional/political frameworks and human interactions that make the world go ‘round. It is my instinct to turn to an icon in the historical evolution of perceptions of money and value in Western society for answers to this worthwhile question—St. Thomas Aquinas (1225-1274), who provides us with an artful articulation of man’s relationship to gold as a precious commodity in his discourse on value in the Summa Theologica. As with other commodities, Aquinas notes that gold is perceived to have an intrinsic value tied to its beauty and functionality, yet cautions that this is simply based on human assumptions about what is or is not real when he compares the value of pure gold to gold fabricated by alchemical arts, postulating that if “real gold were to be produced by alchemy, it would not be unlawful to sell it for the genuine article, for nothing prevents art from employing certain natural causes for the production of natural and true effects.”

---Shannon L. Venable, author of Gold: A Cultural Encyclopedia

Monday, August 15, 2011

Gen. David Petraeus to Become Director of CIA

When Gen. David Petraeus assumes his duties as director of the Central Intelligence Agency (CIA) in September, he will do so as someone uniquely qualified to lead the nation’s most prominent collector of foreign intelligence. The CIA is already a high-performing organization boasting talented and dedicated agents and analysts serving around the world. You can expect Director Petraeus to take the organization to even greater heights. This isn’t unabashed adulation, but simply where Petraeus’s record points. Consider what some naysayers have suggested:

Petraeus will overstep his bounds or try to make policy on his own. Doubtful. Petraeus is always keenly aware of the limits of his authority. He is always loyal to his chain of command. He will try to persuade, he will work to impose his will, and he will be aggressive. But Petraeus will follow the rules and will find ways to operate within the authorities of his office.

The CIA won’t accept him because Petraeus is a military officer. The CIA consists of professionals who share the same ambition as does Petraeus – to succeed. They know full well how difficult it is to sustain their credibility in a dangerous world (not to mention the dangers of DC bureaucratic struggles). So they will have no problem at all welcoming Petraeus because he brings credibility demonstrated by a proven track record working the toughest national security challenges. And what many do not realize is the extent to which Petraeus integrated the work of intelligence professionals into his military decision-making as the top general in both Iraq and Afghanistan.

Petraeus will try to ‘militarize’ the CIA. He won’t need to, and wouldn’t want to, either. Petraeus is the rare senior military officer who is very comfortable moving between cultures. Whether it is an elite university, an Army unit, or the CIA, Petraeus will adapt. He’s supremely confident in himself and his abilities. He will impose his priorities on the CIA, but he’ll do it smartly and creatively. Soon enough, the CIA will be embracing his changes.

The CIA isn’t a flawless organization. It’s big and faces such an array of national security challenges that it’s difficult to keep pace. But expect it to move forward aggressively under Petraeus’ leadership. And don’t heed the pundits who project that Petraeus is eager to enter the political ring. His focus will be on the CIA, and on providing and assessing intelligence to ensure the national security of the United States. There are other duties ahead of him – likely secretary of defense and/or state, director of national intelligence, or national security adviser. But political office isn’t on the horizon for him, not now, and likely not for a long while, if ever. Politicians, even presidents, wield their power for too short a period. General, now director, Petraeus is playing a longer game.

----Bradley T. Gericke, PhD, is the author of David Petraeus: A Biography. Dr. Gericke is a military historian and U.S. Army strategist who is currently stationed with the 8th Army in Seoul, South Korea.


-----------------------------------------------------------------------------------------------------------

Bradley T. Gericke
11/2010

This in-depth and forthright biography examines the personal and professional life of General David Petraeus, today's most prominent military leader.

Monday, August 1, 2011

Ramadan

As a way to mark the Night of Power (Lailat-ul-Qadr in Arabic)—when the Koran was first revealed to the Prophet Muhammad—fasting (sawm) every day during the ninth month (Ramadan) of the Islamic lunar calendar defines one of the Five Pillars of Islam. By the end of Ramadan, Muslims will have experienced hunger and thirst most of the time, and thus will be genuinely inclined to relieve the burden of the poor for whom this is a permanent plight. Muslims' awareness of the whole community of believers (Umma) should have increased over the month as a result of training to forgo the gratification of their own desires, since they and their fellow believers will have been radically curbing them for 29–30 days in a row. This discipline frees the spirit from its habitual patterns and reminds it of God's sovereignty and provident mercy.

As self-mastery for God's sake, Ramadan is an inner holy war against temptations, where valor is shown through endurance (sabr) against Satan and the strengthening of faith. But it is first and foremost an act of pure submission (the literal meaning of the word islam) to God's command, given in the sura (chapter) entitled Al Baqarah in the Koran. This is the only passage where a month is mentioned by name, with instructions to fast throughout the month during which the holy book was first "revealed as guidance to man and clear proof of the guidance, and criterion (of falsehood and truth)." ...

Thus, the fast regulates the entry into the body of all foreign substances, whether food, drink, smoke, or medication. All of these are banned from the first glimmer of dawn until the sun has completely set, at which time all these exchanges between inside and outside become licit again. These two moments marking the start and end of the daily fasting period are signaled by cannon shots during Ramadan in the cities of many Islamic countries.

Just after sunset and the iftar prayer for the breaking of the fast has been said, it is usual to have a light snack, such as one or three dates as was Muhammad's custom; this evening "breakfast" is experienced as a kind of sacrament of brotherhood. Once the daily evening prayer has been completed, a full dinner may be consumed—obviously none too soon. In this context, a festive atmosphere overtakes Muslim neighborhoods as friends visit each other's families. Near bedtime, extra tarawih prayers for Ramadan follow the daily night prayer at home or at the mosque ….


SNEAK PEEK at ABC-CLIO's brand-new World Religions databases
Available August 15, 2011 

As adherents approach the end of Ramadan (in 2011, this occurs at the end of August), the time between sundown on the 29th and the next morning's Eid ul-Fitr communal prayer for the breaking of the fast is set aside for special takbir prayers of Allahu Akbar ("God is Most Great") said in common in a number of variants. This time is also set aside for giving Zakat ul-Fitr—the seasonal "poor due," or support of the needy, which the head of the family must donate on behalf of all of its members to the corresponding number of needy Muslims. Zakat is another one of the Five Pillars of Islam. ...

After a month of ascetic exertion, Muslims watch for the new moon of Eid ul-Fitr (the festival marking the end of the month of Ramadan) with a great deal of excitement. The day before its expected appearance, men spend the day at the mosque and women take the children to cemeteries to visit departed family members. The new moon must be sighted between the sunset of the 29th and the break of dawn on the following day, or else a 30th day of fasting is added. The same method is used at the end of the previous month of Shaban to determine the actual beginning of Ramadan. …

At its core, Ramadan is one of the most important of all Islamic holy events—in depriving the body, enriching the soul, honoring Muhammad and the Koran, and submitting to God's command—and it has connected Muslims across the world for centuries, and continues to do so today.


----------------------------------------------------------------------------------------------------------

Roy, Christian. "Ramadan 2011: Background." World Religions: Belief, Culture, and Controversy. ABC-CLIO, 2011. Web. 1 Aug. 2011.

 

Tuesday, July 19, 2011

Casey Anthony: The First “Trial of the Century” of the Social-Media Age

The Casey Anthony trial will be remembered as the first “Trial of the Century” for the social-media age. In the summer of 2008, Casey Anthony, a single mother from Orlando, Florida, was accused of murdering her two-year-old daughter, Caylee, after failing to report the child missing for 31 days. After the skeletal remains of Caylee Anthony were discovered in a wooded area near Casey Anthony’s home, prosecutors indicted Anthony on first-degree murder charges (as well as lesser charges) and sought the death penalty.
The murder trial began on May 24, 2011, captivating audiences across the globe as viewers became intrigued with details sensationalized through social-media websites and such television news programs as "Nancy Grace." Prosecutors argued that Casey Anthony murdered her two-year-old child in order to free herself to live a lifestyle of partying. In fact, Anthony had partied consistently during the 31 days when her daughter was missing. Prosecutors also used forensic evidence to argue that the trunk of Casey Anthony’s automobile had contained chemicals and hair roots consistent with the decomposition of human remains. In addition, Anthony had allegedly searched a home computer for information about “chloroform,” “neck breaking,” and “death.”


While a large majority of the case's followers were convinced of Casey Anthony’s guilt and discussed it publicly via social media, a jury of seven women and five men nonetheless acquitted Anthony of the more serious charges on July 5, 2011. Anthony was found not guilty of first-degree murder, aggravated manslaughter, and aggravated child abuse, but she was convicted on four misdemeanor counts of lying to law enforcement. While the jurors believed that Anthony was somehow involved in the death of her daughter, they maintained that the prosecution’s case lacked direct evidence that a murder had been committed. Judge Belvin Perry, who presided over the trial, sentenced Casey Anthony to four years in jail and $4,000 in fines for the misdemeanor convictions. However, Anthony was awarded three years’ credit for time served, as well as further credit for good behavior, and was released from jail on Sunday, July 17, 2011.

—Dr. Scott P. Johnson

-------------------------------------------------------------------------------------------------------------


Dr. Scott P. Johnson's Trials of the Century: An Encyclopedia of Popular Culture and the Law explores five centuries of legal history by examining famous murder trials as well as historic trials that changed the political, legal, religious, social, and racial landscape of America from the 1690s through today.

Friday, July 15, 2011

Arlene Taylor Receives Distinguished Alumnus Award

We extend our hearty congratulations to Libraries Unlimited author Arlene Taylor for being recognized as an outstanding alumna by her alma mater, the University of Illinois.


Over the course of her career, Dr. Taylor has taught at the University of Chicago, Columbia University, and the University of Pittsburgh. Countless library students, including such luminaries as Linda Smith, editor of Reference and Information Services: An Introduction, Fourth Edition, have learned the refined skills of cataloging from her. Countless others have learned from her textbooks, Introduction to Cataloging and Classification, now in its tenth edition, and her ground-breaking The Organization of Information. In addition, Dr. Taylor has mentored many doctoral students who are following in her footsteps; one is co-authoring the next edition of her cataloging textbook. Congratulations, Arlene! The recognition is well-deserved!

Tuesday, July 12, 2011

President Obama Awards Increasingly Rare Medal of Honor

When Sgt. First Class Leroy Petry receives the Medal of Honor today from President Barack Obama at a White House ceremony, he will become only the ninth recipient of the nation's highest military decoration for actions in Afghanistan or Iraq. The award of the Medal of Honor has been exceedingly rare for conflicts since the end of the Vietnam War. There were no awards of the Medal of Honor during Grenada, Panama, Lebanon, or Desert Storm. Two Medals of Honor were awarded for action in Mogadishu, Somalia, in 1993. Of the nine recipients awarded the honor in the 21st century, seven of them received it posthumously. Besides Petry, the other recent recipient who lived to receive the award in person was Staff Sgt. Salvatore Giunta, who was honored last year by Obama, also for heroic actions in Afghanistan. 

To learn more about some of the most famous and heroic Medal of Honor recipients in our nation's history, check out ABC-CLIO's America's Heroes: Medal of Honor Recipients from the Civil War to Afghanistan. This book features the stories of 200 heroic individuals awarded the Medal of Honor for their distinguished military service while fighting for their country, from the Civil War to the conflicts in Iraq and Afghanistan.

- Pat Carlin, Manager, Editorial Development for Military History, ABC-CLIO