Postmodern News Archives 14

Let's Save Pessimism for Better Times.


Harper Bids to be Bush's "Poodle"
Support for missile defence shield contrary to official policy.

By Linda McQuaig
From
LindaMcQuaig.com
2007

Perhaps the most notable thing Stephen Harper did at the G8 gathering last week was signal his intention to take over retiring British Prime Minister Tony Blair's role as George W. Bush's most helpful foreign ally.

The implications of this go far beyond whatever embarrassment Canadians may come to feel about our prime minister assuming the role of what has sometimes been referred to as Bush's "poodle". Members of our corporate and academic elite have long pushed for Canada's prime minister to adopt the "poodle" role (without of course calling it that), arguing that closer ties with the White House will bring us more influence in U.S. corridors of power.


But as Tony Blair's experience illustrated, the influence tends to go the other way - with the lesser power helping to advance Washington's agenda, rather than Washington advancing the agenda of its finely-furred friend.

Harper gave us a good example of this last week when he spoke out in favour of Bush's controversial plan to install a missile defence shield in Eastern Europe, and dismissed Russian concerns about the scheme.

Harper's intervention – coming from a country with a peaceful reputation – was extremely helpful to Bush, who has had trouble convincing the world that his missile shield won't just set off a new arms race.

So Harper helped Bush sell his missile shield to a skeptical world – even though Canada has refused to participate in the project.

While Harper himself has expressed support for the shield, he promised in the last federal election campaign that he wouldn't reverse Canada's opposition to it without a vote in the House of Commons, which he knows he could not win.

So, in prominently supporting Bush on the missile defence shield last week, Harper in effect did an end-run around Parliament and the Canadian public, and helped advance a position that is at odds with Canada's own official policy.

The significance of this goes beyond Harper's thumbing his nose at Canadian democracy, which is bad enough. Even more seriously, Harper lent Canadian credibility to a reckless scheme that threatens to increase the risk of nuclear war.

If this sounds far-fetched, it's only because of the confusion created by the word "defence" in "missile defence shield". In reality, the scheme isn't about defence at all, but rather about making it possible for the U.S. to initiate a nuclear war without fear of retaliation.

This was set out clearly by two U.S. military analysts in a major article called "The Rise of U.S. Nuclear Primacy", which appeared last year in the prestigious U.S. journal Foreign Affairs. The analysts, Keir Lieber and Daryl Press, explained: "[T]he sort of missile defenses that the United States might plausibly deploy would be valuable primarily in an offensive context, not a defensive one – as an adjunct to a U.S. first-strike capability, not as a standalone shield."

The analysts noted that while a missile shield wouldn't be effective in an all-out war, it could prove useful if Washington were to initiate a nuclear attack against, for example, Russia or China, leaving the targeted country with only a tiny surviving arsenal: "At that point, even a relatively modest or inefficient missile-defense scheme might well be enough to protect against any retaliatory strikes."

So, as Bush goes about creating the capacity to initiate nuclear war – just in case he decides to eliminate some of the "evil" in the world – it seems he can count on support from his new best friend to the north.




It's Hard Out There for a Gay Gangsta
Queer Rap Challenges Hip-Hop Homophobia

By Mary O'Regan
From
Utne.com

Forget about the homophobic right. Anti-gay messages have been rampant in the hip-hop world for years. Artists like Eminem and 50 Cent pepper their lyrics with homophobic slurs and openly admit to disapproving of same-sex relationships. According to the Gay and Lesbian Alliance Against Defamation, Eminem's third album, Marshall Mathers LP contained the word "faggot" 18 times. Similarly, AlterNet reported in 2004 that in an interview with Playboy, 50 Cent declared, "I ain't into faggots."

Despite harsh words from prominent MCs, queer rappers around the world are taking center stage. "Times are changing and if openly gay rappers aren't invited then we are kicking the door in," the Los Angeles-based queer rapper Deadlee tells Britain's Gay.com. Deadlee is the headliner for Homorevolution Tour 2007, what Gay.com calls "the first ever organized regional tour of gay, lesbian, and bisexual Hip Hop artists." The tour will stop in ten US cities this spring and has nearly two dozen queer artists on the bill.

Another gay rapper taking part in Homorevolution is using his lyrical prowess to spread the word about prejudice. British MC QBoy is featured in Coming Out to Class, a documentary about dealing with homosexuality as a student. PinkNews.co.uk reports that the television broadcast of the film has inspired seven members of parliament to sign a motion "to introduce legislation to require schools to protect gay and lesbian children from the emotional harm and impaired educational attainment that results from bullying."

Some gay rappers argue that the menacing words thrown around by Eminem, 50 Cent, and bullying schoolchildren no longer hold any weight. "There are more homophobic lyrics in recent days, even as there has been more of a gay presence in the media," Tori Fixx, a queer rapper and producer from Minneapolis, told City Pages last year, but "calling somebody a fag is different than literally saying all batty boys need to be destroyed."


D'oh Canada!

By CP
From
The Winnipeg Sun

If we forget our history tests, are we doomed to repeat them?

Oh, Canada, shame on you!

Most Canadians know so little about their country that they would flunk the basic test that new immigrants are required to take before becoming citizens, according to a poll released yesterday.

The survey commissioned by the Dominion Institute found 60% of Canadians don't have the basic knowledge to pass the test given to newcomers. Ten years ago, 45% of those polled failed an identical test.

"Canadians appear to be losing knowledge when it comes to the most basic questions about Canadian history, politics, culture and geography ... (they) performed abysmally on some questions," Ipsos-Reid said in a statement while releasing the results of its survey of 1,005 adults.


While 96% of Canadians correctly identified the national anthem as O Canada, just six in 10 of them could recall its first two lines.

Only 4% could name four federal political parties represented in the House of Commons and just 4% knew the three requirements a citizen had to meet to be able to vote.

Only one-third could identify the number of Canadian provinces and territories. Only 8% knew that Queen Elizabeth is the head of state.

The Dominion Institute, which aims to boost knowledge of Canadian history and values, said all high school students should have to pass a special citizenship exam before they can graduate.

"It is frankly disheartening to see the lack of progress made by our group and the countless other organizations working to improve civic literary of Canadians over the last 10 years," institute co-founder Rudyard Griffiths said.

The Ipsos-Reid survey of 1,005 adults was done June 5-7 and is accurate to within 3.1 percentage points, 19 times out of 20.


Wooster Collective & Street Art

By Ian Lynam
From
PingMag

What is street art? According to Wikipedia, Street art is any “art” developed in public spaces — that is, “in the streets” — though the term usually refers to art of an illicit nature (as opposed to, for instance, government or community art initiatives). The term can include traditional graffiti artwork, though it is often used to distinguish modern public-space artwork from traditional graffiti and the overtones of gang territoriality and vandalism associated with it.

As far as this subculture goes, Wooster Collective is the definitive place for your work to reside online. The website is the handiwork of New Yorkers Marc and Sara Schiller, documentarians of street art from all over the world. The kids took some time out of their busy day to answer a few questions for Pingmag.

Marc and Sara, in a few words: what is the Wooster Collective?

The Wooster Collective is a group of artists and art lovers who work on various projects and events which document and celebrate ephemeral art. The central aspect of the Wooster project is the website which, each day, showcases new street art from around the world.

How many artists are you currently gathering on the Wooster site?

We launched it in 2003 in New York City and over the last four years that site has profiled over 2000 emerging artists from every country in the world.

What has sparked the idea to start documenting street art in the first place?

We moved into an area of downtown New York, Soho, that we discovered was exploding with street art. The creativity that we found was extremely inspiring. But because the art was illegal, often it only lasted a few days, or even hours. We felt that the street art movement needed to be documented in some way. We wanted to show what we were seeing each day to our friends. Because of this we launched the Wooster website. Our first artist on the site was Adam Neate.

Are there any memorable pieces that come to mind from that time?

Our favorite posts on the site have been by David Choe. He’s an amazing artist and has a wicked sense of humor. His “Day In The Life” is our favorite thing ever published on the site.

What kind of people actually come to Wooster Collective? What is your main feedback from viewers?

The reaction has been amazing. People from all walks of life come to the site every day. It’s not just other street artists or graffiti artists who enjoy the site. It has appealed to everyone, which is nice. Over 100,000 people visit it every day. We have been amazed at how popular it has become - all because of word of mouth. We don’t actively publicize it at all.

I think we really created a community, but we also created a place where people can see what is happening in cities all over the world. We gave people a vehicle to reach a wider audience - to get noticed for their good work.

I noticed your “how to” section, which can almost be seen as a “street art tutorial”! Logan Hicks’ stencil cutting demo is insane!! I had no idea that he cut those totally by hand! Who do you ask about creating those “how tos”?

We just emailed our friends. We like them all and Logan’s is good! The network of artists is about 1,000. We know from the emails that we receive that people come back to the site not only to get inspired, but also to learn new things. Almost every day someone emails us asking how to do various things that they see on the site but don’t know how to do… It ranges from how to make a sticker, how to create wheat paste…how to hit a high spot… etc. So, in follow-up to our past series including “A Day In The Life”, “Give ‘Em Props” and “My Workspace”, we thought it would be cool to reach out to a group of our favorite people around the world and start a new series of posts on the Wooster site called “Wooster’s How To….”.

Who are your current favorite street artists then?

Armsrock, Swoon, Banksy, Blu, Mark Jenkins….. so many.

Do you have to exercise much quality control in terms of what goes up?

Yes. The site is our minds’ eye. We don’t let anyone post to the website other than us. We select things that inspire us, or things that we want to share with our friends because it is clever or funny or thought-provoking.

Do you use discretion in terms of what you cover?

Sure. We don’t want to put up anything that can lead to revealing the artist’s identity or personal information. We protect the anonymity of the artists.

Any other projects that are in the works that’ll be associated with WC?

We’re planning on a new book featuring the art of Armsrock from Bremen, Germany.

There always seem to be some issues about what is considered graffiti and what is definitely street art… How do you actually define the difference?

Graffiti really only uses spray cans to apply art directly to the walls. There are strict rules to graffiti. Street art is open to using other materials like paper, stencils, metal, etc. as well.

Yeah, I got lost in your 3D category. Amazing variety of stuff there! But do you think that street art has reached a critical mass in terms of how it is viewed by popular culture?

Not yet, but soon. It is peaking now…

How do you see it peaking?

Like anything, once an underground movement goes “mainstream” as street art has, the vibrancy starts to drop off.

Do you worry that street art will lose relevancy through mass exposure?

Yes, and no. It will lose some of its power, but people will constantly be hitting the streets doing new things. It will morph into something new. And this is exciting.

Thanks a lot for the interview and also for archiving those fleeting art works for all of us!



The Politics of Naming: Genocide, Civil War, Insurgency

By Mahmood Mamdani
From
London Review of Books

The similarities between Iraq and Darfur are remarkable. The estimate of the number of civilians killed over the past three years is roughly similar. The killers are mostly paramilitaries, closely linked to the official military, which is said to be their main source of arms. The victims too are by and large identified as members of groups, rather than targeted as individuals. But the violence in the two places is named differently. In Iraq, it is said to be a cycle of insurgency and counter-insurgency; in Darfur, it is called genocide. Why the difference? Who does the naming? Who is being named? What difference does it make?

The most powerful mobilisation in New York City is in relation to Darfur, not Iraq. One would expect the reverse, for no other reason than that most New Yorkers are American citizens and so should feel directly responsible for the violence in occupied Iraq. But Iraq is a messy place in the American imagination, a place with messy politics. Americans worry about what their government should do in Iraq. Should it withdraw? What would happen if it did? In contrast, there is nothing messy about Darfur. It is a place without history and without politics; simply a site where perpetrators clearly identifiable as ‘Arabs’ confront victims clearly identifiable as ‘Africans’.

A full-page advertisement has appeared several times a week in the New York Times calling for intervention in Darfur now. It wants the intervening forces to be placed under ‘a chain of command allowing necessary and timely military action without approval from distant political or civilian personnel’. That intervention in Darfur should not be subject to ‘political or civilian’ considerations and that the intervening forces should have the right to shoot – to kill – without permission from distant places: these are said to be ‘humanitarian’ demands. In the same vein, a New Republic editorial on Darfur has called for ‘force as a first-resort response’. What makes the situation even more puzzling is that some of those who are calling for an end to intervention in Iraq are demanding an intervention in Darfur; as the slogan goes, ‘Out of Iraq and into Darfur.’

What would happen if we thought of Darfur as we do of Iraq, as a place with a history and politics – a messy politics of insurgency and counter-insurgency? Why should an intervention in Darfur not turn out to be a trigger that escalates rather than reduces the level of violence as intervention in Iraq has done? Why might it not create the actual possibility of genocide, not just rhetorically but in reality? Morally, there is no doubt about the horrific nature of the violence against civilians in Darfur. The ambiguity lies in the politics of the violence, whose sources include both a state-connected counter-insurgency and an organised insurgency, very much like the violence in Iraq.


The insurgency and counter-insurgency in Darfur began in 2003. Both were driven by an intermeshing of domestic tensions in the context of a peace-averse international environment defined by the War on Terror. On the one hand, there was a struggle for power within the political class in Sudan, with more marginal interests in the west (following those in the south and in the east) calling for reform at the centre. On the other, there was a community-level split inside Darfur, between nomads and settled farmers, who had earlier forged a way of sharing the use of semi-arid land in the dry season. With the drought that set in towards the late 1970s, co-operation turned into an intense struggle over diminishing resources.

As the insurgency took root among the prospering peasant tribes of Darfur, the government trained and armed the poorer nomads and formed a militia – the Janjawiid – that became the vanguard of the unfolding counter-insurgency. The worst violence came from the Janjawiid, but the insurgent movements were also accused of gross violations. Anyone wanting to end the spiralling violence would have to bring about power-sharing at the state level and resource-sharing at the community level, land being the key resource.

Since its onset, two official verdicts have been delivered on the violence, the first from the US, the second from the UN. The American verdict was unambiguous: Darfur was the site of an ongoing genocide. The chain of events leading to Washington’s proclamation began with ‘a genocide alert’ from the Management Committee of the Washington Holocaust Memorial Museum; according to the Jerusalem Post, the alert was ‘the first ever of its kind, issued by the US Holocaust Museum’. The House of Representatives followed unanimously on 24 June 2004. The last to join the chorus was Colin Powell.

The UN Commission on Darfur was created in the aftermath of the American verdict and in response to American pressure. It was more ambiguous. In September 2004, the Nigerian president Olusegun Obasanjo, then the chair of the African Union, visited UN headquarters in New York. Darfur had been the focal point of discussion in the African Union. All concerned were alert to the extreme political sensitivity of the issue. At a press conference at the UN on 23 September Obasanjo was asked to pronounce on the violence in Darfur: was it genocide or not? His response was very clear:

Before you can say that this is genocide or ethnic cleansing, we will have to have a definite decision and plan and programme of a government to wipe out a particular group of people, then we will be talking about genocide, ethnic cleansing. What we know is not that. What we know is that there was an uprising, rebellion, and the government armed another group of people to stop that rebellion. That’s what we know. That does not amount to genocide from our own reckoning. It amounts to of course conflict. It amounts to violence.

By October, the Security Council had established a five-person commission of inquiry on Darfur and asked it to report within three months on ‘violations of international humanitarian law and human rights law in Darfur by all parties’, and specifically to determine ‘whether or not acts of genocide have occurred’. Among the members of the commission was the chief prosecutor of South Africa’s TRC, Dumisa Ntsebeza. In its report, submitted on 25 January 2005, the commission concluded that ‘the Government of the Sudan has not pursued a policy of genocide . . . directly or through the militias under its control.’ But the commission did find that the government’s violence was ‘deliberately and indiscriminately directed against civilians’. Indeed, ‘even where rebels may have been present in villages, the impact of attacks on civilians shows that the use of military force was manifestly disproportionate to any threat posed by the rebels.’ These acts, the commission concluded, ‘were conducted on a widespread and systematic basis, and therefore may amount to crimes against humanity’ (my emphasis). Yet, the commission insisted, they did not amount to acts of genocide: ‘The crucial element of genocidal intent appears to be missing . . . it would seem that those who planned and organised attacks on villages pursued the intent to drive the victims from their homes, primarily for purposes of counter-insurgency warfare.’

At the same time, the commission assigned secondary responsibility to rebel forces – namely, members of the Sudan Liberation Army and the Justice and Equality Movement – which it held ‘responsible for serious violations of international human rights and humanitarian law which may amount to war crimes’ (my emphasis). If the government stood accused of ‘crimes against humanity’, rebel movements were accused of ‘war crimes’. Finally, the commission identified individual perpetrators and presented the UN secretary-general with a sealed list that included ‘officials of the government of Sudan, members of militia forces, members of rebel groups and certain foreign army officers acting in their personal capacity’. The list named 51 individuals.

The commission’s findings highlighted three violations of international law: disproportionate response, conducted on a widespread and systematic basis, targeting entire groups (as opposed to identifiable individuals) but without the intention to eliminate them as groups. It is for this last reason that the commission ruled out the finding of genocide. Its less grave findings of ‘crimes against humanity’ and ‘war crimes’ are not unique to Darfur, but fit several other situations of extreme violence: in particular, the US occupation of Iraq, the Hema-Lendu violence in eastern Congo and the Israeli invasion of Lebanon. Among those in the counter-insurgency accused of war crimes were the ‘foreign army officers acting in their personal capacity’, i.e. mercenaries, presumably recruited from armed forces outside Sudan. The involvement of mercenaries in perpetrating gross violence also fits the occupation in Iraq, where some of them go by the name of ‘contractors’.


The journalist in the US most closely identified with consciousness-raising on Darfur is the New York Times op-ed columnist Nicholas Kristof, often identified as a lone crusader on the issue. To peruse Kristof’s Darfur columns over the past three years is to see the reduction of a complex political context to a morality tale unfolding in a world populated by villains and victims who never trade places and so can always and easily be told apart. It is a world where atrocities mount geometrically, the perpetrators so evil and the victims so helpless that the only possibility of relief is a rescue mission from the outside, preferably in the form of a military intervention.

Kristof made six highly publicised trips to Darfur, the first in March 2004 and the sixth two years later. He began by writing of it as a case of ‘ethnic cleansing’: ‘Sudan’s Arab rulers’ had ‘forced 700,000 black African Sudanese to flee their villages’ (24 March 2004). Only three days later, he upped the ante: this was no longer ethnic cleansing, but genocide. ‘Right now,’ he wrote on 27 March, ‘the government of Sudan is engaged in genocide against three large African tribes in its Darfur region.’ He continued: ‘The killings are being orchestrated by the Arab-dominated Sudanese government’ and ‘the victims are non-Arabs: blacks in the Zaghawa, Massalliet and Fur tribes.’ He estimated the death toll at a thousand a week. Two months later, on 29 May, he revised the estimates dramatically upwards, citing predictions from the US Agency for International Development to the effect that ‘at best, “only” 100,000 people will die in Darfur this year of malnutrition and disease’ but ‘if things go badly, half a million will die.’

The UN commission’s report was released on 25 February 2005. It confirmed ‘massive displacement’ of persons (‘more than a million’ internally displaced and ‘more than 200,000’ refugees in Chad) and the destruction of ‘several hundred’ villages and hamlets as ‘irrefutable facts’; but it gave no confirmed numbers for those killed. Instead, it noted rebel claims that government-allied forces had ‘allegedly killed over 70,000 persons’. Following the publication of the report, Kristof began to scale down his estimates. For the first time, on 23 February 2005, he admitted that ‘the numbers are fuzzy.’ Rather than the usual single total, he went on to give a range of figures, from a low of 70,000, which he dismissed as ‘a UN estimate’, to ‘independent estimates [that] exceed 220,000’. A warning followed: ‘and the number is rising by about ten thousand a month.’

The publication of the commission’s report had considerable effect. Internationally, it raised doubts about whether what was going on in Darfur could be termed genocide. Even US officials were unwilling to go along with the high estimates propagated by the broad alliance of organisations that subscribe to the Save Darfur campaign. The effect on American diplomacy was discernible. Three months later, on 3 May, Kristof noted with dismay that not only had ‘Deputy Secretary of State Robert Zoellick pointedly refused to repeat the administration’s past judgment that the killings amount to genocide’: he had ‘also cited an absurdly low estimate of Darfur’s total death toll: 60,000 to 160,000’. As an alternative, Kristof cited the latest estimate of deaths from the Coalition for International Justice as ‘nearly 400,000, and rising by 500 a day’. In three months, Kristof’s estimates had gone up from 10,000 to 15,000 a month. Six months later, on 27 November, Kristof warned that ‘if aid groups pull out . . . the death toll could then rise to 100,000 a month.’ Anyone keeping a tally of the death toll in Darfur as reported in the Kristof columns would find the rise, fall and rise again very bewildering. First he projected the number of dead at 320,000 for 2004 (16 June 2004) but then gave a scaled down estimate of between 70,000 and 220,000 (23 February 2005). The number began once more to climb to ‘nearly 400,000’ (3 May 2005), only to come down yet again to 300,000 (23 April 2006). Each time figures were given with equal confidence but with no attempt to explain their basis. Did the numbers reflect an actual decline in the scale of killing in Darfur or was Kristof simply making an adjustment to the changing mood internationally?

In the 23 April column, Kristof expanded the list of perpetrators to include an external power: ‘China is now underwriting its second genocide in three decades. The first was in Pol Pot’s Cambodia, and the second is in Darfur, Sudan. Chinese oil purchases have financed Sudan’s pillage of Darfur, Chinese-made AK-47s have been the main weapons used to slaughter several hundred thousand people in Darfur so far and China has protected Sudan in the UN Security Council.’ In the Kristof columns, there is one area of deafening silence, to do with the fact that what is happening in Darfur is a civil war. Hardly a word is said about the insurgency, about the civilian deaths insurgents mete out, about acts that the commission characterised as ‘war crimes’. Would the logic of his 23 April column not lead one to think that those with connections to the insurgency, some of them active in the international campaign to declare Darfur the site of genocide, were also guilty of ‘underwriting’ war crimes in Darfur?

Newspaper writing on Darfur has sketched a pornography of violence. It seems fascinated by and fixated on the gory details, describing the worst of the atrocities in gruesome detail and chronicling the rise in the number of them. The implication is that the motivation of the perpetrators lies in biology (‘race’) and, if not that, certainly in ‘culture’. This voyeuristic approach accompanies a moralistic discourse whose effect is both to obscure the politics of the violence and position the reader as a virtuous, not just a concerned observer.

Journalism gives us a simple moral world, where a group of perpetrators face a group of victims, but where neither history nor motivation is thinkable because both are outside history and context. Even when newspapers highlight violence as a social phenomenon, they fail to understand the forces that shape the agency of the perpetrator. Instead, they look for a clear and uncomplicated moral that describes the victim as untainted and the perpetrator as simply evil. Where yesterday’s victims are today’s perpetrators, where victims have turned perpetrators, this attempt to find an African replay of the Holocaust not only does not work but also has perverse consequences. Whatever its analytical weaknesses, the depoliticisation of violence has given its proponents distinct political advantages.

The conflict in Darfur is highly politicised, and so is the international campaign. One of the campaign’s constant refrains has been that the ongoing genocide is racial: ‘Arabs’ are trying to eliminate ‘Africans’. But both ‘Arab’ and ‘African’ have several meanings in Sudan. There have been at least three meanings of ‘Arab’. Locally, ‘Arab’ was a pejorative reference to the lifestyle of the nomad as uncouth; regionally, it referred to someone whose primary language was Arabic. In this sense, a group could become ‘Arab’ over time. This process, known as Arabisation, was not an anomaly in the region: there was Amharisation in Ethiopia and Swahilisation on the East African coast. The third meaning of ‘Arab’ was ‘privileged and exclusive’; it was the claim of the riverine political aristocracy who had ruled Sudan since independence, and who equated Arabisation with the spread of civilisation and being Arab with descent.

‘African’, in this context, was a subaltern identity that also had the potential of being either exclusive or inclusive. The two meanings were not only contradictory but came from the experience of two different insurgencies. The inclusive meaning was more political than racial or even cultural (linguistic), in the sense that an ‘African’ was anyone determined to make a future within Africa. It was pioneered by John Garang, the leader of the Sudan People’s Liberation Army (SPLA) in the south, as a way of holding together the New Sudan he hoped to see. In contrast, its exclusive meaning came in two versions, one hard (racial) and the other soft (linguistic) – ‘African’ as Bantu and ‘African’ as the identity of anyone who spoke a language indigenous to Africa. The racial meaning came to take a strong hold in both the counter-insurgency and the insurgency in Darfur. The Save Darfur campaign’s characterisation of the violence as ‘Arab’ against ‘African’ obscured both the fact that the violence was not one-sided and the contest over the meaning of ‘Arab’ and ‘African’: a contest that was critical precisely because it was ultimately about who belonged and who did not in the political community called Sudan. The depoliticisation, naturalisation and, ultimately, demonisation of the notion ‘Arab’, as against ‘African’, has been the deadliest effect, whether intended or not, of the Save Darfur campaign.

The depoliticisation of the conflict gave campaigners three advantages. First, they were able to occupy the moral high ground. The campaign presented itself as apolitical but moral, its concern limited only to saving lives. Second, only a single-issue campaign could bring together in a unified chorus forces that are otherwise ranged as adversaries on most important issues of the day: at one end, the Christian right and the Zionist lobby; at the other, a mainly school and university-based peace movement. Nat Hentoff of the Village Voice wrote of the Save Darfur Coalition as ‘an alliance of more than 515 faith-based, humanitarian and human rights organisations’; among the organisers of their Rally to Stop the Genocide in Washington last year were groups as diverse as the American Jewish World Service, the American Society for Muslim Advancement, the National Association of Evangelicals, the US Conference of Catholic Bishops, the US Holocaust Memorial Museum, the American Anti-Slavery Group, Amnesty International, Christian Solidarity International, Physicians for Human Rights and the National Black Church Initiative. Surely, such a wide coalition would cease to hold together if the issue shifted to, say, Iraq.

To understand the third advantage, we have to return to the question I asked earlier: how could it be that many of those calling for an end to the American and British intervention in Iraq are demanding an intervention in Darfur? It’s tempting to think that the advantage of Darfur lies in its being a small, faraway place where those who drive the War on Terror do not have a vested interest. That this is hardly the case is evident if one compares the American response to Darfur to its non-response to Congo, even though the dimensions of the conflict in Congo seem to give it a mega-Darfur quality: the numbers killed are estimated in the millions rather than the hundreds of thousands; the bulk of the killing, particularly in Kivu, is done by paramilitaries trained, organised and armed by neighbouring governments; and the victims on both sides – Hema and Lendu – are framed in collective rather than individual terms, to the point that one influential version defines both as racial identities and the conflict between the two as a replay of the Rwandan genocide. Given all this, how does one explain the fact that the focus of the most widespread and ambitious humanitarian movement in the US is on Darfur and not on Kivu?

Nicholas Kristof was asked this very question by a university audience: ‘When I spoke at Cornell University recently, a woman asked why I always harp on Darfur. It’s a fair question. The number of people killed in Darfur so far is modest in global terms: estimates range from 200,000 to more than 500,000. In contrast, four million people have died since 1998 as a result of the fighting in Congo, the most lethal conflict since World War Two.’ But instead of answering the question, Kristof – now writing his column rather than facing the questioner at Cornell – moved on: ‘And malaria annually kills one million to three million people – meaning that three years’ deaths in Darfur are within the margin of error of the annual global toll from malaria.’ And from there he went on to compare the deaths in Darfur to the deaths from malaria, rather than from the conflict in Congo: ‘We have a moral compass within us and its needle is moved not only by human suffering but also by human evil. That’s what makes genocide special – not just the number of deaths but the government policy behind them. And that in turn is why stopping genocide should be an even higher priority than saving lives from Aids or malaria.’ That did not explain the relative silence on Congo. Could the reason be that in the case of Congo, Hema and Lendu militias – many of them no more than child soldiers – were trained by America’s allies in the region, Rwanda and Uganda? Is that why the violence in Darfur – but not the violence in Kivu – is named as a genocide?

It seems that genocide has become a label to be stuck on your worst enemy, a perverse version of the Nobel Prize, part of a rhetorical arsenal that helps you vilify your adversaries while ensuring impunity for your allies. In Kristof’s words, the point is not so much ‘human suffering’ as ‘human evil’. Unlike Kivu, Darfur can be neatly integrated into the War on Terror, for Darfur gives the Warriors on Terror a valuable asset with which to demonise an enemy: a genocide perpetrated by Arabs. This was the third and most valuable advantage that Save Darfur gained from depoliticising the conflict. The more thoroughly Darfur was integrated into the War on Terror, the more the depoliticised violence in Darfur acquired a racial description, as a genocide of ‘Arabs’ killing ‘Africans’. Racial difference purportedly constituted the motive force behind the mass killings. The irony of Kristof’s columns is that they mirror the ideology of Arab supremacism in Sudan by demonising entire communities.[*]

Kristof chides Arab peoples and the Arab press for not having the moral fibre to respond to this Muslim-on-Muslim violence, presumably because it is a violence inflicted by Arab Muslims on African Muslims. In one of his early columns in 2004, he was outraged by the silence of Muslim leaders: ‘Do they care about dead Muslims only when the killers are Israelis or Americans?’ Two years later he asked: ‘And where is the Arab press? Isn’t the murder of 300,000 or more Muslims almost as offensive as a Danish cartoon?’ Six months later, Kristof pursued this line on NBC’s Today Show. Elaborating on the ‘real blind spot’ in the Muslim world, he said: ‘You are beginning to get some voices in the Muslim world . . . saying it’s appalling that you have evangelical Christians and American Jews leading an effort to protect Muslims in Sudan and in Chad.’

If many of the leading lights in the Darfur campaign are fired by moral indignation, this derives from two events: the Nazi Holocaust and the Rwandan genocide. After all, the seeds of the Save Darfur campaign lie in the tenth-anniversary commemoration of what happened in Rwanda. Darfur is today a metaphor for senseless violence in politics, as indeed Rwanda was a decade before. Most writing on the Rwandan genocide in the US was also done by journalists. In We wish to inform you that tomorrow we will be killed with our families, the most widely read book on the genocide, Philip Gourevitch envisaged Rwanda as a replay of the Holocaust, with Hutu cast as perpetrators and Tutsi as victims. Again, the encounter between the two seemed to take place outside any context, as part of an eternal encounter between evil and innocence. Many of the journalists who write about Darfur have Rwanda very much in the back of their minds. In December 2004, Kristof recalled the lessons of Rwanda: ‘Early in his presidency, Mr Bush read a report about Bill Clinton’s paralysis during the Rwandan genocide and scrawled in the margin: “Not on my watch.” But in fact the same thing is happening on his watch, and I find that heartbreaking and baffling.’

With very few exceptions, the Save Darfur campaign has drawn a single lesson from Rwanda: the problem was the US failure to intervene to stop the genocide. Rwanda is the guilt that America must expiate, and to do so it must be ready to intervene, for good and against evil, even globally. That lesson is inscribed at the heart of Samantha Power’s book, A Problem from Hell: America and the Age of Genocide. But it is the wrong lesson. The Rwandan genocide was born of a civil war which intensified when the settlement to contain it broke down. The settlement, reached at the Arusha Conference, broke down because neither the Hutu Power tendency nor the Tutsi-dominated Rwanda Patriotic Front (RPF) had any interest in observing the power-sharing arrangement at the core of the settlement: the former because it was excluded from the settlement and the latter because it was unwilling to share power in any meaningful way.

What the humanitarian intervention lobby fails to see is that the US did intervene in Rwanda, through a proxy. That proxy was the RPF, backed up by entire units from the Uganda Army. The green light was given to the RPF, whose commanding officer, Paul Kagame, had recently returned from training in the US, just as it was lately given to the Ethiopian army in Somalia.

Instead of using its resources and influence to bring about a political solution to the civil war, and then strengthen it, the US signalled to one of the parties that it could pursue victory with impunity. This unilateralism was part of what led to the disaster, and that is the real lesson of Rwanda. Applied to Darfur and Sudan, it is sobering. It means recognising that Darfur is not yet another Rwanda. Nurturing hopes of an external military intervention among those in the insurgency who aspire to victory and reinforcing the fears of those in the counter-insurgency who see it as a prelude to defeat are precisely the ways to ensure that it becomes a Rwanda. Strengthening those on both sides who stand for a political settlement to the civil war is the only realistic approach. Solidarity, not intervention, is what will bring peace to Darfur.

The dynamic of civil war in Sudan has fed on multiple sources: first, the post-independence monopoly of power enjoyed by a tiny ‘Arabised’ elite from the riverine north of Khartoum, a monopoly that has bred growing resistance among the majority, marginalised populations in the south, east and west of the country; second, the rebel movements which have in their turn bred ambitious leaders unwilling to enter into power-sharing arrangements as a prelude to peace; and, finally, external forces that continue to encourage those who are interested in retaining or obtaining a monopoly of power.

The dynamic of peace, by contrast, has fed on a series of power-sharing arrangements, first in the south and then in the east. This process has been intermittent in Darfur. African Union-organised negotiations have been successful in forging a power-sharing arrangement, but only for that arrangement to fall apart time and again. A large part of the explanation, as I suggested earlier, lies in the international context of the War on Terror, which favours parties who are averse to taking risks for peace. To reinforce the peace process must be the first commitment of all those interested in Darfur.

The camp of peace needs to come to a second realisation: that peace cannot be built on humanitarian intervention, which is the language of big powers. The history of colonialism should teach us that every major intervention has been justified as humanitarian, a ‘civilising mission’. Nor was it mere idiosyncrasy that inspired the devotion with which many colonial officers and archivists recorded the details of barbarity among the colonised – sati, the ban on widow marriage or the practice of child marriage in India, or slavery and female genital mutilation in Africa. I am not suggesting that this was all invention. I mean only to point out that the chronicling of atrocities had a practical purpose: it provided the moral pretext for intervention. Now, as then, imperial interventions claim to have a dual purpose: on the one hand, to rescue minority victims of ongoing barbarities and, on the other, to quarantine majority perpetrators with the stated aim of civilising them. Iraq should act as a warning on this score. The worst thing in Darfur would be an Iraq-style intervention. That would almost certainly spread the civil war to other parts of Sudan, unravelling the peace process in the east and south and dragging the whole country into the global War on Terror.


Footnotes
* Contrast this with the UN commission’s painstaking effort to make sense of the identities ‘Arab’ and ‘African’. The commission’s report concentrated on three related points. First, the claim that the Darfur conflict pitted ‘Arab’ against ‘African’ was facile. ‘In fact, the commission found that many Arabs in Darfur are opposed to the Janjawiid, and some Arabs are fighting with the rebels, such as certain Arab commanders and their men from the Misseriya and Rizeigat tribes. At the same time, many non-Arabs are supporting the government and serving in its army.’ Second, it has never been easy to sort different tribes into the categories ‘Arab’ and ‘African’: ‘The various tribes that have been the object of attacks and killings (chiefly the Fur, Massalit and Zeghawa tribes) do not appear to make up ethnic groups distinct from the ethnic groups to which persons or militias that attack them belong. They speak the same language (Arabic) and embrace the same religion (Muslim). In addition, also due to the high measure of intermarriage, they can hardly be distinguished in their outward physical appearance from the members of tribes that allegedly attacked them. Apparently, the sedentary and nomadic character of the groups constitutes one of the main distinctions between them’ (emphasis mine). Finally, the commission put forward the view that political developments are driving the rapidly growing distinction between ‘Arab’ and ‘African’. On the one hand, ‘Arab’ and ‘African’ seem to have become political identities: ‘Those tribes in Darfur who support rebels have increasingly come to be identified as “African” and those supporting the government as the “Arabs”. A good example to illustrate this is that of the Gimmer, a pro-government African tribe that is seen by the African tribes opposed to the government as having been “Arabised”.’ On the other hand, this development was being promoted from the outside: ‘The Arab-African divide has also been fanned by the growing insistence on such divide in some circles and in the media.’

Mahmood Mamdani is Herbert Lehman Professor of Government and a professor of anthropology at Columbia University. His most recent book is Good Muslim, Bad Muslim: America, the Cold War and the Roots of Terror.


Canada's Deadly Trade in Asbestos

By Mark Bourrie
From
Third World Network

Canada is starting work this summer on a billion-dollar project to renovate its parliamentary buildings and cleanse them of asbestos, which has been found to cause cancer.

The project will take six years to complete but, in the meantime, Canadian government agents are still pushing exports of the fibre. Canada even has gone so far as to argue a challenge at the World Trade Organization that a proposed French ban on asbestos imports would be an illegal trade practice.

Despite recent warnings that asbestos has claimed 500,000 cancer victims in western Europe alone, Canadian asbestos producers continue to promote and sell their fibre worldwide - especially to developing nations.


Asbestos is used as a binder in cement, as insulation, and in anti-fire walls. It is also a potent carcinogen with a long, well-documented legacy of death.

The danger comes when small asbestos fibres are released and inhaled by labourers. The fibres cause cancerous growths in the lungs, lung lining and abdomen but can take 20 years or more to manifest.

In 1997, Canada exported 430,000 tonnes of asbestos - more than 96% of production - most of it to the developing world. Canada is the world's second-largest exporter of asbestos after Russia.

Union activists who have visited India and other developing countries say, however, that the public relations efforts of the government and the asbestos industry are simply window-dressing to hide the fact that most people who work with the natural mineral fibre risk cancer.

Critics of Canada's asbestos exports say the country is exporting death to protect the profits of a handful of companies and the jobs of 1,600 miners. "What's the difference between land mines and asbestos?" asks Dr. Barry Castleman, author of a respected book on the danger of asbestos. "A key difference, of course, is that Canada doesn't export land mines."

At the heart of the issue is Canada's own precarious political situation. All of the asbestos mines in Canada are in Quebec, a predominantly French-speaking province with a separatist government.

Federal and provincial politicians are pushing asbestos exports to prove that they are successful at developing overseas markets, and are protective of Quebec workers. Critics of asbestos exports say the industry would probably be allowed to die if it was centred in any other part of the country.

"Personally, I believe this is all about Quebec politics," says Canadian Auto Workers Health and Safety director Cathy Walker. "The Canadian and Quebec governments are competing with one another to show just how prepared they all are to protect Quebec jobs." The real costs will be borne by the developing world, she says.

Walker just returned from India, where she saw unprotected workers slashing open bags of asbestos fibres. In places where the asbestos was being mixed into cement, clouds of the carcinogenic fibres swirled around workers.

In Britain, the Cancer Research Campaign said in January that its study into the European asbestos-linked cancer epidemic should sound alarm bells everywhere, "particularly in the developing world where uncontrolled asbestos is still very common," said CRC director Gordon McVie.

Seven of Canada's top 10 markets are Third World countries. Still, the Canadian government, the asbestos industry and lobby groups are trying to put a good face on the asbestos industry.

Recently, diplomats stationed here were flown to asbestos-producing regions on an all-expense-paid first-class junket. Journalists have been cultivated with similar perks.

Philip Landrigan, of New York's Mount Sinai School of Medicine - the centre that first linked cancer to asbestos in the 1960s - says the asbestos lobby's claim that the fibre is safe is "absolutely untrue."

"Asbestos remains an important cause of human illness," says Landrigan. "All forms of asbestos are carcinogenic, and that includes Canadian chrysotile."

Julian Peto, head of epidemiology at the University of London, who wrote the January study on the Euro-epidemic, says there's no safe way to use asbestos in developed nations. In developing nations, where there is little money for protective clothing and ventilation systems, workers are being poisoned by the thousands, he said.

"There is no way you can control it in Britain, let alone the third world," Peto says.

"The thing about building materials is that they are completely uncontrollable. They are often used casually, by not very skilful people, who break them and drill them and cut them in small parcels."

In Canada, people working with asbestos are required to limit their exposure to the fibres. Consumer products that release asbestos fibres into the atmosphere are banned, and the sale of loose asbestos to consumers is prohibited under law. However, most Canadian asbestos exports are of loose fibres, which are shipped in large reinforced paper bags.

Canadian asbestos producers say they're training foreign workers in the safe handling of asbestos, through the Asbestos Institute. The institute, founded in 1984 by the federal and Quebec governments and the asbestos industry, has been the beneficiary of more than $10 million in Canadian government funding.

Ten European Union members have banned asbestos. France, which banned it in 1997 for health reasons, now faces a Canadian challenge at the WTO. Canada argues the ban violates Canada's rights under international trade rules.

In a speech delivered last year before an audience of occupational health professionals from around the world who had gathered in Italy, Dr. Joseph LaDou of the University of California's Medical School attacked Canada's asbestos-promoting efforts.

LaDou said Canada was engaged in "the exploitation of ignorance and poverty" in Asia, Africa and Latin America. He accused Canadian policy makers of setting up the developing world "for an epidemic of asbestos-related disease, the costs of which will fall on countries that can ill afford it."




Tapped Resources: The Dirty Truth about Bottled Water

By Ashley Walters
From
Briarpatch Magazine
2007

“We sell water . . . so we’ve got to be clever.”

Senior vice president of Nestle Waters’ Global Marketing and Communications division


It often sells for three times the price of gasoline, and more and more of us are guzzling it — even though we can get the equivalent for next to nothing simply by tapping into the publicly owned infrastructure. It’s bottled water, and it’s a lucrative business. But why would we pay 240 to 10,000 times more for something that we can get for less than a penny by simply turning on a tap?

Marketing has a lot to do with it. As public concern over the state of the environment grows, private companies are quick to exploit those concerns with expensive individual solutions of dubious merit.

Bottled water sales are on the rise. A December 2006 report published by Sustain, a sustainable food and farming group, claims that worldwide bottled water consumption has risen to roughly two and a half times its 1994 level (from 58 billion litres in 1994 to 144 billion in 2002), largely as a result of successful marketing. The report says that bottled water is marketed as a fashion accessory for health-conscious consumers with discerning palates. Laurie Ries, a marketing consultant quoted in the Polaris Institute’s 2005 “Inside the Bottle” report, describes bottled water as “America’s most affordable status symbol.”


As bottled water is sold to the upper and middle classes, support for public drinking water infrastructure begins to trickle away. In 2001, the bottled water industry fought and defeated a proposed bill that would have imposed a five-cent tax on every bottle of water sold in Texas to fund improvements in the public water infrastructure. The same year, the World Wildlife Fund issued a report stating that the annual US$22 billion spent globally on bottled water could fund the municipal water systems of 2,000 cities with populations of four million people each — more than the entire world population.

It is no secret that bottled water is a hot spring of cash for big business and industry. Already worth US$400 billion, the bottled water industry is 30 percent larger than the pharmaceutical industry. “Inside the Bottle” describes the bottled water industry as “one of the fastest growing and least regulated industries in the world.” The Polaris Institute report focuses on the “big four” bottled water giants, Nestlé, Danone, PepsiCo and Coca-Cola, who, in 2004, collectively held over 58 percent of the US market.

Even though business is flourishing and millions of people drink bottled water, the industry remains very loosely regulated. The US Food and Drug Administration has assigned “one half of a staff person (full-time equivalent) to bottled water regulation, and less than one to ensuring bottled water compliance,” according to Erik Olson, a senior attorney with the Natural Resources Defense Council, as cited in “Inside the Bottle.”

The Natural Resources Defense Council (NRDC) estimates that American bottled water plants undergo inspection once every five or six years, while the Canadian Food Inspection Agency estimates that Canadian plants are inspected once every three years. This lags significantly behind the norm for tap water testing in developed countries, where tests are run several times daily.

Adding to this discrepancy are the vast regulatory differences between the two agencies that regulate tap and bottled water in the US. The Food and Drug Administration (FDA) looks after bottled water, while the Environmental Protection Agency (EPA) regulates tap water. Although the FDA claims that it has adopted many of the standards of the EPA, a comparison of the two in a 1999 NRDC study proves otherwise. The study compared EPA and FDA standards and discovered some notable discrepancies between the two:

> FDA regulations don’t apply to bottled water produced and sold within the same state, which accounts for 60 to 70 percent of bottled water in the US.

> The FDA allows for some levels of fecal coliform (a sign of likely contamination with fecal matter) and E. coli within bottled water while the EPA demands that tap water is free of these contaminants.

> The FDA doesn’t oblige companies to test their bottled water for parasites such as cryptosporidium or giardia while the EPA does.

Although bottled waters are considered safe for human consumption, some can pose risks to individuals with compromised immune systems. Tap water carries little to no risk for such individuals and contamination issues are required to be made public immediately, while there is no such requirement for bottled water.

“So pure we promise nothing.” This is the slogan from Pepsi’s 2003 marketing campaign for their Aquafina bottled water. Oddly enough, this phrase leaks the truth about the relative purity of bottled water.


“To me, it’s like an epidemic,” John Rudnickas, the manager of water quality for the City of Toronto, told Briarpatch. “A lot of people are saying they drink bottled water out of convenience, because I think they realize that they are being duped.” In 1999, the US Natural Resources Defense Council released a four-year study on 103 popular brands of bottled water and found that one-fourth were nothing more than bottled tap water, while one-third were contaminated with levels of toxins above state or industry standards. In 2004, the American Society for Microbiology tested 68 commercial mineral bottled waters sold throughout the world, and found that 40 percent contained bacteria or fungi, while 21 samples could support bacteria growth in lab cultures. The NRDC report concluded that bottled water is no better than tap water, and could even be worse, as tap water undergoes more rigorous testing, disinfecting and filtering processes and is required to conform to stricter EPA standards than its bottled counterpart.

In 1990, the Toronto Department of Public Health and the Environmental Protection Office conducted a study similar to that of the NRDC, comparing tap water, bottled water and water that runs through home filtration systems. The study concluded that due to stricter regulations and daily testing, tap water was the safest and cheapest alternative to bottled water. “The bottom line was that their recommendation was to drink tap water,” said Rudnickas.

Both Pepsi and Coke admit that their bottled water is tap water that undergoes filtration processes. What they don’t mention is that one of their treatment methods can result in the creation of a toxic by-product. Ozonation is a disinfection process in which ozone, a gas, is injected into water to eliminate bacteria and maintain freshness. When ozone is added to spring or underground source water (which contains the naturally occurring salt bromide), the carcinogenic chemical bromate can emerge as a by-product.
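The underlying chemistry is simple enough (a simplified summary added here for context, not drawn from the Polaris report): ozone oxidizes the bromide ion, first to hypobromite and eventually to bromate. The net reaction is roughly

Br⁻ + 3 O₃ → BrO₃⁻ + 3 O₂

that is, one bromide ion and three ozone molecules yield one bromate ion and three oxygen molecules, which is why bromide-rich source waters are the ones most at risk.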

In 2001, the FDA issued a warning regarding acceptable levels of bromate in bottled water, after which Nestlé stopped using the ozonation process in its Perrier sparkling water. According to “Inside the Bottle,” Pepsi and Coke continue to use ozonation as a means to disinfect tap water in the US and selected countries abroad, though in 2004 Coca-Cola recalled 500,000 of its Dasani bottles distributed throughout Britain after the water was found to contain harmful concentrations of bromate.

Although it is associated with pristine lakes, rivers and snow-capped mountains, bottled water inflicts substantial (and largely unnecessary) damage on the environment. Plastic bottles of water are transported to various parts of the world by fossil fuel-burning vehicles, and the bottles themselves are made of plastic derived from crude oil. The Earth Policy Institute estimates that in the US, 1.5 million barrels of crude are used annually for this purpose.

“Energetically speaking, the life cycle of bottled water is just ridiculous,” says Dr. Keith Solomon, a toxicologist at the University of Guelph in Ontario. “In most first-world countries the local water is perfectly safe to drink.”

The problems with bottled water are compounded in developing countries, where the public infrastructure for water purification and distribution is often inadequate — or even non-existent. International financial institutions, first-world governments, and the transnational corporations that stand to profit from lack of access to potable water have teamed up to pressure poor countries to slash public waterworks funding — or even sell the infrastructure to a for-profit corporation.

As a key component of “structural adjustment,” the International Monetary Fund and World Bank often make their loans contingent on the privatization of a nation’s water supply. In September 2001, the World Bank approved a loan of US$100 million for “structural adjustment” in Ghana. Before releasing the funds, however, the World Bank insisted that the nation’s government increase water and electricity tariffs by 94 to 96 percent to pay for “operating costs.” The government complied, and as a result much of the population could no longer afford access to clean water. Currently the Ghana Water Company is leased to Aqua Vitens Rand Ltd., a product of the merger of the Dutch water company Vitens with the South African water distributor Rand Water.

Many communities within the US and Canada are increasingly wary of allowing bottled water companies to deplete local water supplies to fuel their growing industry. Citizens of Maine, Florida, Wisconsin, New Hampshire and Ontario have all protested local and international bottled water companies that have wanted to set up shop in their towns and drain local water reserves for wider distribution. Corporate Accountability International reports that Coca-Cola, vendor of numerous bottled waters, has depleted water supplies in some areas of India, leaving affected communities struggling for survival.

Clean drinking water is already out of reach for over one billion people around the world. With water increasingly being sold as a private commodity, the lives of impoverished people in water-scarce regions are increasingly in the hands of corporations. But communities saddled with suddenly sky-high water costs as a result of privatization schemes are fighting back. In 1999, the World Bank advised the Bolivian government to sell off its public water infrastructure to private corporations. With California-based Bechtel Corporation at the helm, the price of water in the city of Cochabamba suddenly doubled, and Bolivians took to the streets in protest. The resulting civil unrest — the “water war,” as it became known — eventually forced the government to break its US$200 million contract with Bechtel in April 2000.

This scenario is not unrelated to the booming North American market for bottled water. Each time a bottle of water is purchased, transnational corporations whose thirst for profit outweighs their concern for public welfare grow stronger. If the people of Bolivia had to resort to massive civic revolt to regain access to this vital natural resource, perhaps we had better start examining what sort of corporate practices our purchases endorse. We may discover that our own future is not so far removed from the “water wars” waged in Bolivia.


Most 'Arrested by Mistake'

Coalition intelligence put the figure at 70% to 90% of Iraq prisoners, says a February Red Cross report that details further abuses.

By Bob Drogin
From
Common Dreams
2004

Coalition military intelligence officials estimated that 70% to 90% of prisoners detained in Iraq since the war began last year "had been arrested by mistake," according to a confidential Red Cross report given to the Bush administration earlier this year.

Yet the report described a wide range of prisoner mistreatment — including many new details of abusive techniques — that it said U.S. officials had failed to halt, despite repeated complaints from the International Committee of the Red Cross.


ICRC monitors saw some improvements by early this year, but the continued abuses "went beyond exceptional cases and might be considered as a practice tolerated" by coalition forces, the report concluded.

The Swiss-based ICRC, which made 29 visits to coalition-run prisons and camps between late March and November last year, said it repeatedly presented its reports of mistreatment to prison commanders, U.S. military officials in Iraq and members of the Bush administration in Washington.

The ICRC summary report, which was written in February, also said Red Cross officials had complained to senior military officials that families of Iraqi suspects usually were told so little that most arrests resulted "in the de facto 'disappearance' of the arrestee for weeks or even months."

The report also described previously undocumented forms of abuse of prisoners in U.S. custody. In October, for example, an Iraqi prisoner was "hooded, handcuffed in the back, and made to lie face down" on what investigators believe was the engine hood of a vehicle while he was being transported. He was hospitalized for three months for extensive burns to his face, abdomen, foot and hand, the report added.

More than 100 "high-value detainees," apparently including former senior officials in Saddam Hussein's regime and in some cases their family members, were held for five months at the Baghdad airport "in strict solitary confinement" in small cells for 23 hours a day, the report said.

Such conditions "constituted a serious violation" of the Third and Fourth Geneva Conventions, which set minimum standards for treatment of prisoners of war and civilian internees, the report said. U.S. intelligence agencies, including the CIA and the Defense Intelligence Agency, conducted interrogations at the site, but Army units were in charge of custody operations, officials said Monday.

Portions of the ICRC report were published last week. The full 24-page report, which The Times obtained Monday, cites more than 250 allegations of mistreatment at prisons and temporary detention facilities run by U.S. and other occupation forces across Iraq.

The report also referred to, but provided no details of, "allegations of deaths as a result of harsh internment conditions, ill treatment, lack of medical attention, or the combination thereof."

Spokesmen at the Pentagon and at U.S. Central Command headquarters said they had not seen the ICRC report and could not comment on specific charges. ICRC officials in Geneva said they regretted that the document became public. The ICRC usually shares its findings only with governments or other authorities to maintain access to detainees held in conflicts around the world. Among the abusive techniques detailed in the report was forcing detainees to wear hoods for up to four consecutive days.

"Hooding was sometimes used in conjunction with beatings, thus increasing anxiety as to when blows would come," the report said. "The practice of hooding also allowed the interrogators to remain anonymous and thus to act with impunity."

In some cases, plastic handcuffs allegedly were so tight for so long that they caused long-term nerve damage. Men were punched, kicked and beaten with rifles and pistols; faces were pressed "into the ground with boots." Prisoners were threatened with reprisals against family members, execution or transfer to the U.S. lockup at Guantanamo Bay, Cuba. The report also provides new details about the now-notorious Abu Ghraib prison, the focus of the prisoner abuse scandal.

During a visit to the "isolation section" of Abu Ghraib prison in October, ICRC delegates witnessed prisoners "completely naked in totally empty concrete cells and in total darkness, allegedly for several consecutive days."


A military intelligence officer, who is not identified in the report, told the ICRC monitors that such treatment was "part of the process" in which prisoners were given clothing, bedding, lights and toiletries in exchange for cooperation.

The ICRC sent its report to the military police brigade commander in charge of Abu Ghraib after the October visit, and the commander responded Dec. 24, a senior Pentagon official said last week. But the Pentagon did not launch a formal investigation into abuses at the prison until a low-ranking U.S. soldier approached military investigators Jan. 13 and gave them a computer disc of photos.

The ICRC report also describes torture and other brutal practices by Iraqi police working in Baghdad under the U.S.-led occupation. It cites cases in which suspects held by Iraqi police allegedly were beaten with cables, kicked in the testicles, burned with cigarettes and forced to sign confessions.

In June, a group of men arrested by Iraqi police "allegedly had water poured on their legs and had electrical shocks administered to them with stripped tips of electrical wires," the report notes. One man's mother was brought in, "and the policeman threatened to mistreat her." Another detainee "was threatened with having his wife brought in and raped."

"Many persons deprived of their liberty drew parallels between police practices under the occupation with those of the former regime," the report noted.


How Free is the Free Market?

By Noam Chomsky
From
Lip Magazine
05.15.97

THE FREE MARKET IS SOCIALISM FOR THE RICH. The public pays the costs and the rich get the benefit—markets for the poor and plenty of state protection for the rich.

There's a conventional doctrine about the era we're entering and the promise that it's supposed to afford. In brief, the story is that the good guys won the Cold War and they're firmly in the saddle. There may be some rough terrain ahead, but nothing that they can't handle. They ride off into the sunset, leading the way to a bright future, based on the ideals that they've always cherished: democracy, free markets and human rights.


In the real world, however, human rights, democracy and free markets are all under serious attack in many countries, including the leading industrial societies. Power is increasingly concentrated in unaccountable institutions. The rich and the powerful are no more willing to submit themselves to market discipline or popular pressures than they ever have been in the past.

Basic Rights
Let's begin with human rights, because it's the easiest place to start: they're actually codified in the Universal Declaration of Human Rights, passed unanimously by the United Nations General Assembly in December, 1948. In the United States there's a good deal of very impressive rhetoric about how we stand for the principle of the universality of the Universal Declaration, and how we defend the principle against backward, Third World peoples who plead cultural relativism.

All this reached a crescendo about a year ago, at the Vienna Conference. But the rhetoric is rarely besmirched by any reference to what the Universal Declaration actually says. Article 25, for example, states: "Everyone has the right to a standard of living adequate for the health and well-being of himself and his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood."

How are these principles upheld in the richest country in the world, with absolutely unparalleled advantages and no excuses for not completely satisfying them? The US has the worst record on poverty in the industrialized world: a poverty level which is twice as high as England's. Tens of millions of people are hungry every night, including millions of children who are suffering from disease and malnutrition. In New York City 40% of children live below the poverty line, deprived of minimal conditions that offer some hope of escape from misery and destitution and violence.

Work & Debt in the New World Marketplace
Let's turn to Article 23. It states: "Everyone has a right to work under just and favourable conditions." The International Labor Organization (ILO) has just published a report estimating the level of global unemployment—understood to mean not having enough work for subsistence—in January 1994 at about 30%. That, it says accurately, is a crisis worse than that of the 1930s. It is, moreover, just one part of a general worldwide human rights catastrophe. UNESCO estimates that about 500,000 children die every year from debt repayment alone. Debt repayment means that commercial banks made bad loans to their favourite dictators, and those loans are now being paid by the poor, who have absolutely nothing to do with it, and of course by the taxpayers in the wealthy countries, because the debts are socialized. That's the system of socialism for the rich that we call free enterprise: nobody expects the banks to have to pay for the bad loans; that's your job and my job.

Meanwhile, the World Health Organisation estimates that 11 million children die every year from easily treatable diseases. WHO's head calls it a silent genocide: it could be stopped for pennies a day.

In the US, of course, there is currently a recovery. But it's remarkably sluggish, with less than a third of the job growth of the previous six recoveries. Furthermore, of the jobs that are being created, an enormous proportion (more than a quarter in 1992) are temporary jobs, and most are not in the productive part of the economy. Economists welcome this vast increase in temporary jobs as an "improvement in the flexibility of labour markets". No matter that it means that when you go to sleep at night you don't know if you're going to have work the next morning; it's good for profits, not people, which means that it's good for the economy in the technical sense.

Another aspect of the recovery is that people are working longer for less money. The workload is continuing to increase, while wages are continuing to decline, which is unprecedented for a recovery. US wages as measured by labour costs per unit output are now the lowest in the industrial world, except for Britain. In 1991 the US even went below England, although England caught up and regained first place in the competition to crush poor and working people. Having been the highest in the world in 1985 (as one might expect in the world's richest country), US labour costs are today 60% lower than Germany's and 20% lower than Italy's. The Wall Street Journal called this turnaround "a welcome development of transcendent importance". It is usually claimed that these welcome developments just result from market forces, like laws of nature, and the usual factors are identified, such as international trade and automation. To put it kindly, that's a bit misleading: neither trade nor automation has much to do with market forces.

The Myth of Free Trade
Take trade. One well-known fact about trade is that it's highly subsidized, with huge market-distorting factors which I don't think anybody's ever tried to measure. The most obvious is that every form of transport is highly subsidized, whether it's maritime, aeronautical, or roads or rail. Since trade naturally requires transport, the costs of transport enter into the calculation of the efficiency of trade. But there are huge subsidies to reduce the costs of transport, through manipulation of energy costs and all sorts of other market-distorting measures. If anybody wanted to measure this, it would be quite a job.

Take the US Pentagon, a huge affair. A very substantial part of the Pentagon is intervention forces directed at the Middle East, along with the whole panoply of intimidation devices to make sure nobody gets in the way if the US tries to intervene. And a large part of the purpose of that is to keep oil prices within a certain range. Not too low, because the US and British oil companies have to make plenty of profit, and the oil-producing countries also have to earn profits which they can then send back to their masters in London and New York. So, not too low. But also not too high, because you want to keep trade efficient. I'm not even mentioning so-called externalities, like pollution and so on. If the real costs of trade were calculated, the apparent efficiency of trade would certainly drop substantially. Nobody knows how much.

Furthermore, what's called trade isn't trade in any serious sense of the term. Much of what's called trade is just internal transactions, inside a big corporation. More than half of US exports to Mexico don't even enter the Mexican market. They're just transferred by one branch of General Motors to another branch, because you can get much cheaper labour if you happen to cross a border, and you don't have to worry about pollution. But that's not trade in any sensible sense of the term, any more than if you move a can of beans from one shelf to another of a grocery store. It just happens to cross an international border, but it's not trade. In fact, by now it's estimated that about 40% of what's called world trade is internal to corporations. That means centrally-managed transactions run by a very visible hand with major market distortions of all kinds, sometimes called a system of corporate mercantilism, which is fairly accurate.


GATT and NAFTA just increase these tendencies, hence harming markets in incalculable ways. And if we proceed, we find that the alleged efficiencies of trade are to a large extent an ideological construction; they don't have any substantive meaning. With automation, for instance, there's no doubt that it puts people out of work. But the fact of the matter is that automation is so inefficient that it had to be developed in the state sector for decades, meaning the US military system. And the kind of automation that was developed in the state sector, at huge public cost and with enormous market distortion, was a very special kind. It was designed to de-skill workers and to enhance managerial power. This has nothing to do with economic efficiency; it's to do with power relations.

There have been a number of academic and management studies which have shown over and over that automation is introduced by managers even when it increases costs and is inefficient, purely for reasons of power.

Take containerization. It was developed by the US Navy, that is, by the state sector of the economy, which masks the market distortions involved. In general, invocation of market forces, as if they were laws of nature, has a large element of fraud associated with it. It's a kind of ideological warfare. In the post-WWII period this includes just about everything: electronics, computers, biotechnology and pharmaceuticals, for instance, were all initiated and maintained by enormous state subsidies and intervention; otherwise they would not exist. Computers, for example, were virtually 100% supported by the taxpayer in the 1950s, before they were marketable. About 85% of all electronics was state-supported in the 1980s. The idea is that the public is supposed to pay the costs. If anything comes out of it, you hand it over to the corporations. It's called free enterprise!

All of this quite sharply increased under the Reagan administration. The state share of GNP rose to new heights in the first couple of years of the Reagan administration. And they were proud of it. To the public they had all kinds of free-market talk, but when they were talking to the business community, they talked differently. So James Baker, when he was Secretary of the Treasury, announced with great pride to a business convention that the Reagan administration had offered more protection to US manufacturers than any of the preceding post-war administrations, which was true, but he was being too modest; it actually offered more protection than all of them combined.

One of the reasons why Clinton had unusual corporate support for a Democrat is that he planned to go even beyond that level of market distortion and market interference, for the benefit of domestic-based capital. His Secretary of the Treasury, Lloyd Bentsen, was quoted in the Wall Street Journal as saying, "I'm tired of this level playing field business. We want to tilt the playing field in favour of US industry." Meanwhile, there's a lot of very passionate rhetoric about free markets but, of course, that's free markets for the poor, at home and abroad.

The fact is that people's lives are being destroyed on an enormous scale through unemployment alone. Meanwhile, everywhere you turn you find work that these people would be delighted to do if they had a chance, work that would be highly beneficial both for them and their communities. But here you have to be a little careful. It would be beneficial to people, but it would be harmful to the economy, in the technical sense. And that's a very important distinction to learn. All of this is a brief way of saying that the economic system is a catastrophic failure. There's a huge amount of needed work. There's an enormous number of idle hands of suffering people, but the economic system is simply incapable of bringing them together. Now of course this catastrophic failure is hailed as a grand success. And indeed it is, for a narrow sector of the privileged: profits are skyrocketing. The economy is working just fine for some people, and they happen to be the ones who write the articles and give the speeches, so it all sounds great in the intellectual culture.

Globalized Currencies
Looking at these major tendencies, especially in the past twenty years, one crucial event was Richard Nixon's demolition of the Bretton Woods system in the early 1970s. That was the post-war system for regulating international currencies, with the US serving as a kind of international banker. He dismantled that with a lot of consequences.

One effect of the de-regulation of currencies was a huge increase in capital in financial markets. The World Bank estimated it at about 14 trillion dollars, which totally swamps governments. And the amount of capital that's being transferred daily is increasing. It's probably now about a trillion dollars a day, again swamping governments.

In addition to a huge increase in the amount of unregulated capital, there has also been a very radical change in its composition. John Eatwell, an economist at Cambridge and a specialist on finance, pointed out recently that in 1970—before Nixon dismantled the system—about 90% of the capital used in international financial transactions was for long-term investment and trade, and about 10% for speculation. Now the figures have reversed: it's 90% for speculation and about 10% for investment and trade. Eatwell suggested that this may be a big factor in the considerable decline in growth rates since 1970.

The USA is the richest country in the world and it can't carry out even minimal economic planning because of the impact of speculative, unregulated capital. For a Third World country the situation is hopeless: there's no such thing as economic planning. Indeed, the new GATT agreements are designed to undercut those possibilities by extending so-called liberalization to what they call "services," meaning that big Western banks (the Japanese, British and American banks) can displace the banks in smaller countries, eliminating any possibility of domestic national planning.

The accelerating shift from a national to a global economy has the effect of increasing polarization across countries, between rich and poor countries, but also, even more sharply, within countries. It also has the effect of undermining functioning democracy. We're moving to a situation in which capital is highly mobile and labour is immobile, and becoming more immobile. It means that it's possible to shift production to low-wage, high-repression areas with low environmental standards. It also makes it very easy to play off one immobile, national labour force against another.

During the NAFTA debate in the United States just about everybody agreed that the effect of NAFTA would be to lower wages in the United States for what are called unskilled workers, which means about 70% or 75% of the workforce. In fact, to lower wages you don't have to move manufacturing, you just have to be able to threaten to do it. The threat alone is enough to lower wages and increase temporary employment.

The Growth of Transnational Rule
Consider the matter of democracy. Power is shifting into the hands of huge transnational corporations, which means away from parliamentary institutions. Furthermore, there's a structure of governance coalescing around these transnational corporations. This is not unlike the developments of the last couple of hundred years, when national states more or less coalesced around growing national economies. Now that you've got a transnational economy, you're getting a transnational state, not surprisingly. The Financial Times described this as a de facto world government, including the World Bank and the IMF, GATT (now the World Trade Organisation), the G7 executive, and so on. These transnational bodies remove power from parliamentary institutions. It's important to keep the technocrats insulated—that's World Bank lingo: you want to make sure you have "technocratic insulation." The Economist magazine describes how it's important to keep policy insulated from politics.

Power is drifting not only to corporations but into the structures around them—all of them completely unaccountable. The corporation itself has a stricter hierarchy than exists in almost any other human institution. That's a sure form of totalitarianism and unaccountability, the economic equivalent of fascism, which is exactly why corporations were so strongly opposed by classical liberals.

Thomas Jefferson, for example, who lived just about long enough to see the early development of the corporate system, warned in his last years that what he called banking institutions and moneyed incorporations would simply destroy liberty and restore absolutism, eliminating the victories of the American Revolution. Adam Smith was also concerned about their potential power, particularly if they were going to be granted the rights of "immortal persons".

The end of the Cold War accelerates all this. The Financial Times, for example, had an article called "Green shoots in communism's ruins"; one of the good things it saw going on was that the pauperization of the workforce and a high level of unemployment were offering new ways to undercut "pampered Western European workers" with their "luxurious lifestyles".

A British industrialist explained in the Wall Street Journal that when workers see jobs disappearing, it has a salutary effect on people's attitudes. This was part of an article praising the Thatcher reforms for bringing about a low-wage, low-skill economy in England with great labour flexibility and wonderful profits. Take General Motors, already the biggest employer in Mexico: it is now moving into Eastern Europe, but in a very special way. When General Motors set up a plant in Poland it insisted upon high tariff protection; similarly, when Volkswagen set up a plant in the Czech Republic it insisted on tariff protection and also on externalization of costs. They want the Czech people and the Czech Republic to pay the costs; they just want the profit, and they get it. That's the tradition: markets for the poor and plenty of state protection for the rich.

The biggest test is Poland, a country where multinational corporations can get people who are well-trained and well-educated, who have blue eyes and blond hair, unlike in the Third World, and who will work for 10% of your wages, with no benefits, because of the effectiveness of capitalist reforms in pauperizing the population and increasing unemployment.

That in fact tells us something about what the Cold War was about. We learn a lot just by asking a simple question: who's cheering and who's despairing? If we take the East, who's cheering? The old Communist Party hierarchy; they think it's wonderful. They are now working for international capitalism. What about the population? Well, they lost the Cold War. They're in despair, despite their victory over the Soviet experiment.

What about the West? There's a lot of cheering from the corporations, banks and management firms whose experts were sent to Eastern Europe to clinch a "friendly takeover," as the Wall Street Journal put it, but who, it turns out, ran away with the aid. Very little of the aid got there; instead it went into the pockets of the Western experts and management firms. The workers at General Motors and Volkswagen also lost the Cold War, because its end just gives management another weapon to undermine their "luxurious lifestyles".

These misnamed free trade agreements, GATT and NAFTA, carry that process forwards. They are not free trade agreements but investor rights agreements and they are designed to carry forward the attack on democracy. If you look at them closely, you realize they are a complicated mixture of liberalization and protectionism carefully crafted in the interests of the transnational corporations. So, for example, GATT excludes subsidies except for one kind: military expenditures.

Military expenditures are a huge welfare system for the rich and an enormous form of government subsidy that distorts markets and trade. Military expenditures remain very high: under Clinton they're higher in real terms than they were under Nixon, and they are expected to go up. That is a system of market interference and benefits for the wealthy.

Ownership of Information
Another central part of the GATT agreement, and of NAFTA, is what are called intellectual property rights, which is protectionism: protection for ownership of knowledge and technology. They want to make sure that the technology of the future is monopolized by huge and generally government-subsidized private corporations. GATT includes an important extension of patents from processes to products; this means that if someone designs a new, smarter technique for producing a patented drug, they can't use it, because the product itself is under patent. Product patents reduce economic efficiency and cut back technical innovation. France, for example, had product patents about a century ago, and that was a reason why it lost a large part of its chemical industry to Switzerland, which didn't, and therefore could innovate.

It means that a country like India, where there is a big pharmaceutical industry which has been able to keep drug costs very low simply by designing smarter processes for producing things, cannot do that any longer.

Right after his NAFTA triumph Clinton went off to the Asia Pacific summit in Seattle, where he proclaimed his "grand vision" of the free-market future. The corporation to emulate, he suggested, was Boeing, and in fact he gave the speech about this grand vision in a hangar of the Boeing Corporation. That was a perfect choice, as Boeing is an almost totally subsidized corporation. In fact, the aeronautical industry, now the leading US export industry, couldn't survive in the 1930s; then the war came along and it made a huge amount of money, but it was understood right after the Second World War that it was not going to survive in the market. If you read Fortune magazine from that period, it would explain that the aeronautical industry can't survive in the market: the public has to come in and subsidize it. And in fact the aircraft industry, which includes avionics and electronics and complicated metallurgy, is simply subsidized through the Pentagon and NASA. This is the model for the free-market future. The profits are privatized, and that's what counts: it's socialism for the rich. The public pays the costs and the rich get the profits. That's what the free market is in practice.



Arsenal Of Illusion

Hollywood know-how is helping to create new kinds of military weapons that target the brain—but not with a bullet.

By Jake MacDonald

From The Walrus
2007

Nearly four years after the destruction of the World Trade Center, I surrendered to a long-held curiosity and joined the United States Army. There’s a popular misconception that you can walk into a recruiting station and sign up. But the American army is the most sophisticated fighting force in history and it doesn’t accept just anyone. After a rigorous interview process and several hours studying the materials, I climbed onto the recruiting bus and headed off to basic training at Fort Benning, Georgia.

At boot camp I learned to handle the M16, the fearsome SAW, and other modern weapons. After qualifying on the shooting range, I donned night-vision goggles and stalked through the spooky corridors of the urban-warfare facility, firing by instinct at pop-up targets of swarthy enemy soldiers or sometimes a shopkeeper armed only with a bagel. After twelve weeks of training, my outfit, the 22nd Infantry Regiment, shipped out to Iraq. Two days later, I got my first taste of combat. I was on patrol near Baji when my Bradley Fighting Vehicle came under sniper fire. I pursued the gunman into a village before realizing we’d been drawn into an ambush. Bullets whizzed by; a rocket-propelled grenade struck me in the chest, transforming my upper body into a mushroom cloud of pink mist and ricocheting my head off a nearby wall. At this point it occurred to me that fighting the war on terror was going to be more challenging than I expected. With a click of the mouse, I went back to reboot camp and started over, humbled but not discouraged. In this man’s army—a computer game called "America’s Army"—getting killed in action is nothing more than a temporary embarrassment.

"America’s Army" is financed and produced by the United States Department of Defense and is designed to lure young men into the forces. But the technology used to create the video game is at the centre of a much larger question that many Americans are beginning to ask themselves: like the teenage boys seduced into playing America’s Army, are they too going to be corrupted just as subtly by the Pentagon’s growing use of digital technology to create false realities? Digital technology has enabled military scientists working at the intersection of fantasy and reality to develop radical new weapons that will target the brain not with a bullet, but through the creation of a seamless fabricated reality. This tactic will, according to psychological war experts, help the American military not only exert behavioural control over the enemy on the battlefield, but, more ominously, over American public opinion.

The US Army used to call this sort of strategy “psyops” (psychological operations) and it even maintains a department of psychological warfare at Fort Bragg, North Carolina. Once dismissed as an idiot uncle of the military establishment, the Civil Affairs and Psychological Operations Command has mutated into a hydra with tentacles in every level of the military. Psyops can now manufacture eerie simulacra of reality, meaning that in the future it will become increasingly difficult to separate real news from combat footage, communiqués, and hostage videos fabricated by all sides for their own purposes. After all, why influence the news when you can invent it and have a digitally created Dan Rather present it? Thomas X. Hammes, a counter-insurgency expert with the US Marine Corps, says these weapons are being employed today to fight the war on terror and will be used even more in the future. “The notion that we can win this fight with a lot of [conventional] war toys is a fantasy,” he says. “It’s really important for people to understand that we’re no longer fighting foreign wars with guns and bombs. We’re fighting with ideas.”


During the 1980s, President Ronald Reagan increased defence spending by 35 percent, to more than $400 billion (US) a year, and promoted the idea of a futuristic missile shield over North America—a notion some scholars believe was inspired by the Paul Newman movie Torn Curtain. The Soviet Union, burdened by an increasingly inefficient economy, couldn’t keep up with US military spending and by 1991 had collapsed. Many hoped that the demise of communism would usher in a new era of global co-operation, but, with the Soviets vanquished, the United States launched its plan to remake the world in its own image.

In 1997, a number of people who are now top officials in the current US administration, including Secretary of Defense Donald Rumsfeld, Vice President Dick Cheney, and national security strategists Paul Wolfowitz and Richard Perle, launched a think tank called the Project for the New American Century. The group argued that it was time to take pre-emptive action to enforce US interests abroad, including removing unfriendly governments. “As the twentieth century draws to a close,” according to the project’s statement of principles, “the United States stands as the world’s pre-eminent power. Having led the West to victory in the Cold War, America faces an opportunity and a challenge. Does the United States have the resolve to shape a new century favorable to American principles and interests?”

Reshaping the world in America’s image would not only involve massive funding to produce new futuristic weapons, it would also require the Pentagon to enlist the support of Hollywood, where the arsenal of digital technology is advancing almost daily. Soon after coming to power in 2001, President George W. Bush acted on the first leg of this strategy when he announced that he was pumping hundreds of millions of dollars into such organizations as DARPA, the Defense Advanced Research Projects Agency, which grants research money to weapons developers. Since then, development has begun on dozens of weapons that close the gap between old-fashioned military hardware and the virtual future. One of the most promising, in the Pentagon’s view, is the Brain Machine Interface, a system of embedded neural transmitters and computer software that bridges thought and action. It is being developed by Duke University scientists, who have already created a computerized system in which a lab monkey can move a robotic arm in a laboratory 1,000 kilometres away just by thinking about it. In the future, military commanders with brain implants will use more advanced versions of this technology to deploy unmanned gun ships and robotic tanks in battlefields half a world away.

Bush has also revived plans to develop new real-world weapons systems, including fighter jets that do not require pilots and a new generation of smart bombs. And he agreed to spend billions on the missile-defence program envisioned by Reagan twenty years earlier. In support of the plan, defence contractor Lockheed Martin is building an airship twenty-five times larger than the Goodyear blimp. The airship will serve as a communications platform where attacks on enemy missiles will be coordinated.


The military is also planning unmanned spaceships that will carry huge tungsten bolts, nicknamed “rods from God,” that can be dropped with devastating impact on even the smallest target anywhere on the planet. Recently, retired Air Force Secretary James G. Roche described these space weapons as mandatory for any twenty-first-century arsenal. “Space capabilities in today’s world are no longer nice to have,” he said. “They’ve become indispensable at the strategic, operational, and tactical levels of war. Space capabilities are integrated with and affect every link in the kill chain.”

As futuristic and powerful as this new generation of weapons will be, Bush, perhaps more than any other recent president, is guided by an idea once espoused by Napoleon: “There are but two powers in the world, the sword and the mind. In the long run the sword is always beaten by the mind.” According to many strategists, even if the United States wins on the battlefield, it must ultimately win over the minds of the citizens of a country it is invading with propaganda in order to remake the world in its own image. Hammes, who has trained insurgents around the world, believes it was precisely the military’s failure to win over the hearts and minds of its enemies that led to the United States’s defeat in a number of conflicts over the past thirty years. Today, it is no closer to winning over Iraq than it was when it invaded in 2003. “We were defeated in Vietnam, Lebanon, and Somalia, and we’ll lose in Iraq the same way,” says Hammes. “We’ll win the battles, but we’ll lose the war [of ideas].”

With the United States engaged in a protracted war against terrorism and bogged down in Iraq, the Pentagon is keenly aware of these past failures. William Arkin, an author and former military affairs analyst for the Los Angeles Times, says that the military is growing frustrated with its inability to stay ahead of the terrorist threat, and is anxious to enlist Hollywood and its digital expertise in its fight. “Traditionally, the military has been an innovative force in technological development,” he says. “But about ten years ago, with the digital revolution, the civilian world really began pulling ahead of the military. The army just can’t compete with Hollywood or Microsoft when it comes to digital wizardry.” Microsoft alone spent $2 billion (US) developing its Xbox game technology. It is that kind of muscular research spending and product development that has convinced the Pentagon that it must break down the walls between the military and the entertainment industry.

The first of several recent high-profile Pentagon initiatives in Hollywood came in 1996, when top military officers travelled to Los Angeles to brainstorm with executives from Industrial Light & Magic, Intel, and Paramount about storylines for their combat simulators. This wasn’t the first time the military had gone to Hollywood. During the 1960s, the CIA was intrigued by the emergence of television and by experiments indicating that moving images produce a shift from left-brain to right-brain neural activity, which in turn induces a sort of chemical trance that suppresses judgment and heightens suggestibility. The researchers learned that once viewers “suspend their disbelief,” they become vulnerable to the values and messages embedded in the drama.

So it wasn’t surprising that soon after the meeting in 1996, the Pentagon proposed a working partnership with Hollywood. Three years later, it announced that it would build a new $45-million (US) production house in Los Angeles and that it intended to hire many of the screenwriters and producers who had attended the meeting. The new facility was designed by Herman Zimmerman, the award-winning designer of a number of Star Trek episodes, and dubbed the Institute for Creative Technologies. The institute soon became a sandbox for forty-five writers, directors, and special-effects technicians, many of them Academy Award nominees.


Their first project was the development of a total-immersion simulator that gives soldiers a preview of real-life combat situations. The simulator consists of a virtual-reality theatre with a 150-degree screen and a Dolby sound system. Inside, young soldiers-in-training can pick their way through a number of spooky combat environments. A typical program recreates a blown-up building strewn with garbage, jagged rebar, concrete, and splintered furniture. Through a hole in the virtual wall the young trainee can peer out at a wasted city, where sparrows dart through the smoke, Arabic music filters up from the street, and a helicopter gunship thunders overhead.

After the attack on the World Trade Center in 2001, the military returned to Hollywood—this time with new urgency—to again meet with studio heads and producers. Their goal: to enlist the entertainment industry in a sweeping campaign to rally public support for the military and the war in Iraq.

According to the entertainment trade paper Variety, those attending the meeting at the Pentagon’s studio included the presidents of CBS, HBO Films, Warner Brothers Television, and prominent producers and writers such as Steven E. de Souza (Die Hard), Joseph Zito (Delta Force One), and Spike Jonze (Being John Malkovich). One of the producers at that October 2001 meeting was Lionel Chetwynd (The Apprenticeship of Duddy Kravitz). “There was a feeling around the table,” he later recalled, “that something is wrong if half the world thinks we’re the Great Satan. Americans are failing to get our message across to the world.”

The meeting was off-limits to the media, and Chetwynd revealed little else. But a White House spokesperson later said that the government was asking movie moguls for their help in selling America’s image to audiences around the world. Said the spokesperson: “The administration will share with studio executives the themes we’re communicating at home and abroad, of patriotism, tolerance, and courage.” Military officials also reminded the producers of certain “resources we might have in government [that would] be helpful to them.”

David Robb, a former investigative journalist with the Hollywood Reporter and author of Operation Hollywood: How the Pentagon Shapes and Censors the Movies, explains what those “resources” might be. “The government,” he said, “is basically offering filmmakers access to expensive military equipment in exchange for editorial control over their scripts.” He sees a dangerous and growing interdependence between the film industry and the military. “It’s all about money,” he says. “If you’re making a movie that requires F-14 Tomcats or combat helicopters, you can save millions of dollars by making a quid pro quo arrangement with the Pentagon. They’ll loan you the equipment for peanuts—for the price of fuel, let’s say—if you let them control the script. Everybody is happy. The filmmakers get access to war toys. And the military establishment gets to flog its pro-war message to millions of moviegoers.”


As a result of this partnership, a string of new movies partially subsidized by the Pentagon will soon be showing up at theatres. In the high-profile No True Glory: The Battle for Fallujah, scheduled for release in 2006, Harrison Ford will play a heroic American general leading his troops into a hornet’s nest of insurgents in the Iraqi city. It’s unclear to what extent the Pentagon influenced the script, but the screenwriter said the movie “will focus on the bravery of our soldiers and point out why our military can be relied upon to do the right thing.”

To ensure their pictures cast US soldiers in the best possible light, producers who want access to military hardware must submit their scripts to the Pentagon. And according to Robb, military censors “always” insist on a rewrite. “They don’t ask for revisions in the script,” he says. “They tell you.” There are countless stories of the Pentagon trying to bully producers. Clint Eastwood, for example, was infuriated when the Pentagon refused to support Heartbreak Ridge because it contained a scene in which Eastwood’s character shoots a wounded Cuban soldier.

Once the revised script satisfies the military, the Pentagon dispatches a “minder” to the set, to make sure the story isn’t changed at the last minute. “Their main criterion,” says Robb, “is that a script has to ‘aid in the retention and recruitment of personnel.’ But Hollywood has crossed the line into the glorification of war. We’re getting a steady diet of this kind of propaganda, and I honestly believe it’s making us into a more warlike people.”

During the 1990s, before aligning itself with Hollywood, the military had conducted digital morphing experiments at Los Alamos National Laboratory in New Mexico, the birthplace of the atom bomb. They attempted to recreate individual voices by “dragging and dropping” taped words into sentences, but the results invariably sounded phony and robotic. George Papcun, an expert in phonetic synthesis at Los Alamos, later improved the technology and used it to develop several fictive scenarios, including one in which General Colin Powell had been kidnapped by terrorists. Before a group of officers gathered for the demonstration, Powell announced, “I am being treated well by my captors.” In another demonstration, Papcun played an audiotape that had supposedly just been received from General Carl W. Steiner, former commander of the Special Operations Command. “Gentlemen!” said Steiner. “We have called you together to inform you that we are going to overthrow the United States government.”

But these experiments were amateurish compared to the work being done by technicians in Hollywood and Silicon Valley. Their sophisticated digital morphing techniques first appeared on movie screens in 1991, when audiences across the world gasped as a snake-eyed killer robot in a policeman’s uniform morphed out of a tile floor in Terminator 2: Judgment Day. Two years later, Jurassic Park used the same technology. It is the expertise behind the creation of these lifelike digital scenarios that the Pentagon covets. In the movie In the Line of Fire, for example, Clint Eastwood plays an aging Secret Service agent who happened to be on duty in Dallas on the day President John Kennedy was shot. To send Eastwood’s character back to 1963, Hollywood computer specialists used digital morphing to lift Eastwood from an early Dirty Harry movie, gave him a military haircut and a skinny tie, and dropped him into an actual news clip of Kennedy’s assassination, now showing Eastwood rushing to the president’s side.

Ford Motor Company also blurred the boundary between fantasy and reality when it raised Steve McQueen from the dead to advertise its 2005 Mustang. In a scenario cribbed from the movie Field of Dreams, a young farmer builds a winding racetrack on his farm. He circles the track a few times in his new Mustang, then McQueen (who died in 1980) comes sauntering through the cornstalks. The farmer flips the keys to McQueen, who roars off in the car, which Ford designed in homage to the 1968 muscle car McQueen drove in the classic movie Bullitt.

So would the military use similar technology to fabricate news clips, communiqués from insurgent fighters, and videotaped confessions from foreign villains? Perhaps the better question is, why wouldn’t it? Imperial powers have always used disinformation and deceit to advance their military goals. And the American military openly admits that deception will be an important tool in the wars of the future.

Creating virtual worlds to control public opinion and influence the battlefield was the disturbing theme of a paper entitled “Psyop Operations in the 21st Century,” published by the United States Army War College in 2000. The author enthuses over the possibility of using digital morphing techniques to create “simulated and reproduced voices, fabricated provocative speeches delivered by virtual heads of state, and projected images of actual life situations.” The paper concludes ominously that the twenty-first century will be “an amazing place” for achieving “mind and behavior control.”

Imagine, for example, the digitally reproduced president of a small country the United States is fighting appearing on his country’s television network and ordering his army to put down their weapons, or a similarly recreated leader of a democratic faction in Iran inviting the American army into the country to rescue them and dismantle Iran’s growing nuclear program. In fact, Arkin notes that during the Gulf War, this is precisely the technology psychological war planners wanted to deploy in a bid to destroy Saddam Hussein’s reputation with his allies. “They considered faking a video that showed Saddam indulging in sexual perversions, crying like a baby, and exhibiting other types of unmanly behaviour,” he says. “But they backed off because they were concerned about a bad reaction from their Arab partners.” Arkin says the United States also crafted a plan to project a huge holographic image of Allah into the skies over Baghdad, urging Iraqis to overthrow Hussein.

In the end, the military didn’t proceed with these initiatives, but Arkin says the Pentagon’s reservations were strictly logistical. “They scrapped the hologram because it required huge mirrors,” he says. “And there were other concerns, such as what is Allah supposed to look like? I don’t think ethics played any role at all in the decision to back off. These people tend to put military objectives ahead of ethics. And that’s worrisome.” While Arkin says the Pentagon has yet to directly target Americans, it has the growing capability—and perhaps motive—to do so. “There’s no evidence they’ve cooked up faked videos to influence public opinion here at home,” he says. “But there’s a danger that in the ongoing fight for hearts and minds, [it] may prove too tempting to resist.”

There are disturbing precedents. The head of the CIA resigned last year after acknowledging that intelligence reports concerning Hussein’s nuclear weapons arsenal, which were used to justify the attack on Iraq, may have been faulty. Some analysts believe they were deliberately fabricated. If such false information can be used to sway public opinion today on such a critical issue, it’s not hard to imagine, for instance, North Korean leader Kim Jong Il appearing digitally on the CBS Evening News boasting about his arsenal of nuclear weapons and his plans to use them against the United States.

Some veteran psychological war operatives believe the military has already crossed that boundary and is moving toward manufacturing virtual newscasts. Retired Army Colonel John B. Alexander is a former intelligence officer with the US Army and author of Future War: Non-lethal Weapons in Modern Warfare. Does he believe that the Pentagon would invent the news Americans are watching to achieve military objectives? “As sure as a heart attack,” he says, without hesitation. “I guarantee they’re doing it already.”

There may not be evidence to prove Alexander’s allegations, but the military appears intent on enshrouding the American public in a cloud of digital illusion. Clear indications of this surfaced in February 2000, when Colonel Christopher St. John, commander of one of the army’s psychological operations groups, gave a speech in which he called for “greater co-operation between the armed forces and media giants.” With some pride, he revealed that his team had managed to embed psychological war operatives from Fort Bragg at CNN, where they were doing editorial work. While it isn’t clear what the intent of the operation was, they were recalled when their presence at CNN was revealed. Admitted Major Thomas Collins of the US Army Information Service: “They worked as regular employees of CNN and helped in the production of news.”

It’s impossible to run an empire without young men who are willing to risk their lives on foreign shores. With troops stationed in more than 120 nations around the world, the US military is now more widely deployed than at any time since World War II. Virtually every soldier with combat training has been sent overseas, and the Army Reserve, made up largely of weekend soldiers, accounts for about 40 percent of the troops in Iraq. The army estimates that it will need at least 74,000 fresh recruits annually to sustain these troop levels. But since the reinstatement of the draft is widely viewed as politically unworkable, where will all those young recruits come from?

The hope is that such video games as America’s Army, which was the brainchild of United States Military Academy professor Colonel Casey Wardynski, will seduce young men into joining. The army budgeted $7 million to develop the project, and Wardynski partnered up with Michael Capps, a virtual-reality engineer at the United States Military Academy with four degrees ranging from mathematics to creative writing. After three years in the marketplace, America’s Army has proved to be a smash hit, with players having logged more than 60 million hours of online combat.

To promote the game, real soldiers hold tournaments at video-gaming conventions and visit youth-oriented events such as NASCAR stock car races, where they set up kiosks stuffed with army paraphernalia, real weapons, and computer terminals at which kids can try America’s Army. The US Air Force, the Marines, and the Special Forces have also produced their own games. All this is clearly part of a much broader strategy—one in which digital games, so effective in bringing teenage boys into the army, are expanding to turn the field of battle, and the struggle for the public mind, into a virtual game in which reality and fiction merge.

Of course, as it stands today, reality is still all too real for soldiers in the field. The day Patrick Resta, twenty-six, arrived in Iraq, one of the soldiers in his unit had half his head blown off by a roadside bomb. They bagged his body and set up camp as local villagers shot rocket-propelled grenades at their encampment. “In the confusion,” he says, “this car came down the road, dragging a piece of metal and throwing off sparks. The next thing you know, thirty guys from my unit opened fire on the car, which, as it turned out, contained three innocent civilians, one of them a twelve-year-old boy. This is all in my first three hours in the country. My entire tour of duty was a complete clusterfuck.” He now volunteers for a group of ex-soldiers called Veterans for Peace, visiting schools in the Philadelphia area and telling kids about the reality of war. “It’s not a video game. You’re shooting real human beings, and it’s a horrific thing. These army recruiters show up in their crisp uniforms to talk about adventure, heroism, free college tuition, and so on. The kids are young, and they don’t think their own government would lie to them. But I tell them, ‘Hey, they’re lying, to everybody. Don’t believe any of it.’” - Published July 2005


Jake MacDonald lives in Winnipeg. His latest book, With the Boys (Greystone, 2005), is a collection of essays chronicling the author's experiences among men and nature.