International Journal for Crime, Justice and Social Democracy
Raphael Cohen-Almagor[1]
University of Hull, United Kingdom
Abstract
This article aims to address two questions: how does hate speech manifest on North American white supremacist websites; and is there a connection between online hate speech and hate crime? Firstly, hate speech is defined and the research methodology upon which the article is based is explained. The ways that ‘hate’ groups utilize the Internet and their purposes in doing so are then analysed, with the content and the functions of their websites as well as their agenda examined. Finally, the article explores the connection between hate speech and hate crime. I argue that there is sufficient evidence to suggest that speech can and does inspire crime. The article is based in the main on primary sources: a study of many ‘hate’ websites; and interviews and discussions with experts in the field.
Keywords
Bigotry; hate crime; hate site; hate speech; racism; violence.
Please cite this article as:
Cohen-Almagor R (2018) Taking North American white supremacist groups seriously: The scope and challenge of hate speech on the Internet. International Journal for Crime, Justice and Social Democracy 7(2): 38-57. DOI: 10.5204/ijcjsd.v7i2.517.
This work is licensed under a Creative Commons Attribution 4.0 Licence. As an open access journal, articles are free to use, with proper attribution, in educational and other non-commercial settings. ISSN: 2202-8005
You may write me down in history
With your bitter, twisted lies,
You may trod me in the very dirt
But still, like dust, I’ll rise.
... You may shoot me with your words,
You may cut me with your eyes,
You may kill me with your hatefulness,
But still, like air, I’ll rise.
Angelou (1978)
On 12 August 2017, James Alex Fields Jr rammed his car into a crowd of anti-fascist protesters united against a white supremacist rally, Unite the Right, in Charlottesville, Virginia, United States of America (USA). Fields killed 32-year-old Heather Heyer and injured dozens of others. Prior to this attack, Fields associated himself with the alt-right movement,[2] which includes white supremacists and neo-Nazis. On his Facebook account, Fields expressed support for racism and extreme right-wing movements. Photos showed Fields with members of Vanguard America, a neo-Nazi group that is part of the Nationalist Front.
This incident illustrates the danger that the white supremacist movement poses to American society, and the close connection between online hate and hate crimes. Yet many liberals, in the USA in particular, tend to object to general hate speech regulation. They believe that legal restrictions on racist or hate speech are not warranted because they violate the speaker’s autonomy. Baker (1992, 1997), for instance, argues that almost all of the harm inflicted by free speech is ultimately mediated by the mental processes of the audience: listeners determine their own response to speech, so any consequences of that response must, in the end, be attributed to the listeners. The result is the right of speakers to present their views even if assimilation by the listeners leads to or constitutes serious harm. Baker (1997, 2012), like many American liberal philosophers and First Amendment scholars, wishes to protect freedom of expression notwithstanding the harm that the speech might inflict on the audience (Abrams 2017; Cohen 1993; Hardin 2002; Meiklejohn 1965, 2000; Nye, Zelikow and King 1997; Richards 1986; Scanlon 1979, 1995; Stone 2005; Volokh 2003, 2015; also interviews and discussions with those listed in Appendix 2 with citation identifier A). Consequently, many of my interviewees argue that American liberals tend to underestimate the harm in hate speech (interviews and discussions with those listed in Appendix 2 with citation identifier B).
Rather than speculating about what racists are saying, this article presents evidence that is largely based on primary sources. First, it reports on the author’s long-term study of ‘hate’ websites, documenting what radical members of the White movement are saying. In this respect, the article aims to illustrate the apparent mind-set, concerns and language of the people who contribute to the hate sites. Second, it is informed by interviews and discussions with over 40 specialists (identified in Appendix 2) in seven countries over the period 2006-2016. The article documents links between speech and action, arguing that hate speech should not be dismissed as ‘mere speech’ and that the preferred American liberal approach of fighting ideas with ideas, speech with speech, is insufficient. Hate speech needs to be taken more seriously by the legal authorities than it currently is.
This article addresses two questions: how does hate speech manifest on North American white supremacist websites; and is there evidence of a connection between online hate speech and hate crime? This discussion is timely because hate speech is a significant problem worldwide, especially on the Internet. A better understanding, achieved through research, and intelligent methods of intervention are required, taking into account the right to freedom of expression and people’s need for self-expression. The article is largely descriptive, providing evidence about the magnitude of white racism—an alternative term for white supremacy—on the Internet and its relationship with hate crimes. Elsewhere I have discussed the various ways and means by which it is possible to fight hate and bigotry on the Internet (Cohen-Almagor 2011), provided a critical analysis of the American protection of hate speech (Cohen-Almagor 2016), and analysed the question of whether law is appropriate to regulate hateful and racist speech (Cohen-Almagor 2012b). Having analysed the growing role of social media in spreading anti-social and violent speech (Cohen-Almagor 2015), I have concluded that the Internet as such is not the problem. The Internet, like most technologies, can be abused by its users. It is cheap, easily accessible, diffuse and largely unregulated; it reaches wide audiences, provides tools to anonymize speakers and empowers users to have their say outside the traditional media.
This article firstly defines the concepts of hate speech and akrasia (acting against one’s better judgement) and explains the methodology. I then analyse the ways that hate groups are utilizing the Internet and their purposes in doing so. Internet hate can be found on thousands of websites, file archives, chat rooms, newsgroups and mailing lists. To understand the challenge of responding to this hate, it is important to reflect on the content and agenda of hate mongers. It is also important to examine the functions of hate sites. Like many other social groups, hate groups utilize the Internet to socialize and to link with other people. Like other ideological and political groups, hate sites also aim to raise funds, share a particular worldview, propagate ideas and recruit new members. Like terrorist groups, hate groups also engage in the promotion of violence (Cohen-Almagor 2012a, 2017a).
The final part of the article emphasises the connection between hate speech and hate crime. Hate crime concerns the threat or the use of physical harm motivated by prejudice (Perry 2001, 2005: 121-137; USLegal.com website). Not all hate speech leads to hate crime, and hate crimes may, but do not necessarily, involve hate speech. Still, there is sufficient evidence to suggest that speech can and does inspire crime.
Hate speech is not a simple concept. Hate speech definitions concern both legal and illegal speech. The same speech might be illegal in Canada but legal in the USA. For example, while Canada does not permit the establishment of a Nazi party, the USA does not ban the American Nazi Party (Cohen-Almagor 2005; Neier 1979; Village of Skokie v The National Socialist Party of America 1978). Different countries also have different stances on Holocaust denial, which is a form of hate speech (Behrens, Jensen and Terry 2017; Cohen-Almagor 2009; Appendix 2, citation identifier C). The concept of hate speech thus contains a variety of speech and behaviour on the Internet, as well as various motivations for production of that speech. For this reason, the sources of hate speech are manifold. Here my concern is solely with racist, white supremacist hate speech. Previously, I have defined hate speech as ‘a bias-motivated, hostile, malicious speech aimed at a person or a group of people because of some of their actual or perceived innate characteristics’ (Cohen-Almagor 2011). Hate speech expresses ‘discriminatory, intimidating, disapproving, antagonistic and/or prejudicial attitudes toward those characteristics, which include sex, race, religion, ethnicity, colour, national origin, disability, or sexual orientation’ (Cohen-Almagor 2011). Hate speech is intended to ‘injure, dehumanize, harass, intimidate, debase, degrade, and victimize the targeted groups, and to foment insensitivity and brutality against them’ (Cohen-Almagor 2011). A hate site is defined as a site that carries any form of hateful textual, visual or audio-based rhetoric.
Akrasia means incontinence or lack of mastery. Akratic people are motivated by emotions and passions rather than reason. When people lack self-control, they may act against their better judgment. In Nicomachean Ethics, Book 7, Aristotle (350 BCE [1999]) distinguished between two kinds of akrasia: impetuosity and weakness. Impetuous people do not go through a process of deliberation and do not make a reasoned choice; they ‘are led by their emotion’. Keen and excitable people suffer especially from the impetuous form of incontinence. People who hate are motivated by anger (passion) and have a simplistic view of the world, a view that is informed not by facts but by prejudice. People who hate either do so in full knowledge that they inflict harm and that they should not be doing it, or they act from ignorance. In the former case, anger and violent passion can cause a lapse in reason which leads people away from what they know is good action. People are normally held responsible by others for this kind of clear-eyed akrasia. In the latter case, the question is whether their ignorance is culpable. Ignorance is culpable if people could reasonably have been expected to take measures that would have corrected or avoided their doing the wrong action, given their capabilities and the opportunities provided to them by their social context (home, community, society at large), but they failed to do so due to weaknesses such as overconfidence, arrogance, dismissiveness, laziness, dogmatism, incuriosity, self-indulgence and contempt (FitzPatrick 2008; Nussbaum 1990; Sher 2009). As with clear-eyed akrasia, failure to recognize the wrongness or imprudence of one’s conduct does not relieve one of responsibility. In short, the concept of akrasia helps us reflect on the abrogation of responsibility when it comes to online hate.
The article is based on a close study of hate sites over a ten-year period (see Appendix 1 for a listing of websites cited in this article). This is not an easy study, not only because of its troubling and quite upsetting content but also because the sites are very unstable and constantly in flux. During the past decade, many of the websites under examination moved around the Internet, changing names, locations and forums. This means that quite a few sites discussed here are now defunct. When I started my study in 2007, research by Chau and Xu (2007) named more than thirty blogs. None was in existence when I wrote this article. Similarly, Franklin compiled ‘The Hate Directory’ in October 2002. The vast majority of the web pages in that directory are no longer operative. In November 2009, Franklin published an updated version of the Hate Directory, which had almost doubled in size from the 2002 version. Again, many of the sites no longer exist or have been relocated to other servers, prompting Franklin to issue another hate directory in April 2010 (Franklin 2010). A silver lining resulting from the Charlottesville tragedy mentioned above is that many Internet service providers and web-hosting companies that were once friendly to racial propaganda—one example of acting in an irresponsible, akratic way—have now ousted websites associated with white supremacism and neo-Nazism from their servers (Associated Press 2017). Consequently, many websites that were accessed prior to August 2017 are no longer available. Despite these limitations, the analysis I present here reflects on some of the most notorious sites that I have visited and revisited over this period and particularly during the past six years (2012-2017). Unless otherwise stated, all websites were accessed in September 2017.
The article is also informed by more than 40 semi-structured interviews and discussions I have held in Canada, the USA, Israel, France, England, Ireland and Portugal during the last decade (2006-2016). The interviews and discussions were with leading Internet scholars, security experts, and human rights activists and experts. They were designed to learn about the scope of Internet hate and what can be done to counter hate mongers’ activities. Interviews varied in length from one hour to two and a half hours. The interviewees and discussants provided information and insight about the structure and functions of the Internet, the possibilities it opens for abuse, the ways the Internet has been utilised by hate organizations and individuals, and the dangers of hate speech and its links to violence and to other criminal activities. Appendix 2 compiles the names of the interviewees and discussants cited in this article together with locations and dates.
The devil is in the details. To comprehend the seriousness of the challenge that white supremacists pose to society, it is essential to understand their agenda, aims, priorities and mode of operation. Hate groups use the Internet as other users do but their intentions are sinister, anti-social and violent. Here I focus on propaganda, socializing, linking, fundraising, recruitment and the promotion of violence.
White supremacist websites promote messages of racial superiority and attack certain religions or gays and lesbians. White supremacist groups such as the Ku Klux Klan, skinheads, neo-Nazis and National Association for the Advancement of White People have websites, blogs, ‘rants and raves’ forums, discussion groups, photos and videos on the Internet. These websites (see Appendix 1) are readily accessible through Internet search engines. While a main website is set up by a group’s leader, multiple sites are also set up by district or state chapters as well as by individual members. These sites usually contain the history of the sponsoring group, a mission statement, and text by group members. To attract the reader, they offer eye-catching teasers such as symbols and pictures.
For example, Northwest Front has a clear agenda to create a White Homeland in the Pacific Northwest, which they promote through a distinct flag (blue, white and green), a constitution and a set of principles on migration and citizenship. They believe in a clear demarcation in accordance with ‘white blood and race’ and argue for ousting those who do not belong. Yet they claim they are not about promoting hatred. They are about promoting freedom: ‘We don’t stand for hating people, we stand for freeing people—our people—from a yoke of tyranny and oppression that has become impossible for us to live with. We stand for preserving our race from biological and cultural extinction’ (Northwest Front website, ‘Dear White American’). Non-White and Jewish people and homosexuals are not welcome in their new country called The Northwest American Republic (Northwest Front website, ‘Nationhood and Citizenship’).
Hate mongers talk to each other, thereby reinforcing their commonly held views, empowering people who share their beliefs and identifying ways to offend their targets. This dichotomy between ‘us’ and ‘them’ is necessary as it fulfils both functions of creating a sense of belonging and of marking the bounds of unity. White supremacist websites and chat groups such as Stormfront promulgate the belief that White people are the oppressed group and that society is in danger of being overrun by ignorant, welfare-loving minorities who desire White women.
Much effort is invested in appealing to young people through video games and music that teach children that violence is acceptable (Anti-Defamation League 2012; Appendix 2, citation identifier D). For example, Ethnic Cleansing (DigitalCoprolites 2011a), a game that is still available, has players kill Black people and Hispanics in order to gain access to the subway where the Jewish people are hiding. Comparable popular titles include: Nazi Wolf 3D (ComplottoG 2010), a fan-modified version of the anti-Nazi killing game Wolfenstein3D in which the Nazis are the dominant force; Zog’s Nightmare (DigitalCoprolites 2011b); KZ Manager Millenium [sic], in which the player assumes the role of a death camp manager who needs to run it efficiently (HanzVonStickyhooves 2013); Border Patrol, in which players score points for shooting down immigrants trying to cross the border (Yepss Videos 2015); and Shoot The Blacks, described as ‘Blast away the darkies as they appear. An excellent little shooter style game’ (obsolete Resist website).
In this hate propaganda, the racial ‘other’ is represented as a social polluter. They are metaphorically associated with disease and cast as a viral presence whose very existence on (often American) soil is sufficient to undermine its social stability and the values which have made it a strong and powerful nation. The foreigner is the enemy (Roversi 2008: 93-94). The Internet allows the ignorant and the prejudiced to send these anonymous messages to those whom they despise (Delgado and Stefancic 2004: 24). For example, Jewish Ritual Murder (Holywar website) claimed that the two principal feast-days of Judaism, Purim and Passover, were associated with the murder of Christians.
The writings of Dr William Pierce have become glorified and much celebrated within these circles to promote hatred against Jewish people (see, for example, National Alliance website). Pierce’s Turner Diaries, published in 1978 under the pseudonym Andrew Macdonald, provides a fictional account of a race war by white supremacists against government officials, intellectuals, Jewish and Black people in order to establish an Aryan world. Timothy McVeigh, the Oklahoma City bomber, actively promoted the book and appeared to have carefully read some of Turner’s instructions prior to the 1995 bombing that resulted in the death of 168 people: ‘The plan roughly is this ... Unit 8 will secure a large amount of explosives ... We will then drive into the FBI building's freight-receiving area, set the fuse, and leave the truck’ (Macdonald 1978: Chapter IV). Organizations dedicated to fighting hatred are concerned with The Turner Diaries (Appendix 2, citation identifier E).
Encouraging interpersonal socialization in the offline world is a key strategy of white supremacist websites. For instance, the Hammerskin Nation is one of the most organized and most violent neo-Nazi skinhead groups in the USA (Foxman and Wolf 2013: 13). In 2017, its website proclaimed ‘REBEL HELL TOTAL WAR’ and invited people to ‘Beers, Bands and Brotherhood,’ an exciting event that also included merchandise and a raffle (Hammerskins website). Similarly, the Nordic Fest (obsolete Southside Antifa website) is an annual white patriotic rally and music festival in Dawson Springs, Kentucky. The group claimed that comrades from all over the world travelled to this and other events held by the IKA (Imperial Klans of America). In 2017, The Brotherhood of Klans Knights of the Ku Klux Klan also convened a Summer Unity Gathering (Stormfront forum). These kinds of websites cultivate a sense of community and offer interested parties opportunities for mingling and socializing. It is one thing to find like-minded people on the Internet. It is quite another to actually meet and strike up more than a virtual friendship. Hate groups strive for both.
White power rock n’ roll has been instrumental to the racist movement. Some sites offer free downloadable music with lyrics promoting hate. The lyrics are violent and derogatory, calling for a racial war and for the murder of Black and gay people and other ‘undesired’ groups. For example, Resistance Records, a well-known Canadian white nationalist/racialist label founded in 1993 by George Eric Hawthorne,[4] offered merchandise as well as music and pioneered the recordings of a dozen Skinhead bands, such as Cute Girl (Atkins 2011). It was financially successful, selling some 70,000 CDs a year in the early 2000s (Beirich 2013), and its products could be ordered directly over the Internet. The label later moved to Detroit, Michigan, partly to avoid Canadian hate crime laws.
The Internet enables hate organizations to connect with each other worldwide. It is a powerful tool to reach an international audience, linking diverse extremist groups (Cohen-Almagor 2015; Conway 2016; Gerstenfeld, Grant and Chau-Pu 2003; Perry and Olsson 2009; Appendix 2, citation identifier F). Hate activists use cyberspace as a free space to create and sustain movement culture and coordinate collective action (see, for example, Our People, The Aryans website). The cyber-presence of the White Power Movement intersects with and enhances their real world activities by offering multiple opportunities for access and coordination (Simi and Futrell 2006).
Hate mongers are able to make blatant appeals for funding over their websites because none has been designated a terrorist organization by official state agencies; they are therefore less concerned about state interference with funding channels. Appeals come in three forms: general appeals for funds needed to sustain operations; fund-raisers for legal representation for members who have been arrested; and donations towards ‘official membership’, which entitles subscribers to additional material, such as newsletter subscriptions (Gruen 2004: 139; Appendix 2, citation identifier G). Many hate sites also generate revenue through product sales: coins, jewellery, belt buckles, t-shirts, hats, patches, pins, flags, sports items, music, videos, comic books, memorabilia, decorations, knives and survival defence items (see, for example, Final Conflict website; Third Reich Books website; Tightrope website). One typical website, Aryan Wear (no longer active), explained that ‘Through our support of Altermedia.info and Newsnet14.com Aryan Wear helps get out news and information that is hidden by the controlled media.’
The Internet introduces people to new ideas. People surf the Internet, encounter intriguing ideas and get interested. Often this makes the Internet the starting point for further contact. Some people then continue to explore and may initiate contact with people who are more experienced. Then they become identifiable targets for recruitment (Angie et al. 2011; Foxman and Wolf 2013; Appendix 2, citation identifier H). Online hate sites are also used to recruit individuals to offline hate groups and coordinate group efforts (Chan, Ghose and Seamans 2016; Hall et al. 2017; Ibanga 2009; Wines and Saul 2015). Much of this is geared to teenage and young adult males. As mentioned above, music plays an important role in this recruitment. When children and youth surf the Internet for music, they may chance on sites that offer hate music, sometimes free of charge. Such sites are often linked to hate newsgroups and chat rooms (Chernynkaya 2010; Shekhvtsov 2013).
White supremacists recognised the recruitment power of the Internet early on. As far back as 1998, the founder of Stormfront, Don Black, said he recruited people online whom he otherwise would not have been able to reach (Media Smarts website; Reuters 2009). In 2015, he claimed that there were more ‘people actively working in some way to promote our cause. Because they don’t have to join an organization now that we have this newfangled Internet’ (Wines and Saul 2015). For many years, Stormfront has been the most visited hate website in the world. In 2014, between 200,000 and 400,000 Americans visited the site every month (Stephens-Davidowitz 2014) and it has been estimated to have more than 300,000 registered users (Hatewatch Staff 2017). The site openly promotes racial violence and was used, along with The Daily Stormer, to organize and encourage participation in the fatal Unite the Right rally in Charlottesville (Lawyers’ Committee for Civil Rights Under Law 2017). Subsequently, the Stormfront website was briefly shut down in August 2017 but reopened in October 2017 (Schulberg, Liebelson and Craggs 2017).
Racist websites provide links to information on terrorism. The now obsolete Aryan Nations website celebrated the 11 September 2001 Islamist extremist terrorist attacks in the USA which killed many thousands of people, the rationale being that the enemy of the USA is, ‘for now at least, our friend’ (Wallace 2001). Linked to the Aryan Nations website was Aryan Underground (also now obsolete), a clearinghouse of information for those wishing to take action. According to the Aryan Nations website, Aryan Underground’s political agenda was clear, speaking of ‘racial purity’ and promoting the argument that ‘violence solves everything’. The ‘Christian Guide to Small Arms’ shared space on the Aryan Underground site with articles detailing how to build bombs and how to go underground. Aryan Underground also provided information on explosives and mail bombs. Numerous computer virus files and downloadable versions of various anarchy and terrorism manuals, such as William Powell's The Anarchist Cookbook (1971) and The Terrorist Handbook (Anonymous 1994; see also Ray and Marsh II 2001), were available as well (Appendix 2, citation identifier I). There is compelling evidence of direct connections between these manuals and violent, terrorist actions (Chan, Ghose and Seamans 2016; Cohen-Almagor 2017b). In the following section, I consider this link between hate speech and violence.
On average, USA residents experienced approximately 250,000 hate crime victimizations each year between 2004 and 2015, of which about 230,000 were violent victimizations (Masucci and Langton 2017). Those who are opposed to hate speech regulation argue that venting hate speech is preferable to violent action (Baker 1997, 2012; Richards 1986). They support freedom of speech and net neutrality,[3] notwithstanding its most troubling content. Further arguments are that regulation of hate speech is ineffective, futile, makes martyrs out of haters and might even help them achieve their goals (Hentoff 1992: 134; Appendix 2, citation identifier J). But absolute net neutrality in itself constitutes a form of clear-eyed akrasia because it entails an abrogation of moral and social responsibility for Internet content. Indeed, the trouble with these arguments, as Allport (1954) and others have observed, lies with their empirical assumptions.
Furthermore, government failure to act against hate speech helps to normalize or even authorize the relevant hate speakers to carry on doing what they are doing (thus reinforcing the original akratic inaction). Victims of unrecognised hate speech end up lacking protection (Brown 2017: 604). In his critique of First Amendment scholars in the USA, Waldron (2012: 165) rightly notes that hate speech harms the dignity of its targets by undermining public assurance and support. Waldron (2012: 171) explains: ‘To the extent that the message conveyed by the racist already puts them on the defensive, and distracts them from the ordinary business of life ... to that extent, the racist speech has already succeeded in one of its destructive aims’. In contrast, supporters of free speech such as Baker (1992, 1997, 2012) give no convincing reason why society should tolerate hate speech when the pain may be so strong, so immediate, so penetrating and so instant, that people do not have the luxury of choosing their response.
A recent study by Chan, Ghose and Seamans (2016) found that some 14,000 Internet sites contained hate-related content. Using a large-scale dataset and econometric techniques, they found a positive relationship between Internet penetration and offline racial hate crime. This correlation is most evident in areas with higher levels of racism, as indicated by higher levels of segregation and a higher propensity for people in those areas to search for racially charged words. Chan, Ghose and Seamans (2016) also observed a link between online hate sites and the incidence of racial hate crimes executed by lone wolf perpetrators. My own research spanning two decades concludes there is evidence for this link. For example, in 1999, 21-year-old Benjamin Nathaniel Smith shot and killed two innocent people and wounded eight others after being exposed to Internet racial propaganda (Anti-Defamation League 2003a; Apologetics Index website; Berkowitz 1999; Greyhavens 2007; Church of the Creator website (registration required); Appendix 2, citation identifier K). Smith said: ‘It wasn’t really ‘til I got on the Internet, read some literature of these groups that ... it really all came together’ (Wolf 2004b). He maintained: ‘It’s a slow, gradual process to become racially conscious’ (Wolf 2004a, 2004b; Chan, Ghose and Seamans 2016).
The same year Buford Furrow embarked on a hate-motivated shooting spree after visiting hate sites, including Stormfront and a macabre site called Gore Gallery on which explicit photos of brutal murders were posted (Levin 2002: 959). Furrow killed one person and wounded five others.
Throughout the 2000s, there were numerous cases in the USA of active users of white supremacist Internet sites committing offences of racial violence and intimidation (Fuoco 2001; Gruen 2004: 128; United States v Magleby 2001; for discussion on the cross burning phenomenon, see Bell 2004; Gey 2004; Newton 2014). In Canada in 2006, the Canadian Human Rights Commission concluded that the materials used by such offenders were likely to expose those of the Jewish faith, Aboriginal peoples, francophones, Black people and others to hatred and contempt: ‘They are undoubtedly as vile as one can imagine and not only discriminatory but threatening to the victims they target’ (Warman v Harrison 2006: 23-24; CBC News 2006). The danger is well exemplified in the case of Keith Luke who, in 2009, murdered two Black people and raped and nearly killed a third on the morning after Barack Obama was inaugurated as president. When he was captured, Luke told police that he intended to go to a synagogue that night and kill as many Orthodox Jewish people as possible. Luke also told the police that he had been reading white power websites for about six months (in other words, from about the time that Obama won the Democratic nomination) and had concluded that the White race was being subjected to a genocide in America. Therefore, he had to act (Ellement 2009; Holthouse 2009).
Later the same year, on 10 June 2009, James von Brunn entered the USA Holocaust Memorial Museum in Washington DC and killed Security Guard Stephen Tyrone Johns. Von Brunn, a die-hard white supremacist anti-Semite, was an active neo-Nazi for decades (Beirich 2009; Martin 2015). For this Holocaust denier, the Holocaust Museum was the most appropriate place for the shooting as, in his eyes, it served the greatest hoax of all time.
There is some similarity between von Brunn and the 73-year-old American Nazi Frazier Glenn Miller who, in April 2014, murdered three people at two separate Jewish Community Centers in Overland Park, Kansas. Miller founded the Carolina Knights of the Ku Klux Klan, serving as its Grand Dragon in the 1980s (Strømmen 2014), and also founded the White Patriot Party (Beirich 2014a). Miller’s hateful book, A White Man Speaks Out (Miller 1999), was freely available to download on his website and is still available online. On the Vanguard News Network website alone, Miller had more than 12,000 posts. The slogan of this anti-Semitic and white supremacist site was ‘No Jews, Just Right’ (Avlon and Dickson 2014; Beirich 2014a). Miller described the Jewish people as ‘swarthy, hairy, bow-legged, beady-eyed, parasitic midgets’ (Beirich 2014a). Conversely, Adolf Hitler was, according to Miller, ‘the greatest man who ever walked the earth’ (Fitzsimmons 2014). For many years, Miller encouraged his followers to kill Black and Jewish people, judges and human rights activists (Yaccino and Barry 2014). Miller openly declared ‘total war’ on ZOG (Zionist Occupation Government) and called upon his fellow ‘Aryan warriors’ to strike now.
In 2014, the Southern Poverty Law Center (Beirich 2014b) published a two-year study detailing incidents in which active users of one website, Stormfront, were allegedly responsible for the murders of nearly 100 people in the preceding five years. These incidents include the killing of three Pittsburgh police officers by Richard Poplawski in 2009; the killing of four people by Jason Todd Ready in May 2012; the torture and dismemberment of a Chinese immigrant by Eric Clinton Kirk Newman that same month; and the killing of six people at a Sikh temple three months later by Wade Michael Page (Beirich 2014a, 2014b; Dickson 2014).
In June 2015, Dylann Storm Roof, a reborn white nationalist, opened fire at the Emanuel African Methodist Episcopal Church in downtown Charleston, South Carolina, killing nine people. Roof had engaged online with white supremacists. He published a manifesto, ‘The Last Rhodesian’, in which he appealed to white nationalists to join the cause: ‘I believe that even if we made up only 30 percent of the population we could take it back completely. But by no means should we wait any longer to take drastic action’ (Hatewatch Staff 2015). Roof believed he had no choice but to murder defenseless Black people. He was out to kill in the service of his white nationalist ideology (Potok 2015; Siegel 2015). Roof’s manifesto (2016) was published in The New York Times and also on The Daily Stormer, a neo-fascist website inspired by the notorious Nazi propaganda paper Der Stürmer, whose editor, Julius Streicher, stood trial in Nuremberg after World War II and was executed for war crimes.
Together, these cases demonstrate that online hate speech and hate threats need to be taken seriously. When harmful speech is so closely linked to harmful action that one does not know where the harmful speech ends and the harmful action begins, those speech-acts do not warrant protection (Cohen-Almagor 2006; George 2017). Incitement warrants legal intervention. Overly permissive and tolerant attitudes towards hate speech are a form of akrasia, whereby people act against their better judgment. Not just those who post but also those who allow such postings on their servers are culpable for their akratic conduct. Whether it stems from ignorance, from indifference or from an insistence on clinging to freedom of speech without caring about dangerous consequences, such conduct is unjustifiable. Internet service providers are expected to abide by a basic code of conduct, one that objects to rather than celebrates violence and its promotion. When it comes to hate speech on the Internet, society and its regulators cannot continue to remain akratic and avoid responsibility for the harm that is inflicted. As Christopher Wolf, Chair of the Internet Task Force of the Anti-Defamation League, argues: ‘The evidence is clear that hate online inspires hate crimes’ (Wolf 2004b).
Hate is a powerful emotion. People who allow themselves to develop hatred towards others move in vicious circles. With the help of the Internet, they find like-minded people and then engage in discussions about why their hatred is justified, and what can be done to fight their targets of hate. The entire conversation is negative, dark and destructive. The bigots incite each other to hatred, and push those who are prone to violence to act upon their hatred. This article shows that hateful messages are destructive. They directly harm the victimized targets and they might indirectly desensitize the public to important issues (such as Holocaust denial). Allowing hate to propagate freely is a form of irresponsible akrasia: acting against one’s better judgement through weakness of will (Stroud 2014) or, worse, through an intention to express bigotry and hate.
This article has focused on the study of websites and their conduciveness to hate crime. Hate groups are quite varied and many do not allow access except through direct personal contact, not through the Internet (Appendix 2 citation identifier L). However, some hate mongers make the most of the Internet and the communication options now open to them beyond websites: blogs, email, Usenet newsgroups (computer discussion forums), Web-based bulletin boards, clubs and groups on social networks, chat rooms, Internet Relay Chat and instant messaging. With the help of the Internet, hate groups are able to reach places that were closed to them before: homes, schools, offices. Social networking sites are particularly well suited for connecting social outcasts, angry and isolated individuals on the fringe of society who find solace and comfort in cyberspace. Facebook, Twitter and YouTube are increasingly the platforms used to disseminate hate and to recruit teens and children as its supporters (Fuchs 2014; KhosraviNik and Unger 2015; Werts 2000). Further research may analyse the ways social media apps and other modern technologies are exploited to spread hate speech, and whether search engines and social networking sites should continue to assist hate groups in their agenda. I have suggested some counter-measures to tackle Internet hate elsewhere (Cohen-Almagor 2014).
We also need more research that compares the use of the Internet to spread hatred with the way it is used by other anti-social groups such as paedophiles (Cohen-Almagor 2013) and terrorists. From my interviews with experts on children’s safety, terrorism, crime and hate, there seem to be many commonalities between the modes of operation of these groups. Such comparative studies may help security agencies in fighting these phenomena (Appendix 2 citation identifier M).
The Internet became commercial and widespread only in the early 1990s. In historical terms, this is new technology in the making. As has been the case with every other powerful innovation that shapes our lives, people quickly learn to enjoy the benefits that technology yields. Society and its regulators are slower and seem to have more difficulty in devising mechanisms to deal with the ills of technology: that is, the ways it can be abused and used for negative purposes. Despite having been exposed to the online environment for a couple of decades now, we are still in the early learning stages.
With time, technology gatekeepers, state institutions and civil society will find solutions. Different interests are involved. Some Internet service providers are only interested in making financial profits. Others may also have broader social values in mind. Ethical, cultural and legal standards help shape the bounds of the legitimate. A balance needs to be found between competing interests so that the Internet can continue to develop, fulfilling its potential while mitigating its less positive side effects. The balance might differ from one society to another, as not all societies are the same. Each nation has historical, cultural, demographic and other norms, which shape its conception of the good. At the same time, liberal democracies have common fundamental values and norms: liberty, equality, pluralism, individualism, respect for others and the desire to refrain from inflicting harm upon others. This and similar discussions will contribute to developing standards against violent speech that might translate into violent action.
Correspondence: Professor Raphael Cohen-Almagor, Chair in Politics, School of Law and Politics, Faculty of Business, Law and Politics, University of Hull, Cottingham Road, Hull HU6 7RX, United Kingdom. Email: r.cohen-almagor@hull.ac.uk
Abrams F (2017) The Soul of the First Amendment: Why Freedom of Speech Matters. New Haven: Yale University Press.
Allport GW (1954) The Nature of Prejudice. Cambridge, Massachusetts: Addison-Wesley.
Angelou M (1978) Still I Rise. New York: Random House.
Angie AD, Davis JL, Allen MT, Byrne CL, Ruark GA, Cunningham CB, Huang TS, Bernard DR, Hughes MG, Connelly S, O’Hair HD and Mumford MD (2011) Studying ideological groups online: Identification and assessment of risk factors for violence. Journal of Applied Social Psychology 41(3): 627-657. DOI: 10.1111/j.1559-1816.2011.00730.x.
Anonymous (1994) The Terrorist Handbook. Gunzenbomz Pyro-Technologies. Available at http://www.dvc.org.uk/cygnet/tthb.pdf (accessed 31 January 2018).
Anti-Defamation League (2003a) Church of the Creator: Creed of hate. The Nizkor Project. Available at http://www.nizkor.org/hweb/orgs/american/adl/cotc/ (accessed 31 January 2018).
Anti-Defamation League (2003b) Hate on the Internet. Washington, District of Colombia: Anti-Defamation League.
Anti-Defamation League (2012) Racist groups use computer gaming to promote hate. Available at https://www.adl.org/sites/default/files/documents/assets/pdf/combating-hate/Racist-groups-use-computer-gaming.pdf (accessed 31 January 2018).
Aristotle (350BCE [1999]) Nicomachean Ethics. Kitchener, Ontario: Batoche Books.
Associated Press (2017) Neo-Nazi site’s publisher says he's got no home on Internet. Boston Herald, 16 August. Available at http://www.bostonherald.com/news/national/2017/08/neo_nazi_sites_publisher_says_hes_got_no_home_on_internet (accessed 16 March 2018).
Atkins SE (2011) Encyclopedia of Right-Wing Extremism in Modern American History. Santa Barbara, California: ABC-CLIO.
Avlon J and Dickson C (2014) Hate—and Hitler—in the heartland: The arrest of Frazier Glenn Miller. The Daily Beast, 14 April. Available at http://www.thedailybeast.com/articles/2014/04/14/hate-and-hitler-in-the-heartland-the-arrest-of-frazier-glenn-miller.html (accessed 30 January 2018).
Baker CE (1992) Human Liberty and Freedom of Speech. New York: Oxford University Press.
Baker CE (1997) Harm, liberty and free speech. Southern California Law Review 70: 979-1020.
Baker CE (2012) Hate speech. In Herz M and Molnar P (eds) The Content and Context of Hate Speech: 57-80. New York: Cambridge University Press.
Behrens P, Jensen O and Terry N (eds) (2017) Holocaust and Genocide Denial: A Contextual Perspective. London: Routledge.
Beirich H (2009) Holocaust Museum shooter had close ties to prominent neo-Nazis. Southern Poverty Law Center, 10 June. Available at https://www.splcenter.org/hatewatch/2009/06/10/holocaust-museum-shooter-had-close-ties-prominent-neo-nazis (accessed 30 January 2018).
Beirich H (2013) Hate across the waters: The role of American extremists in fostering an international white consciousness. In Wodak R, Mral B and KhosraviNik B (eds) Right Wing Populism in Europe. Politics and Discourse: 89-104. London: Bloomsbury.
Beirich H (2014a) Frazier Glenn Miller, longtime anti-Semite, arrested in Kansas Jewish community center murders. Southern Poverty Law Center, 13 April. Available at http://www.splcenter.org/blog/2014/04/13/frazier-glenn-miller-longtime-anti-semite-arrested-in-kansas-jewish-community-center-murders/ (accessed 30 January 2018).
Beirich H (2014b) White homicide worldwide. Southern Poverty Law Center (Summer). Available at https://www.splcenter.org/sites/default/files/d6_legacy_files/downloads/publication/white-homicide-worldwide.pdf (accessed 27 March 2018).
Bell J (2004) O say, can you see: Free expression by the light of fiery crosses. Harvard Civil Rights-Civil Liberties Law Review 39: 335-389.
Berkowitz H (1999) Prepared statement, hate crime on the Internet. Hearing before the Committee on the Judiciary, United States Senate. Washington, 14 September.
Brown A (2017) What is hate speech? Part 2: Family resemblances. Law and Philosophy 36(5): 561-613. DOI: 10.1007/s10982-017-9300-x.
CBC News (2006) Ontario man accused of posting Internet hate. 12 June. Available at http://www.cbc.ca/news/canada/ontario-man-accused-of-posting-internet-hate-1.590356 (accessed 30 January 2018).
Chan J, Ghose A and Seamans R (2016) The Internet and racial hate crime: Offline spillovers from online access. MIS Quarterly 40(2): 381-403.
Chau M and Xu J (2007) Mining communities and their relationships in blogs: A study of online hate groups. International Journal Human-Computer Studies 65(1): 57-70. DOI: 10.1016/j.ijhcs.2006.08.009.
Chernynkaya (2010) Hate in America, Part 3: The psychology and recruitment of hate. Planet Pov, 12 February. Available at http://planetpov.com/2010/02/12/hate-in-america-part-3-the-psychology-and-recruitment-of-hate/ (accessed 31 January 2018).
Cohen J (1993) Freedom of expression. Philosophy & Public Affairs 22(3): 207-263.
Cohen-Almagor R (2005) Speech, Media, and Ethics: The Limits of Free Expression. Houndmills; New York: Palgrave-Macmillan.
Cohen-Almagor R (2006) The Scope of Tolerance. London: Routledge.
Cohen-Almagor R (2011) Fighting hate and bigotry on the Internet. Policy and Internet 3(3): 1-26. DOI: 10.2202/1944-2866.1059.
Cohen-Almagor R (2012a) In Internet’s way: Radical, terrorist Islamists on the free highway. International Journal of Cyber Warfare and Terrorism 2(3): 39-58. DOI: 10.2139/ssrn.2334640.
Cohen-Almagor R (2012b) Is law appropriate to regulate hateful and racist speech: The Israeli experience. The Israel Studies Review 27(2): 41-64.
Cohen-Almagor R (2013) Online child sex offenders: Challenges and counter-measures. The Howard Journal of Criminal Justice 52(2): 190-215. DOI: 10.1111/hojo.12006.
Cohen-Almagor R (2014) Countering hate on the Internet. Annual Review of Law and Ethics 22: 431-443.
Cohen-Almagor R (2015) Confronting the Internet’s Dark Side: Moral and Social Responsibility on the Free Highway. New York; Washington, District of Columbia: Cambridge University Press and Woodrow Wilson Center Press.
Cohen-Almagor R (2016) Hate and racist speech in the United States: A critique. Philosophy and Public Issues 6(1): 77-123.
Cohen-Almagor R (2017a) Jihad online: How do terrorists use the Internet? In Campos Freire F, Rúas Araújo X, Martínez Fernández VA and García XL (eds) Media and Metamedia Management: 55-66. Dordrecht: Springer.
Cohen-Almagor R (2017b) Jihad on the information superhighway. Sustainable Security, Oxford Research Group, 26 May. Available at https://sustainablesecurity.org/2017/05/26/jihad-on-the-information-superhighway/ (accessed 31 January 2018).
Complotto G (2010) Nazi wolf 3D. YouTube, 19 September. Available at https://www.youtube.com/watch?v=pr9-JMJvMqk (accessed 31 January 2018).
Conway M (2016) Extremist communication. The Conference, 16 August. Available at https://videos.theconference.se/maura-conway-extremist-communication (accessed 31 January 2018).
Delgado R and Stefancic J (2004) Understanding Words That Wound. Boulder, Colorado: Westview.
Dickson C (2014) Where white supremacists breed online. The Daily Beast, 17 April. Available at http://www.thedailybeast.com/articles/2014/04/17/where-white-supremacists-breed-online.html?utm_medium=email&utm_source=newsletter&utm_campaign=cheatsheet_afternoon&cid=newsletter%3Bemail%3Bcheatsheet_afternoon&utm_term=Cheat%20Sheet (accessed 31 January 2018).
DigitalCoprolites (2011a) Let’s play Ethnic Cleansing. YouTube, 13 September. Available at https://www.youtube.com/watch?v=xlZCGyVGjMM (accessed 14 November 2017).
DigitalCoprolites (2011b) Let’s play Zorg’s Nightmare. YouTube, 13 September. Available at https://www.youtube.com/watch?v=xlZCGyVGjMM (accessed 14 November 2017).
FitzPatrick WJ (2008) Moral responsibility and normative ignorance: Answering a new skeptical challenge. Ethics 118(4): 589-613. DOI: 10.1086/589532.
Fitzsimmons EG (2014) Man kills 3 at Jewish centers in Kansas City suburb. The New York Times, 13 April. Available at http://www.nytimes.com/2014/04/14/us/3-killed-in-shootings-at-jewish-center-and-retirement-home-in-kansas.html?hp&_r=0&assetType=nyt_now (accessed 31 January 2018).
Foxman AH and Wolf C (2013) Viral Hate. New York: Palgrave-Macmillan.
Franklin RA (2010) The Hate Directory. Woodstock, Maryland: Raymond A Franklin. Available at http://www.iaca.net/Resources/Articles/TheHateDirectoryApril12010.pdf (accessed 16 March 2018).
Fuchs C (2014) Social Media: A Critical Introduction. London: Sage.
Fuoco MA (2001) County officer specializes in cyber crime cases. Pittsburgh Post-Gazette, 4 September.
George C (2017) Hate spin: The twin political strategies of religious incitement and offense-taking. Communication Theory 27(2): 156-175. DOI: 10.1111/comt.12111.
Gerstenfeld PB, Grant DR and Chiang C-P (2003) Hate online: A content analysis of extremist Internet sites. Analyses of Social Issues and Public Policy 3(1): 29-44. DOI: 10.1111/j.1530-2415.2003.00013.x.
Gey SG (2004) A few questions about cross burning, intimidation, and free speech. Notre Dame Law Review 80(4): 1287-1375. Available at https://scholarship.law.nd.edu/cgi/viewcontent.cgi?referer=https://www.google.com.au/&httpsredir=1&article=1401&context=ndlr (accessed 6 April 2018).
Greyhavens T (2007) Creating identity: The fragmentation of white racist movements in America. The Spark (Fall). Available at http://www.whitman.edu/spark/rel355fa07_Greyhavens.html (accessed 31 January 2018).
Gruen M (2004) White ethnonationalist and political Islamist methods of fund-raising and propaganda on the Internet. In Gunaratna R (ed.) The Changing Face of Terrorism: 127-145. Singapore: Marshall Cavendish.
Hall N, Corb A, Giannasi P and Grieve J (2017) The Routledge International Handbook on Hate Crime. London: Routledge.
HanzVonStickyhooves (2013) KZ Manager: Millennium review (Death camp tycoon). YouTube, 13 April. Available at https://www.youtube.com/watch?v=771pxXhERB4 (accessed 31 January 2018).
Hardin R (2002) Liberal distrust. European Review 10(1): 73-89. DOI: 10.1017/S1062798702000078.
Hatewatch Staff (2015) Dylann Roof’s manifesto is fluent in white nationalist ideology. Southern Poverty Law Center, 20 June. Available at https://www.splcenter.org/hatewatch/2015/06/21/dylann-roofs-manifesto-fluent-white-nationalist-ideology (accessed 31 January 2018).
Hatewatch Staff (2017) Waning storm: Stormfront.org loses its domain. Southern Poverty Law Center, 29 August. Available at https://www.splcenter.org/hatewatch/2017/08/29/waning-storm-stormfrontorg-loses-its-domain (accessed 31 January 2018).
Hentoff N (1992) Free Speech for Me—But Not for Thee: How the American Left and Right Relentlessly Censor Each Other. New York: Harper Collins.
Holthouse D (2009) Was alleged Massachusetts spree killer a neo-Nazi? Keith Luke makes it official. Southern Poverty Law Center, 11 May. Available at http://www.splcenter.org/blog/2009/05/11/was-alleged-massachusetts-spree-killer-a-neo-nazi-keith-luke-makes-it-official/ (accessed 31 January 2018).
Ibanga I (2009) Hate groups effectively use web as a recruiting tool. ABC News, 12 June. Available at http://abcnews.go.com/Technology/story?id=7822417 (accessed 31 January 2018).
Jones J (2015) 5 racist board games designed to degrade and humiliate African-Americans. Black Then, 27 September. Available at https://blackthen.com/5-racist-board-games-designed-to-humiliate-and-degrade-african-americans/ (accessed 31 January 2018).
KhosraviNik M and Unger JW (2015) Critical discourse studies and social media: Power, resistance and critique in changing media ecologies. In Wodak R and Meyer M (eds) Methods of Critical Discourse Studies: 205-233. Thousand Oaks, California: Sage.
Lawyers’ Committee for Civil Rights Under Law (2017) Stormfront.com website shut down following successful action by national civil rights organization. Lauren Weinstein, 26 August. Available at https://plus.google.com/u/0/+LaurenWeinstein/posts/MwpNHuFqGeK (accessed 31 January 2018).
Levin B (2002) Cyberhate. American Behavioral Scientist 45(6): 958-988. DOI: 10.1177/0002764202045006004.
Macdonald A (1978) The Turner Diaries. United States: National Vanguard Books.
Martin D (2015) Willis Carto, far-right figure and Holocaust denier, dies at 89. The New York Times, 1 November. Available at https://www.nytimes.com/2015/11/02/us/willis-carto-far-right-figure-and-holocaust-denier-dies-at-89.html?mcubz=0 (accessed 31 January 2018).
Masucci M and Langton L (2017) Hate Crime Victimization, 2004-2015. Bureau of Justice Statistics, USA Department of Justice.
Meiklejohn A (1965) Political Freedom. New York: Oxford University Press.
Meiklejohn A (2000) Free Speech and its Relation to Self-Government. Union, New Jersey: Lawbook Exchange.
Miller G (1999) A White Man Speaks Out. Greensboro, North Carolina: F Glenn Miller, White Patriot Party.
Neier A (1979) Defending My Enemy. New York: EP Dutton.
Newton M (2014) White Robes and Burning Crosses: A History of the Ku Klux Klan from 1866. Jefferson, North Carolina: McFarland.
Nussbaum MC (1990) Love’s Knowledge. New York: Oxford University Press.
Nye Jr JS, Zelikow PD and King DC (eds) (1997) Why People Don’t Trust Government. Cambridge, Massachusetts: Harvard University Press.
Perry B (2001) In the Name of Hate: Understanding Hate Crimes. London: Routledge.
Perry B (2005) A crime by any other name: The semantics of ‘hate’. Journal of Hate Studies 4(1): 121-137.
Perry B and Olsson P (2009) Cyberhate: The globalization of hate. Information & Communication Technology Law 18(2) (June): 185-199. DOI: 10.1080/13600830902814984.
Potok M (2015) Carnage in Charleston. Southern Poverty Law Center, 27 October. Available at https://www.splcenter.org/fighting-hate/intelligence-report/2015/carnage-charleston (accessed 31 January 2018).
Powell W (1971) The Anarchist Cookbook. Secaucus, New Jersey: Lyle Stuart.
Ray B and Marsh II GE (2001) Recruitment by extremist groups on the Internet. First Monday 6(2). DOI: 10.5210/fm.v6i2.834.
Reuters (2009) Hate groups increasingly use social networking to recruit. Fox News, 14 May. Available at http://www.foxnews.com/story/2009/05/14/hate-groups-increasingly-use-social-networking-to-recruit.html (accessed 31 January 2018).
Richards DAJ (1986) Toleration and the Constitution. New York: Oxford University Press.
Roof D (2016) Dylann Roof’s manifesto. The New York Times, 13 December. Available at https://www.nytimes.com/interactive/2016/12/13/universal/document-Dylann-Roof-manifesto.html (accessed 27 March 2018).
Roversi A (2008) Hate on the Net. Aldershot: Ashgate.
Sanchez R and Payne E (2016) Charleston church shooting: Who is Dylann Roof? CNN, 16 December. Available at http://edition.cnn.com/2015/06/19/us/charleston-church-shooting-suspect/index.html (accessed 31 January 2018).
Scanlon TM (1979) Freedom of expression and categories of expression. University of Pittsburgh Law Review 40(3): 519-550.
Scanlon TM (1995) Content regulation reconsidered. In Lichtenberg J (ed.) Democracy and the Mass Media: 331-339. New York: Cambridge University Press.
Schulberg J, Liebelson D and Craggs T (2017) The neo-Nazis are back online. Huffington Post, 3 October. Available at http://www.huffingtonpost.com/entry/nazis-are-back-online_us_59d40719e4b06226e3f46941?m8d (accessed 31 January 2018).
Shekhvtsov A (2013) European far-right music and its enemies. In Wodak R and Richardson JE (eds) Analysing Fascist Discourse: 277-296. New York; London: Routledge.
Sher G (2009) Who Knew? Responsibility Without Awareness. New York: Oxford University Press.
Siegel J (2015) Dylann Roof, 4chan, and the new online racism. The Daily Beast, 29 June. Available at http://www.thedailybeast.com/articles/2015/06/29/dylann-roof-4chan-and-the-new-online-racism.html?via=newsletter&source=DDMorning (accessed 31 January 2018).
Simi P and Futrell R (2006) Cyberculture and the endurance of white power activism. Journal of Political & Military Sociology 34(1): 115-142.
Stephens-Davidowitz S (2014) The data of hate. The New York Times, 12 July. Available at http://www.nytimes.com/2014/07/13/opinion/sunday/seth-stephens-davidowitz-the-data-of-hate.html (accessed 31 January 2018).
Stone GR (2005) Perilous Times: Free Speech in Wartime from the Sedition Act of 1798 to the War on Terrorism. New York: Norton.
Strømmen Ø (2014) ‘White power’ attack hits US. Hate Speech International, 16 April. Available at https://www.hate-speech.org/white-power-past-of-us-rampage-suspect/ (accessed 31 January 2018).
Stroud S (2014) Weakness of will. Stanford Encyclopedia of Philosophy. Stanford, California: Stanford University. Available at https://plato.stanford.edu/entries/weakness-will/ (accessed 31 January 2018).
Volokh E (2003) The mechanisms of the slippery slope. Harvard Law Review 116: 1026-1137.
Volokh E (2015) No, there’s no ‘hate speech’ exception to the First Amendment. The Volokh Conspiracy, 7 May. Available at https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/05/07/no-theres-no-hate-speech-exception-to-the-first-amendment/?utm_term=.7867056f6759 (accessed 31 January 2018).
Waldron J (2012) The Harm in Hate Speech. Cambridge, Massachusetts: Harvard University Press.
Wallace J (2001) Editorial: Probe can’t overlook homegrown extremists. The Atlanta Journal-Constitution, 28 November: 22A.
Werts D (2000) How the web spawns hate and violence. Newsday, 23 October: B27.
Wines M and Saul S (2015) White supremacists extend their reach through websites. The New York Times, 5 July. Available at http://www.nytimes.com/2015/07/06/us/white-supremacists-extend-their-reach-through-websites.html?partner=rss&emc=rss&_r=0 (accessed 31 January 2018).
Wolf C (2004a) Needed: Diagnostic tools to gauge the full effect of online anti-Semitism and hate. Paper presented at OSCE Meeting on the Relationship between Racist, Xenophobic and Anti-Semitic Propaganda on the Internet and Hate Crimes, 16-17 June. Paris, France.
Wolf C (2004b) Regulating hate speech qua speech is not the solution to the epidemic of hate on the Internet. Paper presented at OSCE Meeting on the Relationship between Racist, Xenophobic and Anti-Semitic Propaganda on the Internet and Hate Crimes, 16-17 June. Paris, France.
Yaccino S and Barry D (2014) Bullets, blood and then cry of ‘Heil Hitler’. The New York Times, 14 April. Available at http://www.nytimes.com/2014/04/15/us/prosecutors-to-charge-suspect-with-hate-crime-in-kansas-shooting.html?hp (accessed 31 January 2018).
Yepss Videos (2015) Flash game: Border Control game. YouTube, 7 June. Available at https://www.youtube.com/watch?v=dyhgy0XVPsw (accessed 31 January 2018).
Case law
United States v Magleby [2001] USCA10 60; (2001) 241 F.3d 1306, 1308 (10th Cir.).
Village of Skokie v The National Socialist Party of America (1978) 373 N.E. 2d 21.
Warman v Harrison (2006) Canadian Human Rights Tribunal, 15 August.
Appendix 1: Website names, URLs and access date of cited websites

| # | Website name | Website address | Last accessed |
|---|--------------|-----------------|---------------|
| 1 | American Nazi Party | | 16 March 2018 |
| 2 | Apologetics Index | http://www.apologeticsindex.org/c171.html | 31 January 2018 |
| 3 | Aryan Nations | | September 2017; no longer active |
| 4 | Aryan Underground | | September 2017; no longer active |
| 5 | Aryan Wear | http://aryanwear.com/aboutus.php | September 2017; no longer active |
| 6 | Church of the Creator | http://www.wcotc.com/ | Registration required to access this site |
| 7 | Final Conflict | http://www.fc-music.com/home | 31 January 2018 |
| 8 | Gore Gallery | | 1 October 2017 |
| 9 | Hammerskins | | 16 March 2018 |
| 10 | Holywar | http://holywar.org/txt/RitualMurder/jrm-04.html | 16 March 2018 |
| 11 | IKA (Imperial Klans of America) | http://kkkk.net/; later changed to http://www.theuka.us/ | 16 March 2018 |
| 12 | Ku Klux Klan | See IKA website | |
| 13 | Media Smarts | http://mediasmarts.ca/online-hate/impact-online-hate | 31 January 2018 |
| 14 | National Alliance | http://natall.com/about/what-we-believe/ | 31 January 2018 |
| 15 | National Association for the Advancement of White People | http://dfwdude.webs.com/ | 17 March 2018 |
| 16 | Nationalist Front | | 17 March 2018 |
| 17 | Neo-Nazis | See American Nazi Party website | |
| 18 | Northwest Front, ‘Nationhood and Citizenship’ | http://northwestfront.org/about/nar-constitution/1-nationhood-and-citizenship/ | 17 March 2018 |
| 19 | Northwest Front, Dear White American | http://northwestfront.org/about/dear-white-american/ | 31 January 2018 |
| 20 | Our People, The Aryans (JB Campbell) | | 31 January 2018 |
| 21 | Resist | | September 2017; no longer active |
| 22 | Resistance Records | https://www.discogs.com/label/33630-Resistance-Records-3 | 27 March 2018 |
| 23 | Skinheads | See Hammerskins website | |
| 24 | Southside Antifa | http://southsideantifa.blogspot.co.uk/2010/06/white-supremacist-nordic-fest-event.htm | September 2017; no longer active |
| 25 | Stormfront | | 12 December 2017 |
| 26 | The Brotherhood of Klans Knights of the Ku Klux Klan | | 4 October 2017 |
| 27 | The Daily Stormer | https://dailystormer.name/ | 27 March 2018 |
| 28 | Third Reich Books | https://third-reich-books.com/ | 31 January 2018 |
| 29 | Tightrope | | 31 January 2018 |
| 30 | USLegal.com | http://definitions.uslegal.com/h/hate-crime/ | 18 March 2018 |
| 31 | Vanguard America | | 18 March 2018 |
| 32 | Vanguard News Network | | 1 October 2017 |
Appendix 2: Interviewees and discussants, 2006-2016
|
Name and position
|
Location and date
|
Citation identifier
|
1
|
Allen, Ruth, Head of Specialist Operational Support Child Exploitation and
Online Protection Centre (CEOP)
|
London, UK (18 April 2011).
|
B, M
|
2
|
Dr Atkinson, Robert D, President, Information Technology and Innovation
Foundation
|
Woodrow Wilson Center, Washington DC, USA (3 April 2008).
|
A
|
3
|
Beer, Rosa, Policy Advisor, Home Office
|
London, UK (19 April 2010).
|
I
|
4
|
Rep. Boucher, Rick, Rayburn House
|
Washington DC, USA (16 April 2008).
|
A, , J
|
5
|
Castro, Daniel, Senior Analyst, Information Technology and Innovation
Foundation
|
Washington DC, USA (9 May 2008).
|
C, H
|
6
|
Dr Conway, Maura, School of Law and Government, Dublin City
University
|
Wilton Park, UK (31 January 2011).
|
B, F
|
7
|
Rabbi Cooper, Abraham, Associate Dean, Director Global Social Action Agenda
of the Simon Wiesenthal Center
|
Jerusalem, Israel (17 December 2009).
|
B, C, D, E, F, G, H, L
|
8
|
Corn-Revere, Robert, Davis Wright Tremaine LLP, Woodrow Wilson Center
|
Washington DC, USA (15 November 2007).
|
A
|
9
|
Downs, Juniper, Global Head of Public Policy and Government Relations,
YouTube
|
Jerusalem, Israel (14 May 2015).
|
A
|
10 | Feld, Harold, Senior Vice President, Media Access Project | Washington DC, USA (18 October 2007) | A
|
11 | Dr Firestone, Charles M, Executive Director, Communications & Society Program, The Aspen Institute | Washington DC, USA (11 April 2008) | C
|
12 | Foxman, Abraham H, National Director, Anti-Defamation League | Jerusalem, Israel (13 May 2015) | C, H
|
13 | Galligan, Mary E, FBI Chief Inspector | Woodrow Wilson Center, Washington DC, USA (20 June 2008) | B
|
14 | Professor Ganor, Boaz, The Interdisciplinary Center (IDC) | Herzliya, Israel (10 September 2013) | B, M
|
15 | Giannasi, Paul, Head of the Cross-Government Hate Crime Programme, Ministry of Justice, England | Hull, England (16 December 2015), and Limerick, Ireland (25 May 2016) | B, C, E, F, G, H
|
16 | Harris, Leslie, President/CEO, Center for Democracy & Technology | Washington DC, USA (14 March 2008) | A, J
|
17 | Henry, Shawn, Deputy Assistant Director, Cyber Division, Federal Bureau of Investigation | Washington DC, USA (26 March 2008) | B, C, M
|
18 | Lauter, Debora M, National Civil Rights Director, Anti-Defamation League (ADL) | New York, USA (22 March 2010) | B, C, K, L
|
19 | Professor Leitner, Peter, National Center for Biodefense | Woodrow Wilson Center, Washington DC, USA (7 April 2008) | I
|
20 | Dr Linn, Herb, National Academy of Sciences | Washington DC, USA (15 May 2008) | G
|
21 | Marcus, Brian, former Director of Internet Monitoring | Washington DC, USA (16 April 2008) | B, C, D, E, F, G, H, K, L
|
22 | Matas, David, Senior Legal Counsel, B'nai Brith Canada | Washington DC, USA (16 July 2008) | B, C, E, F, H
|
23 | McFarlane, Bruce, Detective Senior Constable, Victoria Police Force of Australia; Global Terrorism Research Centre | Wilton Park, UK (1 February 2011) | B, I, M
|
24 | Milner, Simon, Director of Policy for UK, Middle East and Africa, Facebook | London, UK (12 July 2015) | A, C
|
25 | Morris, John, The Center for Democracy and Technology | Woodrow Wilson Center, Washington DC, USA (7 February 2008) | A, J
|
26 | Mudd, Philip, Associate Executive Assistant Director, National Security Branch, Federal Bureau of Investigation | Woodrow Wilson Center, Washington DC, USA (25 March 2008) | B, H, I, M
|
27 | Professor Nelson, Michael, former IBM Director, Internet Technology and Strategy | Woodrow Wilson Center, Washington DC, USA (31 January 2008) | A
|
28 | Nojeim, Gregory T, Senior Counsel and Director, Project on Freedom, Security & Technology, Center for Democracy & Technology | Washington DC, USA (14 March 2008) | A
|
29 | Patel, Nisha, Senior Research Officer, Home Office | London, UK (19 April 2010) | I
|
30 | Potok, Mark, Southern Poverty Law Center | Toronto, Canada (11 September 2006) | B, C, E, F, G, H, L
|
31 | Rawls, W Lee, USA Department of Justice | Woodrow Wilson Center, Washington DC, USA (26 March 2008) | B
|
32 | Rotenberg, Marc, President of the Electronic Privacy Information Center | Washington DC, USA (2 May 2008) | A
|
33 | Schmidt, Philippe A, Chairman, INACH (International Network Against Cyber Hate) | Paris, France (11 October 2011) | C, F, G, H
|
34 | Schwartzman, Andrew Jay, President and CEO, Media Access Project | Washington DC, USA (18 October 2007) | A
|
35 | Segal, Oren, Director, Islamic Affairs, Anti-Defamation League (ADL) | New York, USA (22 March 2010) | B, K, L, M
|
36 | Sheinberg, Steven C, Associate Director, Legal Affairs, Anti-Defamation League (ADL) | New York, USA (22 March 2010) | B, K, L
|
37 | Professor Swire, Peter, former Chief Counselor for Privacy in the Office of Management and Budget, USA Government | Woodrow Wilson Center, Washington DC, USA (6 February 2008) | A, C
|
38 | Dr Thierer, Adam, Senior Fellow & Director, Center for Digital Media Freedom, The Progress and Freedom Foundation | Washington DC, USA (15 January 2008) | A, J
|
39 | Dr Vegar, Jose | Lisbon, Portugal (29 October 2009) | B
|
40 | Vick, Jonathan, Internet Technology Analyst, Anti-Defamation League (ADL) | New York, USA (22 March 2010) | B, D, E, F, G, H, K, L
|
41 | Warman, Richard, Canadian Human Rights Commission and human rights lawyer | Toronto, Canada (10 September 2006) | B, E, F, H, L
|
42 | Weisburd, Aaron, Internet Haganah | Washington DC, USA (24 December 2008) | B, I, M
|
43 | Whine, Mike, Director, Government & International Affairs, Community Security Trust (CST), England | Limerick, Ireland (24 May 2016) | B, F, L
|
44 | Professor Wodak, Ruth, Emeritus Distinguished Professor of Discourse Studies at Lancaster University | Hull, England (4 November 2015) | B, C, E, F, G, H, L
|
45 | Wolf, Christopher, Chair of the Internet Task Force of the Anti-Defamation League | Washington DC, USA (19 October 2007) and Berkeley, CA, USA (5 June 2009) | B, C, D, E, F, G, H, L
|
46 | Woodley, Kevin, Public Affairs and Communication Services Directorate, the Royal Canadian Mounted Police | Washington DC, USA (22 July 2008) | I
|
[1] I am grateful to Abraham Cooper, Harvey Goldberg, Oren Segal, Stephanos Stavros, Jonathan Vick, Richard Warman and Chris Wolf for important information. I thank my interviewees for their time and many valuable insights.
[2] ‘Alt-right’ or ‘alternative right’ is defined by Urban Dictionary (2017) (at https://www.urbandictionary.com/define.php?term=alt-right, accessed 27 March 2018) as ‘a name embraced by some white supremacists and white nationalists to refer to themselves and their ideology, which emphasizes preserving and protecting the white race in the United States in addition to, or over, other traditional conservative positions such as limited government, low taxes and strict law-and-order. The movement is a mix of racism, white nationalism and populism. Its members criticize multiculturalism, feminists, Jews, Muslims, gays, immigrants and other minorities. They reject the American democratic ideal’.
[3] Net neutrality is the basic principle that prohibits Internet service providers from speeding up, slowing down or blocking any content, applications or websites.
URL: http://www.austlii.edu.au/au/journals/IntJlCrimJustSocDem/2018/15.html