Neo-Nazi website the Daily Stormer went offline on Tuesday after its domain registrations were revoked by GoDaddy and Google, which said the site, used to help organize the violent weekend rally in Charlottesville, Va., had violated their terms of service.
Users could not access the site from about 10 cities in North America, Europe and Asia. "We're having an outage," said one error message, which directed users to the Daily Stormer blog. Other messages said the site's server could not be found.
The efforts by GoDaddy, Alphabet's Google and several other firms that said they stopped providing services to Daily Stormer are part of broad moves by the tech industry to more actively police online hate speech and incitements to violence.
Andrew Anglin, the founder of Daily Stormer, has not responded to requests to comment on the companies' actions or to discuss plans for getting the site back online.
A 32-year-old woman was killed and 19 people were injured at the weekend's white nationalist rally when a car plowed into a crowd of counter-protesters.
Tech companies take action
GoDaddy, which manages internet names and registrations, disclosed late on Sunday via Twitter that it had given Daily Stormer 24 hours to move its domain to another provider, saying it had violated GoDaddy's terms of service.
After GoDaddy revoked Daily Stormer's registration, the website turned to Alphabet's Google Domains. But by Tuesday, the site's registration had been revoked there, too.
Internet companies have increasingly found themselves in the crosshairs over hate speech and other volatile social issues, with politicians and others calling on them to do more to police their networks while civil libertarians worry about the suppression of free speech.
Twitter, Facebook, Google's YouTube and other platforms have ramped up efforts to combat the social media efforts of Islamic militant groups, largely in response to pressure from European governments. Now they are facing similar pressures in the United States over white supremacist and neo-Nazi content.
Facebook confirmed on Monday that it took down the event page that was used to promote and organize the "Unite the Right" rally in Charlottesville. Facebook allows people to organize peaceful protests or rallies, but the social network said it would remove such pages when a threat of real-world harm and affiliation with hate organizations becomes clear.
"Facebook does not allow hate speech or praise of terrorist acts or hate crimes, and we are actively removing any posts that glorify the horrendous act committed in Charlottesville," the company said in a statement.
'They are inciting violence'
Several other companies also took action. Canadian internet company Tucows stopped hiding the domain registration information of Anglin, the site's founder. Tucows, which had been providing the website with services masking Anglin's phone number and email address, said Daily Stormer had breached its terms of service.
"They are inciting violence," said Michael Goldstein, vice-president for sales and marketing at Tucows, a Toronto-based company. "It's a dangerous site and people should know who it is coming from."
Discord, a 70-person San Francisco company that allows gamers to communicate across the internet, did not mince words in its decision to shut down the server of Altright.com, an alt-right news website, and the accounts of other white nationalists.
"We will continue to take action against white supremacy, Nazi ideology, and all forms of hate," the company said in a tweet Monday. Altright.com did not respond to a request for comment.
Meanwhile, Twilio chief executive Jeff Lawson tweeted Sunday that the company would update its use policy to prohibit hate speech. Twilio's services allow companies and organizations, such as political groups or campaigns, to send text messages to their communities.
Cracking down on hate
Internet companies, which enjoy broad protections under U.S. law for the activities of people using their services, have mostly tried to avoid being arbiters of what is acceptable speech.
But the ground is now shifting, said one executive at a major Silicon Valley firm. Twitter, for one, has moved sharply against harassment and hate speech after enduring years of criticism for not doing enough.
Facebook is beefing up its content monitoring teams. Google is pushing hard on new technology to help it monitor and delete YouTube videos that celebrate violence.
All this comes as an influential bloc of senators, including Republican Sen. Rob Portman and Democratic Sen. Richard Blumenthal, is pushing legislation that would make it easier to penalize operators of websites that facilitate online sex trafficking of women and children.
Despite the non-controversial nature of its espoused goal, that measure was met with swift, co-ordinated opposition from tech firms and internet freedom groups, which fear that legal liability for users' postings would be a devastating blow to the internet industry.