Tag Archives: Black Hat SEO

AI and reputation management

The newly released Justice Department files on Jeffrey Epstein contain something that should concern every executive, communications professional, and anyone who relies on their established name to do business: a detailed, years-long record of a reputation management campaign built entirely on deception.

And ultimately, it failed.

According to a New York Times review of thousands of pages of emails and financial records released by the DOJ, Epstein began his push to rehabilitate his online image within a year of his 2009 release from jail following a conviction for sex crimes involving a minor. Within two hours of receiving a cold email promising to make the “crap that comes up on Google search on your name basically disappear,” he responded with one word: “Yes.”

What followed was a multi-year, multi-hundred-thousand-dollar campaign involving SEO experts, content writers in the Philippines, self-described hackers, and a revolving cast of fixers — all working to scrub his criminal past from Google, sanitize his Wikipedia entry, and manufacture a false persona as a philanthropist and intellectual.

New York Times reporters Tiffany Hsu and Ken Bensinger’s in-depth investigation into this ORM program is spot on about the dark side of the online reputation management industry. (For a look at the ethical practice of reputation management, check out my newly updated guide, Reputation Reboot: What Every Business Leader, Rising Star & VIP Needs to Know – 2026 AI Edition.)

The Light Side and the Dark Side

Online reputation management is a legitimate, valuable industry. Corporations, executives, public figures, and private individuals use it every day to ensure accurate information about them dominates their search results, to correct falsehoods, and to build a credible, authentic digital presence. Done right, it is a powerful tool for protecting something that, as I often tell clients, functions as real currency in today’s professional world.

But the documents reveal what Epstein’s team was doing was something else entirely. They built networks of fake Wikipedia editing accounts — known as “sock puppets” — to sneak changes past volunteer editors, who were catching and reversing their edits within 15 minutes. They manufactured fictitious websites and personas designed purely to fool search algorithms. They planted flattering articles in major publications that omitted any mention of his sex offender status. They called this work “pimping.”

As one legitimate ORM professional quoted in the Times put it: “This world has a light side and a dark side.” What Epstein’s crew was doing was “completely anathema” to ethical practice.

A Cautionary Tale with Real-World Consequences

Perhaps the most sobering part of the story is that the deception partially worked — for a while. MIT’s Media Lab accepted $750,000 in donations from Epstein between 2012 and 2017. A subsequent university investigation noted that edits to his Wikipedia page that softened the allegations against him may have influenced the decision to accept his money.

The manufactured reputation gave him enough cover to maintain relationships and access he should never have had. The human cost of that is incalculable.

But here is the other truth the documents make plain: it was never sustainable. No amount of money — and Epstein spent lavishly, constantly, and was still never satisfied — could permanently alter a reality that hadn’t changed. The Wikipedia editors kept coming back. Google kept surfacing the truth. His own emails show him writing, again and again: “Results still very bad.”

Reputation Cannot Be Manufactured

This is the core lesson every executive and organization should take from this story.

Reputation is not built online. It is reflected online. Your digital presence is a mirror of your actions, your conduct, and the truth of who you are. The most powerful thing legitimate online reputation management can do is ensure that mirror is accurate, complete, and favorable — not distorted, fabricated, or falsified.

When clients come to us after a reputational setback, the first question we ask is not “what do you want people to find?” but “what is true about you that isn’t being told?” That is where sustainable reputation work begins: with authentic accomplishments, genuine expertise, and honest communication. The same approach applies to personal branding, when clients want more substantive information about themselves online so prospective partners, investors, journalists and other pivotal figures can find it.

Black-hat tactics — fake reviews, sock puppet accounts, planted content, manufactured personas — may produce short-term results. But they introduce enormous legal, ethical, and reputational risk. And as the Epstein files demonstrate in painful detail, when the truth eventually surfaces, the gap between the manufactured image and reality only makes the damage worse.

What This Means for You

If you are an executive, business leader, or high-profile individual, this story is a useful reminder to ask some pointed questions about your own digital presence:

— What does your Google search actually say about you today?

— Is your Wikipedia page, if you have one, accurate — and are legitimate channels being used to maintain it?

— Are the people managing your online reputation operating transparently and ethically?

— Is your digital presence built on real content and genuine accomplishment, or on shortcuts that could unravel?

The Epstein files are an extreme case. But the underlying dynamics — the temptation to control one’s online narrative by any means necessary, the willingness to pay for shortcuts, the false sense of security that comes from temporarily buried search results — are not unique to him.

 
 
Reputational Risk of Being A Man

Recently, leaked documents obtained by Forbidden Stories revealed the inner world of Eliminalia, a Spanish reputation management company. Forbidden Stories and partners investigated the company’s manipulation tactics to remove public-interest information from the internet.

Interfor International, the investigative firm helmed by our Advisory Board member Don Aviv, blogged about Forbidden Stories’ findings. The excerpts below raise awareness of the dirty tactics used by some reputation management agencies, and why it may pay to steer clear of them. That’s especially true if they promise to remove online content, a challenge we have written about here (and here).

Weaponizing Data Protection Regulations

Those who have studied Eliminalia’s strategy identified a pattern. When an article that included unpleasant truths about one of their clients appeared, the company began by sending takedown requests to the journalist, usually through a team member employing a false persona. If the journalist refused to remove the article, Eliminalia went after hosting providers, often weaponizing laws such as the DMCA (Digital Millennium Copyright Act), a U.S. copyright law enacted in 1998, and the GDPR (General Data Protection Regulation), the EU’s data protection and privacy law, to push the provider to take down the material.

To exploit the DMCA, for example, they would copy an article, publish it on a third-party website with a falsified date earlier than the original’s, and then claim the real article infringed their copyright. Contesting a false DMCA claim is not easy, leading to long and expensive legal battles many journalists cannot afford.

Investigators found that if these methods did not work, Eliminalia would then try to hide the material through “deindexing,” which attempts to trick Google into dropping specific pages from its search results.

Throwing ‘Digital Atomic Bombs’

Eliminalia has used the strategy of “open redirects,” links that appear to drive traffic to legitimate websites but redirect to other fake sites.

At least 622 such websites have been identified. To make the sites appear legitimate, the company mixes content from real sources with positive information about individuals with the same names as their clients.

This method seems to have been successful at influencing Google’s search results, effectively making articles that include allegations against the company’s clients disappear, while replacing them with positive spin.
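To make the mechanics of the open-redirect trick concrete, here is a minimal sketch of how such a link works and how a site can defend against it. The domain names, the `next` parameter, and the `redirect_target` helper are all hypothetical, invented purely for illustration; they do not come from the Forbidden Stories reporting:

```python
from urllib.parse import urlparse, parse_qs

# A link like this displays a trusted domain, but the unvalidated "next"
# parameter forwards the visitor to a different, attacker-chosen site:
link = "https://trusted-news.example/redirect?next=https://fake-profile.example/"

def redirect_target(url, allowed_hosts):
    """Return the redirect destination only if its host is on an
    allow-list; otherwise refuse, closing the open-redirect hole."""
    params = parse_qs(urlparse(url).query)
    target = params.get("next", [None])[0]
    if target and urlparse(target).hostname in allowed_hosts:
        return target
    return None

# The fake destination is not on the allow-list, so it is rejected:
print(redirect_target(link, {"trusted-news.example"}))  # prints None
```

Sites that validate redirect destinations this way cannot be borrowed as launder-points for traffic, which is why open redirects depend on finding sites that skip the check.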

Now, Eliminalia and their clients are in the news, and many of their “removed” links and content are back on Google.

Eliminalia is far from the first reputation management firm to create fake news sites and post fabricated content on them. That practice dates to the onset of this industry, along with myriad ways of placing links where unwitting internet users will click them, thinking they are clicking on something else.

The problem with using reputation management providers who game the system with what are known as “black hat” methods is that their handiwork is often discovered and undone by Google. You also risk being identified as a client in investigative articles about them (see the 2020 Wall Street Journal article, Google Hides News, Tricked by Fake Claims).

For more insights, read The Washington Post’s article, Leaked files reveal reputation-management firm’s deceptive tactics. And our ultimate guide, The Essentials: Online Reputation Management FAQs.

 
 
Reputation Communications' online reputation management glossary

Algorithm

The formula search engines use to rank websites and determine whether they merit appearing on page 1 or elsewhere in search results.

Authenticity

The quality of being genuine; a valued quality among bloggers and the larger online community.

Astroturfing

Posting fake comments and reviews designed to look like genuine grassroots opinion.

Branded Content

Content that promotes and cultivates a rapport between a targeted audience and a brand’s products and/or services.

Black Hat SEO

Using unethical methods to attempt to raise the ranking of websites in search engine results.

Content

Information delivered in any medium, whether text, videos, podcasts or images. (When two or more media are juxtaposed it is described as “multimedia content.”)

Content Aggregator

A software or web application which collects, combines and publishes a range of syndicated web content (such as news headlines, blogs, podcasts and video blogs).

Content Farms

Companies which create low-quality Internet content with the goal of having their content rank highly in online searches.

Digital Assets

Online images, multimedia and textual content files.

Domain Squatting (also known as cyber squatting)

Registering or using a domain name with the intent to profit from the goodwill of a trademark belonging to someone else. The cyber squatter then offers to sell the domain, at an inflated price, to the person or company that owns the trademark.

Doxxing

Tracing someone or gathering information about an individual using sources on the Internet, then publishing their private information with malicious intent.

Forum

An online discussion site.

Link

A URL or linked text providing an instant connection to a different Web site or section of a Web site. A Web site’s page rank on Google (and other search engines) is influenced by the number of links pointing to it (“inbound links”) and the quality of the sites those links come from.

Linkbait

A marketing technique to increase a website’s popularity by providing content that entices visitors to include a link to the website at their own sites.

Link farms

A website created solely for the purpose of increasing the page rank of other sites with indiscriminate outbound links. Most search engines penalize sites connected to link farms.

Name space

A person or company’s name online.

Online audit

An assessment of a subject’s online image: typically a person, business or organization.

Online communities

Social networks where people communicate online. Also called “virtual communities.”

Online image

A subject’s online reputation. Mainly determined by the content appearing in top results in a Google (or other search engine) search of the subject’s name.

Online monitoring

Real-time monitoring of the information available about a person, business, organization or other topic on the Internet, including on social media.

Online reputation management

Establishing, improving and monitoring the publicly available online information about a business or individual.

Page rank

A continually changing value, assigned to a Web site or page by a complex algorithm, that determines its position in a search engine’s results. The higher the page rank, the more likely people are to find the site or page.
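The idea behind page rank can be illustrated with a short, simplified sketch of the original PageRank concept, in which each page splits its rank among the pages it links to. This is an illustration only, not Google’s actual ranking system, and the three-page link graph is invented:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: each page splits its rank evenly among its
    outbound links; the damping factor models random jumps to any page."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Invented graph: pages A and B each link to C; C links back to A.
# C collects the most inbound rank, so it ends up ranked highest.
ranks = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
```

The sketch shows why inbound links from well-ranked pages matter, and why link farms try to counterfeit them.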

Search engine optimization (SEO)

Strategically designing a Web site so it gains a higher page rank and consequently attracts more new visitors.

SEO-optimized

A website or page designed to be accessible to search engines, improving the chance that it will be found and ranked well in search results.

Social media

Online communication between people using a variety of platforms, including blogs, forums and Twitter.

Social network

A network of individuals connected through a social media platform such as Facebook.

Sock puppet

An email or social media account set up under a false identity to publish deceptive online content.

Transparency

Openness and sincerity in online communications.

Troll

A person who sows discord on the Internet by posting inflammatory, extraneous, or off-topic messages in an online community.

White Hat SEO

Search engine optimization techniques that involve no deception.

Viral Media

Content that attracts new viewers mainly through word-of-mouth in social networks and can result in significant and rapid visibility.

For more in-depth information, read The Essentials: Online Reputation Management FAQ.

 
 
Right to be Forgotten on Google

Google has just launched an attack on “fake news and problematic content,” including “rumors, urban myths, slurs or derogatory topics.” That is good news for anyone (and any organization) plagued by such issues. It is bad news for low-quality content, fake links and other tactics used to trick Google into suppressing as well as raising online content.

We like it. Now you, the consumer, can flag false, biased, offensive or inaccurate content that Google surfaces in its search suggestions. You can also include a note explaining why Google should remove it. Equally important, the Internet can potentially become a fairer playing field.

Last week Danny Sullivan, a leading search engine expert, wrote the defining explanation of what this means. These are excerpts from his article on Search Engine Land:

Google knows it has a search quality problem. It’s been plagued since November with concerns about fake news, disturbing answers and offensive search suggestions appearing at the top of its results. “Project Owl” is an effort by the company to address these issues, with three specific actions being announced today.

In particular, Google is launching:

  • a new feedback form for search suggestions, plus formal policies about why suggestions might be removed.
  • a new feedback form for “Featured Snippets” answers.
  • a new emphasis on authoritative content to improve search quality.

“Problematic searches” is a term I’ve been giving to situations where Google is coping with the consequences of the “post-truth” world. People are increasingly producing content that reaffirms a particular world view or opinion regardless of actual facts. In addition, people are searching in enough volume for rumors, urban myths, slurs or derogatory topics that they’re influencing the search suggestions that Google offers in offensive and possibly dangerous ways.

“These are problematic searches, because they don’t fall in the clear-cut areas where Google has typically taken action. Google has long dealt with search spam, where people try to manipulate its results outside acceptable practices for monetary gain. It has had to deal with piracy. It’s had to deal with poor-quality content showing up for popular searches.

“Problematic searches aren’t any of those issues. Instead, they involve fake news, where people completely make things up. They involve heavily-biased content. They involve rumors, conspiracies and myths. They can include shocking or offensive information. They pose an entirely new quality problem for Google, hence my dubbing them “problematic searches.”

Read his full article: Google’s ‘Project Owl’ — a three-pronged attack on fake news & problematic content.