Author: cornerstone

What Happened to High Rankings?

High Rankings was a search engine optimization (SEO) consulting and training company based in Boston, Massachusetts. In an online world where people will do almost anything to rank high on search engines, High Rankings made a simple promise to its clients: “We use only legitimate, honest methods to get you in the top listings under the most relevant keywords for your pages” (Source).

After publishing for about 20 years, High Rankings went offline. What happened to a firm that once claimed to have “grown to be one of the pre-eminent SEO companies in the United States” (Source)? We took some time to unravel this mystery and to identify the eight most important and influential articles that Whalen’s site ever published.

The History of High Rankings

High Rankings was established in 1995 by Jill Whalen. Whalen’s journey into the SEO field started when she was a stay-at-home mom raising three children. She started an online parenting chat room, which became so popular that it required a website to house the forums.

When the site was created, Whalen started learning how to code and how to attract traffic. She began by analyzing search listings for the words “parenting chat” and then incorporated the terms she discovered into her pages. This would be the beginning of a journey that later led to her being called “a pioneer in the SEO website marketing industry” (Source).

Whalen has since retired and now spends her time working on her blog, “What Did You Do With Jill.” She says that her new mission is “to teach as many people as [she] can about how life truly works so that they can live their own lives more fully and happily.” Whalen is also the author of the book “Victim of Thought: Seeing Through the Illusion of Anxiety.”

The Services

In its early years, High Rankings assisted clients by checking search engines once a week to ensure that the pages clients submitted were indeed indexed. At that time, checking that your pages were indexed was a time-consuming task (Source). High Rankings advised clients to closely monitor the indexing of their pages because “the search engines [seem] to add and remove pages at will” (Source). The company grew over the years to offer other services, including website reviews that explained why a site was not doing well and recommended what needed to be done to meet the needs of its visitors.

For those looking for a quick-fix SEO solution, High Rankings was honest: “we’re not a good fit.” The site was clear about whom it wanted to work with: “If you’d like to be educated about SEO best practices that will keep your website optimized long after we’ve done our job–then we should talk” (Source). 

The 8 Most Influential Articles on High Rankings

High Rankings may no longer be publishing, and Whalen may have moved on to other things, but some of the topics the website focused on are still relevant to this day. We took some time to find the most influential articles published on the website.

The list presented below is based on the number of links an article received from other websites, editors, and writers. We believe that the real staying power of an article is its ability to attract attention from the online community.

1. Ten Tips to the Top of the Search Engines

Whalen starts this article by noting that “Having a website that gets found in Google, Yahoo, and Bing isn’t hard to do, but it can be difficult to know where to begin.” She then presents what she called her “latest and greatest tips to get you started.” 

Even though this article may have been written over a decade ago, some of the tips it contains are still relevant. For instance, she advised website owners to optimize their content for the target audience and not search engines. She advises that when you know who your audience is, you will likely create the type of content that the audience will be searching for (Source). 

Another piece of advice that still seems to be useful today is that you should “make sure your site is ‘link-worthy.’” This is important because “other sites linking to yours is a critical component of a successful search engine optimization campaign, as all of the major search engines place a good deal of emphasis on your site’s overall link popularity” (Source).   

2. The Meta Description Tag

In this piece, Whalen discusses why a Meta description is a critical component of your SEO and overall marketing strategy. She starts by noting that the “Meta description tag may not affect your page’s ranking in the search engines, but this tag can still come in handy in your overall SEO and social media marketing campaigns.”

Whalen then describes the Meta description as a “snippet of HTML code that belongs inside the <Head> </Head> section of a web page.” She then provides some reasons why it may be useful for your SEO and general marketing strategy: 

  • It can provide information about the contents of a page and tell a visitor whether it contains what they are looking for.
  • Sometimes, search engines use a part of your Meta description when they show “extended sitelinks” for a website. 
  • Social media sites, such as Facebook and Google+, use it as the default description when a page is shared.
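As an illustration of how the tag sits in a page’s head and how a crawler or social site might read it, here is a minimal sketch using Python’s standard-library `html.parser` (the example page, class name, and content are invented for illustration):

```python
# Minimal sketch: extract the <title> text and the meta description
# from a page's <head>, the two tags discussed in Whalen's articles.
from html.parser import HTMLParser

class HeadTagParser(HTMLParser):
    """Collects the <title> text and the meta description of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html_page = """
<html><head>
<title>Widget Store - Hand-Made Widgets</title>
<meta name="description" content="Hand-made widgets shipped worldwide.">
</head><body>...</body></html>
"""

parser = HeadTagParser()
parser.feed(html_page)
print(parser.title)             # Widget Store - Hand-Made Widgets
print(parser.meta_description)  # Hand-made widgets shipped worldwide.
```

This is roughly what a search engine or a social site does when it falls back on the meta description as a snippet or default share text.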

3. All About Title Tags

If there is one thing that many webmasters will agree with, it is the sentence: “The title tag has been – and probably will always be – one of the most important factors in achieving high search engine rankings.” If done properly, title tags should be an accurate and concise description of the content on a page.  

Whalen advises that when struggling with rankings because people are simply not clicking through, “fixing just the title tags of your pages can often generate quick and appreciable differences to your rankings” (Source).

4. Can Meta Tags Such as the Keyword Tag Bring High Rankings to My Site?


Although Whalen’s advice to include meta keywords is largely seen as outdated in 2020, her advice to “Stop worrying about Meta Tags and focus on what really matters” is still timely.

So, what really matters, according to Whalen? 

  • Choose your relevant keywords.
  • Write the site’s content based on these keywords.
  • Create a title tag using the same keywords.
  • Create a Meta description tag as a marketing sentence, also based on these keywords. 
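The four steps above can be sketched as a small helper (a hypothetical illustration, not code from Whalen’s article): given chosen keywords and a marketing sentence, emit the title tag and meta description tag for the page’s head.

```python
# Hedged sketch of Whalen's workflow: keywords first, then a title tag
# and a meta description tag derived from them. Names are illustrative.
from html import escape

def build_head_tags(keywords, marketing_sentence):
    """Build a <title> tag and meta description tag from chosen keywords."""
    title = " - ".join(k.title() for k in keywords)
    description = escape(marketing_sentence, quote=True)
    return (f"<title>{escape(title)}</title>\n"
            f'<meta name="description" content="{description}">')

print(build_head_tags(
    ["hand-made widgets", "widget store"],
    "Hand-made widgets shipped worldwide from our Boston widget store."))
```

The page copy itself would then be written around the same keywords, keeping the content, title, and description consistent.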

5. Getting into Google 

In this article, Whalen reports on an evening at the third SEMNE (Search Engine Marketing New England) event. She provides details of a presentation by Dan Crow, then a director of crawl systems at Google. Whalen reports that Crow was “spilling the beans about how to get your site into Google.”

Some of the advice from Crow was: 

  • Besides having high-quality links, building a site with unique content can be a significant milestone towards being indexed and ranked. 
  • The Webmaster Central Tool (today called Google Search Console) plays a vital role in providing information about a particular site, including backlinks and the keyword phrases that people have used to find each page of a website. 

6. Nitty-gritty of Writing for the Search Engines 

This article features the book titled Nitty-gritty of Writing for the Search Engines by Jill Whalen. The book’s central idea is, “To successfully rank highly in the search engines, the words on your Web pages should never be an afterthought but a major investment in your search engine optimization campaign.”  

7. 16 SEO Tactics that WILL NOT Bring Targeted Google Visitors

In this article, Whalen lists some tactics that she says are “done to websites in the name of SEO that, in reality, have no bearing on it.”

Some of the tactics Whalen names include keyword-stuffed content, linking to Google and other popular websites (“it’s the links pointing to your pages from other sites that help you with SEO, not the pages you’re linking out to”), and making insignificant changes to freshen content (Source).   

8. Link Popularity

The message in this article is simple: “If other sites are linking to your site, it must be a winner; therefore, it deserves a boost in the rankings… People link to good sites, not bad ones.” 

Whalen also uses the rest of the article to explain the difference between link popularity and PageRank. She also explains how link popularity works, where reciprocal links come from, and whether owners of websites should care about link popularity.  

Given the Popularity of the Site, What Happened to High Rankings?

Whalen announced in 2013 that she was “moving on from SEO.” She believed that Google’s Panda and Penguin algorithm updates had finally removed the possibility of gaming the search engine results, and that most SEO consultants were now advising clients simply to create the great content their customers wanted, a strategy she had been recommending for nearly two decades. High Rankings went offline sometime in 2015.

What Happened to DMOZ?

DMOZ was a website that provided a human-curated directory of many of the most credible websites on the internet. This multilingual resource grouped websites in the same niche into categories. A group of volunteers, a community that later came to be called the Open Directory Project (ODP), worked on the site. The project closed in March 2017 when its biggest funder, America Online (AOL), indicated that it no longer wished to support it. We traced the history of the project to determine what happened to it.

Why DMOZ was Important to SEOs

When Google launched in 1998, every search engine faced a difficult problem: how do you tell high-quality content from low-quality content?

Some solutions scaled (meaning that the incremental cost of applying that solution to millions of pages was near zero) but were easy to game. One example was to calculate the keyword density of a particular keyword on a page, ranking articles with “natural” keyword densities higher than pages with “unnatural” keyword densities.
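The keyword-density signal described above can be sketched in a few lines (an illustrative example, not any engine’s actual formula), which also makes clear why it was so easy to game: a spammer only has to repeat the keyword.

```python
# Naive keyword-density signal: occurrences of a keyword divided by
# total word count. Cheap to compute at scale, trivial to game.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page = "Parenting chat rooms: join our parenting forum to chat about parenting."
print(round(keyword_density(page, "parenting"), 2))  # 0.27
```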

Other solutions provided excellent results and were difficult to hack, but didn’t scale at all.  One example was to pay humans to read all of the pages that targeted a particular keyword and rank order those results.

Computers are excellent at calculating keyword densities, but keyword density is poorly correlated with content quality. Humans are excellent at identifying quality content but, relative to the cost and computational power of a computer, ridiculously expensive. Google needed a hack that allowed it to harness the power of humans without having to pay them.

Google’s solution, of course, was elegant: when a webmaster chose to link to another resource on the web, why not treat that link as a vote for the quality of that page? The more votes that a particular website acquired, the higher the value of the votes that it cast in favor of other pages.
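The link-as-vote idea can be sketched as a simplified PageRank on a toy link graph (an illustration of the concept, not Google’s actual implementation): each page splits its score among the pages it links to, and a damping factor models a surfer who sometimes jumps to a random page.

```python
# Simplified PageRank: links are votes, and votes from highly-ranked
# pages are worth more. Assumes every page in the toy graph has outlinks.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "A" is linked to by both other pages, so it accumulates the most votes.
graph = {"A": ["B"], "B": ["A"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # A
```

Page C casts votes but receives none, so it ends up with the lowest score; the more (and the better-ranked) the pages voting for you, the higher you rank.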

It’s clear that this one insight, implemented well, gave Google its lead in the search engine race.

But, there were many other problems that humans were better at solving than were the computers and algorithms available in 1998.  For example:

  • What subject category should a particular website be placed into?
  • What was the best summary of a page?  (The meta description, which every savvy webmaster tried to optimize?)
  • And, most importantly, who could you trust to link to sites based on the merit of the sites, rather than on who was willing to pay the most?

DMOZ was important to Google because it allowed the company to answer these questions without having to hire humans to answer them. DMOZ was important to SEOs because it was important to Google.

To fully understand DMOZ’s role, you need to understand more about DMOZ itself.

The Early Days of DMOZ

According to a website that describes itself as the “insider’s look into what the search engines like and do not like,” DMOZ was an attempt to solve the frustrations caused by a similar web directory produced by Yahoo, which was known for having many broken links (Source). So, in June 1998, a new directory called GnuHoo went live. (GnuHoo was a hat tip to both the open-source movement and “Yahoo.”)

Initially, the GnuHoo process was disorganized. The system lacked clearly defined policies and proper management. The same source reports that “It was unclear just how sites were to be described or categorized, and general anarchy reigned” (Source). There was also no way of ensuring that unscrupulous editors did not join just to promote their own businesses.

GnuHoo also met some intellectual property challenges when the GNU project, an established organization that was then producing the free Unix-like GNU operating system, protested the use of the name GnuHoo. This forced GnuHoo to rebrand as NewHoo (Source). One report indicates that within a year of its establishment, NewHoo had around 400 editors, 3,900 categories, and 31,000 websites. Notwithstanding the initial disorganization at GnuHoo, the directory was gaining popularity, mainly because it had substantially reduced the amount of time required to index sites. Within the same year, the number of editors rose to around 8,000, the categories grew to 2,500, and approximately 430,000 sites were listed (Source).

The Middle Years

The moniker DMOZ was an abbreviation of the name Directory Mozilla. Writing for the British publication The Guardian, Glyn Moody called DMOZ “a Wikipedia written by experts.”

The experts that Moody refers to were volunteers. Their role was to select, evaluate, describe, and organize websites. Each entry went through a painstaking seven-stage process.

According to the site, there were no special requirements to become an editor. An archive of an early page indicates that all one needed was “an interest or passion and a computer.” However, editors had to ensure that they did not have any conflict of interest (Source).

Because Google trusted DMOZ’s volunteer curation process, the search engine would sometimes use the DMOZ descriptions as snippets in its results. This happened in instances where Google felt that the DMOZ description did a better job of describing the content of an article than the article’s meta description or its content (Source).

Acquisition by Netscape


As would be expected from a project that was scaling the way DMOZ was, the site soon attracted the attention of industry players. In October 1998, Netscape paid $1 million for the directory, promising to maintain its original character as a non-commercial entity. Soon after the purchase, Netscape changed the name of the project to the Open Directory Project (ODP) (Source).

Soon after taking over the directory, Netscape was itself acquired by America Online (AOL). AOL agreed to respect the Open Directory License. Writing for a website that calls itself the “portal for language professionals,” Jim Hedger says that the 1998 purchase could be seen as the “start of the ODP’s rise,” which resulted in it becoming the favored resource from which major search engines got their data (Source). Andrew Goodman, writing for another website, reports that the Netscape-owned directory would soon have almost 23,000 editors and 1.5 million sites in over 230,000 categories (Source).

Allegations of Inappropriate Conduct

As the ODP grew in popularity, controversy started to circle it. One accusation of improper behavior against the ODP was made in 2007 by website owner Jeremy Schoemaker. Schoemaker said that an individual claiming to be an editor at the ODP had attempted to extort $5,000 from him. In an article posted on his website, Schoemaker says he ignored the email but soon received another one telling him that his site was no longer listed. He claims that the same editor wrote to him again, advising him to reconsider not paying.

What Then Happened to DMOZ?

In early 2017, DMOZ posted the following note: “As of Mar 14, 2017, [DMOZ] will no longer be available.”

Beyond the obvious issue of cost, DMOZ had stopped being relevant. If, in 1998, DMOZ could far outperform computers at categorizing and summarizing sites, 19 years later that was no longer true. One website dealing with content categorization notes that DMOZ suffered from the arrival of new technologies such as artificial intelligence and machine learning, which made data analysis much faster without human intervention. The human-edited directory was simply no longer scalable as a business model (Source).
