Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if there happen to be links to those URLs somewhere on the Internet (such as in referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could ignore the instructions in your robots.txt. Finally, a curious user could examine the directories or subdirectories listed in your robots.txt file and guess the URL of the content that you don't want seen.
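To see why that last point matters, consider a hypothetical robots.txt (the domain and paths below are invented for illustration). The file sits at the site root and is readable by anyone, not just crawlers, so it effectively publishes a list of the locations you hoped to hide:

```text
# https://example.com/robots.txt — fetchable by any visitor
User-agent: *
Disallow: /internal-reports/
Disallow: /admin/
Disallow: /drafts/q3-announcement/
```

A compliant crawler will skip those paths, but a curious visitor can request them directly, and the server will happily serve them unless real access control (such as password protection) is in place.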
An SEO expert could probably use a combination of AdWords for the initial data, Google Search Console for website monitoring, and Google Analytics for internal website data. Then the SEO expert can transform and analyze the data using a BI tool. The problem for most business users is that's simply not an effective use of time and resources. These tools exist to take the manual data gathering and granular, piecemeal detective work out of SEO. It's about making a process that's core to modern business success more easily accessible to someone who isn't an SEO consultant or expert.
Over the past year or two, we've also seen Google begin to fundamentally alter how its search algorithm works. Google, as with many of the tech giants, has begun to bill itself as an artificial intelligence (AI) and machine learning (ML) company rather than as a search company. AI tools will provide a way to spot anomalies in search results and collect insights. In essence, Google is changing what it considers its crown jewels. As the company builds ML into its entire product stack, its core search product has begun to behave a lot differently. This is heating up the cat-and-mouse game of SEO and sending the industry chasing after Google once again.
For the purposes of our testing, we standardized keyword queries across the five tools. To test the primary ad hoc keyword search capability of each tool, we ran queries on an identical set of keywords. From there, we tested not only the kinds of data and metrics each tool provided, but also how it handled keyword management and organization, and what kinds of optimization recommendations and suggestions it offered.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
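As a concrete illustration (the page and keyword list here are invented), the keywords meta tag let a webmaster declare a page's topics directly in its HTML head, which is exactly what made it so easy to abuse:

```html
<head>
  <title>Acme Widgets</title>
  <!-- Self-reported topics; early engines trusted this text -->
  <meta name="keywords" content="widgets, discount widgets, widget store">
  <!-- Nothing stopped a webmaster from stuffing unrelated, high-traffic
       terms into this list to capture searches the page had no business
       ranking for -->
</head>
```

Because the tag's contents were invisible to human visitors but read by crawlers, there was no natural check on keyword stuffing, which is why modern engines largely ignore it.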
When it comes to finally choosing the SEO tools that suit your organization's needs, the decision comes back to that concept of gaining tangible ground. It's about discerning which tools provide the most effective combination of keyword-driven SEO investigation capabilities, and then on top of that, the added keyword organization, analysis, recommendations, and other useful functionality to take action on the SEO insights you uncover. If a product is telling you what optimizations need to be made to your website, does it then provide technology to help you make those improvements?
Depending on your topic or vertical and your geographic location, the search engines may report vastly different search volumes. The tools can only offer approximations. Exact search volumes are hard to pin down because of vanity searches, click bots, rank checkers, and other forms of automated traffic. Exceptionally valuable search terms may show far greater volume than they actually have because competitive commercial forces generate automated search traffic that inflates the numbers.
The caveat in all of this is that, in one way or another, most of the data and the rules governing what ranks and what doesn't (often on a week-to-week basis) come from Google. If you know where to find and how to use the free and freemium tools Google provides under the surface—AdWords, Google Analytics, and Google Search Console being the big three—you can do all of this manually. Much of the data that the ongoing position monitoring, keyword research, and crawler tools provide is extracted in one form or another from Google itself. Doing it yourself is a disjointed, meticulous process, but you can piece together all the SEO data you need to come up with an optimization strategy should you be so inclined.
Where the free Google tools can provide complementary value is in fact-checking. If you're checking out more than one of these SEO tools, you'll quickly realize this isn't an exact science. If you were to look at the PA, DA, and keyword difficulty scores across KWFinder.com, Moz, SpyFu, SEMrush, Ahrefs, AWR Cloud, and Searchmetrics for the same set of keywords, you might get different numbers across each metric separated by anywhere from a few points to dozens. If your business is unsure about an optimization campaign on a particular keyword, you can cross-check with data straight from a free AdWords account and Search Console. Another trick: Enable Incognito mode in your browser along with an extension like the free Moz Toolbar and you can run case-by-case searches on specific keywords to get an organic look at your target search results page.
Crawlers are largely a separate product category. There is some overlap with the self-service keyword tools (Ahrefs, for instance, does both), but crawling is another important piece of the puzzle. We tested several tools with these capabilities either as their express purpose or as features within a larger platform. Ahrefs, DeepCrawl, Majestic, and LinkResearchTools are all primarily focused on crawling and backlink tracking, meaning the inbound links coming to your site from other websites. Moz Pro, SpyFu, SEMrush, and AWR Cloud all include domain crawling or backlink tracking features as part of their SEO arsenals.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
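For example (using invented domains), each host needs its own file at its own root, because the rules in one file never carry over to another subdomain:

```text
# https://www.example.com/robots.txt — applies only to www.example.com
User-agent: *
Disallow: /search-results/

# https://blog.example.com/robots.txt — the subdomain needs its own file
User-agent: *
Disallow: /drafts/
```

If blog.example.com had no robots.txt of its own, crawlers would treat the entire subdomain as open, regardless of what the main site's file says.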
The emphasis on tools, meaning plural, is important because there's no one magical way to plop your website atop every single search results page, at least not organically, though there are best practices to do so. If you want to buy a paid search ad spot, then Google AdWords will happily take your money. This will certainly put your website at the top of Google's search results but always with an indicator that yours is a paid position. To win the more valuable and customer-trusted organic search spots (meaning those spots that start below all of those marked with an "Ad" icon), you must have a balanced and comprehensive SEO strategy in place.
"Organic search" pertains to how visitors arrive at a website from running a search query (most notably on Google, which holds 90 percent of the search market, according to StatCounter). Whatever your products or services are, appearing as close as possible to the top of search results for your specific business has become a critical objective. Google continuously refines, and to the chagrin of search engine optimization (SEO) managers, revises its search algorithms, employing new techniques and technologies including artificial intelligence (AI) to weed out low-value, poorly created pages. This creates monumental challenges in maintaining an effective SEO strategy and good search results. We've looked at the best tools to let you optimize your website's placement within search rankings.
Another excellent guide is Google’s “Search Engine Optimization Starter Guide.” This is a free PDF download that covers basic tips that Google provides to its own employees on how to get listed. You’ll find it here. Also well worth checking out is Moz’s “Beginner’s Guide To SEO,” which you’ll find here, and the SEO Success Pyramid from Small Business Search Marketing.