Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, the Performance Report in Google Search Console shows you the top search queries your site appears for and the ones that led the most users to your site.
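If you want to pull those top queries programmatically, the Search Console API exposes the same Performance Report data. Here is a minimal sketch, assuming you have already set up OAuth credentials and verified the property; the site URL and date range below are placeholders:

```python
# Sketch: pull a site's top queries from the Search Console API.
# Requires the google-api-python-client package and valid credentials.
from googleapiclient.discovery import build

def top_queries(credentials, site_url="https://www.example.com/"):
    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": "2023-01-01",   # placeholder date range
            "endDate": "2023-03-31",
            "dimensions": ["query"],
            "rowLimit": 25,              # top queries by clicks
        },
    ).execute()
    # Each row carries clicks, impressions, CTR, and average position.
    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])
```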

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
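One quick way to act on that advice is to audit which pages lack a description at all. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder:

```python
# Sketch: flag pages that are missing a description meta tag.
import requests
from bs4 import BeautifulSoup

def missing_descriptions(urls):
    flagged = []
    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        # A missing tag or an empty content attribute both leave the
        # snippet entirely up to Google.
        if tag is None or not tag.get("content", "").strip():
            flagged.append(url)
    return flagged

print(missing_descriptions(["https://www.example.com/"]))
```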
There are three types of crawling, all of which provide useful data. Internet-wide crawlers are for large-scale link indexing. It's a complicated and often expensive process but, as with social listening, the goal is for SEO experts, business analysts, and entrepreneurs to be able to map how websites link to one another and extrapolate larger SEO trends and growth opportunities. Crawling tools generally do this with automated bots continuously scanning the web. As is the case with most of these SEO tools, many businesses use internal reporting features in tandem with integrated business intelligence (BI) tools to identify even deeper data insights. Ahrefs and Majestic are the two clear leaders in this type of crawling. They have invested more than a decade's worth of time and resources, compiling and indexing millions of crawled domains and billions of pages, respectively.
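To make the mechanics concrete, here is a toy version of such a bot: a breadth-first link walker that records which pages link to which. A real internet-wide crawler adds robots.txt handling, politeness delays, and distributed storage; this sketch assumes the requests and beautifulsoup4 packages, and the seed URL is a placeholder:

```python
# Sketch: breadth-first crawl that builds a small link graph.
from collections import deque
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

def crawl(seed, max_pages=100):
    seen, queue, link_graph = {seed}, deque([seed]), {}
    while queue and len(link_graph) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip unreachable pages
        links = {urljoin(url, a["href"])
                 for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)}
        link_graph[url] = links          # record who links to whom
        for link in links - seen:
            seen.add(link)
            queue.append(link)
    return link_graph                    # raw material for link indexing

graph = crawl("https://www.example.com/")
```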
Ultimately, we awarded our Editors' Choice to three tools: Moz Pro, SpyFu, and AWR Cloud. Moz Pro is the best overall SEO platform of the bunch, with comprehensive tooling across keyword research, position monitoring, and crawling on top of industry-leading metrics incorporated by many of the other tools in this roundup. SpyFu is the tool with the best user experience (UX) for non-SEO experts and the deepest array of ROI metrics, as well as SEO lead management for an integrated digital sales and marketing team. AWR Cloud rounds out the trio on the strength of its focused position monitoring and SERP tracking.
Steve Webb is an SEO audit specialist at Web Gnomes. He received his Ph.D. from Georgia Tech, where he published dozens of articles on Internet-related topics. Professionally, Steve has worked for Google and various other Internet startups, and he's passionate about sharing his knowledge and experiences with others. You can find him on Twitter, Google+, and LinkedIn.
The tool also provides links to price estimate tools from Google AdWords. That AdWords tool showed the bid necessary to rank #1 for 85% of queries, and roughly how much traffic you could expect AdWords to send you based on that bid price and ad position, though, as mentioned above, Google has obfuscated this data in its interface for everyone but longtime AdWords advertisers.

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see differently sized snippets depending on how and where they search) and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
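If you maintain descriptions for many pages, a rough length check can flag obvious outliers. The 50-160 character window below is a common rule of thumb, not a Google limit; actual truncation is by pixel width and varies by device, as noted above:

```python
# Sketch: heuristic length check for already-written descriptions,
# given a mapping of URL -> description text.
def description_warnings(descriptions):
    warnings = {}
    for url, text in descriptions.items():
        text = text.strip()
        if len(text) < 50:
            warnings[url] = "likely too short to be informative"
        elif len(text) > 160:
            warnings[url] = "likely to be truncated in the snippet"
    return warnings

print(description_warnings({
    "https://www.example.com/": "A short demo description.",
}))
```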
SEO platforms are leaning into this shift by emphasizing mobile-specific analytics. Desktop and mobile now show you different results for the same search: mobile results often pull key information into mobile-optimized "rich cards," while on desktop you'll see snippets. SEMrush splits its desktop and mobile indexes, even providing thumbnails of each page of search results depending on the device, and other vendors, including Moz, are beginning to do the same.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
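One way to reason about that hierarchy is to sketch it as a tree before building any pages. The structure below is purely illustrative, with made-up slugs following the root page -> related topic listing -> specific topic pattern:

```python
# Sketch: model the site hierarchy as a nested dict, then print each
# page's breadcrumb path for review.
site = {
    "": {                       # root page
        "football": {           # related-topic listing page
            "playoffs": {},     # specific-topic pages
            "fifa-history": {},
        },
        "products": {           # category page
            "cleats": {},       # subcategory pages
            "jerseys": {},
        },
    },
}

def breadcrumbs(tree, trail=()):
    for slug, children in tree.items():
        path = trail + ((slug,) if slug else ())
        print("/" + "/".join(path))     # e.g. /football/playoffs
        breadcrumbs(children, path)

breadcrumbs(site)
```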

LinkResearchTools makes backlink tracking its core mission and provides a wide swath of backlink analysis tools. LinkResearchTools and Majestic provide the best backlink crawling of this bunch. Aside from these two backlink powerhouses, many of the other tools we tested, such as Ahrefs, Moz Pro, Searchmetrics, SEMrush, and SpyFu, also include solid backlink tracking capabilities.
We concentrated on the keyword-based aspects of the SEO tools that include these capabilities, because that's where most business users will primarily focus. Monitoring particular keywords and your existing URL positions in search rankings is important but, once you've set that up, it's largely an automated process. Automated position-monitoring features are a given in most SEO platforms, and most will alert you to issues, but they don't actively improve your search position. In tools such as AWR Cloud, Moz Pro, and Searchmetrics, though, position monitoring can become a proactive process that feeds back into your SEO strategy, spurring further keyword research and targeted site and competitor domain crawling.
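The proactive part is often just a small comparison step layered on top of the automated data: diff two rank snapshots and alert on meaningful drops. A sketch, assuming keyword-to-rank dictionaries from whatever tracker you use:

```python
# Sketch: compare two snapshots of keyword -> rank and surface drops
# worth investigating (a drop of `threshold` positions or more).
def rank_alerts(previous, current, threshold=3):
    alerts = []
    for keyword, old_rank in previous.items():
        new_rank = current.get(keyword)
        if new_rank is None:
            alerts.append(f"'{keyword}' fell out of tracked results")
        elif new_rank - old_rank >= threshold:
            alerts.append(f"'{keyword}' dropped {old_rank} -> {new_rank}")
    return alerts

print(rank_alerts({"football playoffs": 4}, {"football playoffs": 9}))
```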
Depending on your topic or vertical and your geographic location, the search engines may show vastly different search volumes. Any tool can offer only approximations: exact search volumes are hard to pin down because of vanity searches, click bots, rank checkers, and other forms of automated traffic. Exceptionally valuable search terms may also appear to have far greater volume than they actually do, because competitive commercial forces inflate those numbers with automated search traffic.
In the enterprise space, one major trend we're seeing lately is data import across the big players. Much of SEO involves working with the data Google gives you and then filling in all of the gaps. Google Search Console (formerly Webmaster Tools) gives you only a 90-day window of data, so enterprise vendors, such as Conductor and Screaming Frog, are continually adding and importing data sources from other crawling databases (such as DeepCrawl's). They combine that with Google Search Console data for more accurate, ongoing Search Engine Results Page (SERP) monitoring and position tracking on specific keywords. SEMrush and Searchmetrics (in its enterprise Suite packages) offer this level of enterprise SERP monitoring as well, which can give your business a higher-level view of how you're doing against competitors.
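Conceptually, that import boils down to merging a short Search Console history with a longer third-party one, keyed by keyword and date. A simplified sketch with made-up data, where each source maps a keyword to (date, position) rows:

```python
# Sketch: extend Search Console's 90-day window with a longer history
# from a third-party rank database.
def merge_histories(search_console, third_party):
    merged = {}
    for source in (third_party, search_console):
        for keyword, rows in source.items():
            merged.setdefault(keyword, {}).update(dict(rows))
    # Search Console values win on overlapping dates (applied last);
    # sort each series by date for charting.
    return {kw: sorted(points.items()) for kw, points in merged.items()}

print(merge_histories(
    {"fifa": [("2023-03-01", 7)]},                        # Search Console
    {"fifa": [("2022-11-01", 12), ("2023-03-01", 8)]},    # third party
))
```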
For example, within the HubSpot Blogging App, users will find as-you-type SEO suggestions. This helpful inclusion serves as a checklist for content creators of all skill levels. HubSpot customers also have access to the Page Performance App, Sources Report, and the Keyword App. The HubSpot Marketing Platform will provide you with the tools you need to research keywords, monitor their performance, track organic search growth, and diagnose pages that may not be fully optimized.
Mike Levin is the Senior SEO Director at Ziff Davis, PCMag's parent company. His career goes back 25 years to the halls of Commodore Computers, as an original Amiga fanboy beamed up by the mothership just as it imploded. Over his past 10 years in NYC, Mike's highlights have included leading the Apple Store, Kraft, and JCPenney SEO accounts.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]