When your business has an idea about a new search topic for which you think your content has the potential to rank highly, the ability to spin up a query and investigate it right away is key. Even more importantly, the tool should give you enough data points, guidance, and recommendations to confirm whether or not that particular keyword, or a related keyword or search phrase, is an SEO battle worth fighting (and, if so, how to win). We'll get into the factors and metrics to help you make those decisions a little later.
Larry Page and Sergey Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Google considered off-page factors (such as PageRank and hyperlink analysis) as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), which enabled it to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]

For traditional SEO, this has meant some loss of key real estate. On SERPs that once had 10 organic positions, it's not uncommon now to see seven organic search results below a Featured Snippet or Quick Answer box. Rather than relying on the PageRank algorithm for a specific keyword, Google search queries rely increasingly on machine learning (ML) algorithms and the Google Knowledge Graph to trigger a Quick Answer or pull a description into a snippet atop the SERP.
Some tools also provide links to price estimate tools from Google AdWords. The AdWords tool showed the necessary bid to rank #1 for 85% of queries, and roughly how much traffic you could expect AdWords to send you based on that bid price and ad position, though, as mentioned above, Google has obfuscated that data in its interface for everyone but longtime AdWords advertisers.

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose regards prominence more than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking desktop use: in October 2016, StatCounter analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[60] Google has been one of the companies capitalizing on the popularity of mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which let companies gauge how their website performs in search results and how user-friendly it is.


These cloud-based, self-service tools have plenty of other unique optimization features, too. Some, such as AWR Cloud and Searchmetrics, also do search position monitoring, which means tracking how your page is doing against popular search queries. Others, such as SpyFu and LinkResearchTools, have more interactive data visualizations, granular and customizable reports, and return on investment (ROI) metrics geared toward online marketing and sales goals. The more powerful platforms can sport deeper analytics on paid advertising and pay-per-click (PPC) SEO as well. At their core, though, all of these tools are rooted in their ability to perform on-demand keyword queries.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate," and tracks the web pages' index status.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine changed the way Google updated its index so that fresh content would show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
For example, within the HubSpot Blogging App, users will find as-you-type SEO suggestions. This helpful inclusion serves as a checklist for content creators of all skill levels. HubSpot customers also have access to the Page Performance App, Sources Report, and the Keyword App. The HubSpot Marketing Platform will provide you with the tools you need to research keywords, monitor their performance, track organic search growth, and diagnose pages that may not be fully optimized.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
Google used to make much of its ad hoc keyword search functionality available as well, but now the Keyword Planner is behind a paywall in AdWords as a premium feature. Difficulty scores are inspired by the way Google calculates its Competition Score metric in AdWords, though most vendors calculate difficulty using PA and DA numbers correlated with search engine positions, without AdWords data blended in at all. Search Volume is a different matter and is almost always lifted directly from AdWords. Keyword suggestions and related-keyword data, meanwhile, often come from Google's Suggest and Autocomplete application programming interfaces (APIs).
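To make that concrete, here is a minimal Python sketch of pulling suggestion data the way many tools do. The suggest endpoint below is Google's long-standing but unofficial and undocumented one, so treat the URL and parameters as assumptions that could change at any time.

```python
# Minimal sketch: fetch autocomplete suggestions for a seed keyword from
# Google's public suggest endpoint. This endpoint is unofficial and
# undocumented; the URL and parameters are assumptions that may change.
import requests

def get_suggestions(seed: str) -> list[str]:
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},  # "firefox" returns plain JSON
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: [seed, [suggestion1, suggestion2, ...]]
    return resp.json()[1]

if __name__ == "__main__":
    for suggestion in get_suggestions("keyword research"):
        print(suggestion)
```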
On the voice and natural language side, it's all about FAQs (frequently asked questions). Virtual assistants and smart home devices have made voice recognition and natural language processing (NLP) not only desirable but an expected search vector. To predict how to surface a business's results in a voice search, SEO professionals now need to concentrate on ranking for the common NL queries around target keywords. Google's Quick Answers exist to give its traditional text-based search results an easy NL component to pull from when Google Assistant is answering questions.

Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.

The Java program is fairly intuitive, with easy-to-navigate tabs. Additionally, you can export any or all of the data into Excel for further analysis. So say you're using Optify, Moz, or RavenSEO to monitor your links or rankings for specific keywords -- you could simply create a .csv file from your spreadsheet, make a few adjustments for the proper formatting, and upload it to those tools.
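As a rough illustration of that reformatting step, here is a short Python sketch. The column names ("Keyword", "Position", "URL") and the target field names are hypothetical stand-ins for whatever your export and destination tool actually use.

```python
# Illustrative sketch: reshape a keyword-ranking export into the CSV layout
# another tool expects before upload. All column names here are hypothetical;
# substitute the headers your source export and target tool actually use.
import csv

with open("export.csv", newline="", encoding="utf-8") as src, \
     open("upload.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=["keyword", "rank", "landing_page"])
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "keyword": row["Keyword"].strip().lower(),
            "rank": row["Position"],
            "landing_page": row["URL"],
        })
```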
If you want to build a real-time, multitasking plagiarism detection system into your website, then we have your back. The Plagiarism Checker API offers a great API integration solution. It completely eliminates the need to check each and every article for every student individually, saving you hours upon hours of work and headache. You can check plagiarism for multiple essays, theses, or assignments of your students in just one click. It also works great for big websites that frequently accept dozens of articles from contributors.
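As an illustration of what such a batch integration might look like, here is a hedged Python sketch. The endpoint URL, auth scheme, and response field below are placeholders, not the vendor's documented interface; consult the actual API docs for the real ones.

```python
# Hypothetical sketch of batch-checking submissions against a plagiarism API.
# The endpoint, auth header, and response field are placeholders, not the
# vendor's documented interface.
import requests

API_URL = "https://api.example.com/v1/check"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                      # placeholder credential

def check_batch(documents: dict[str, str]) -> dict[str, float]:
    """Map each document name to a (hypothetical) plagiarism score."""
    scores = {}
    for name, text in documents.items():
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"text": text},
            timeout=30,
        )
        resp.raise_for_status()
        scores[name] = resp.json()["plagiarism_score"]  # placeholder field
    return scores

essays = {"student_a.txt": "First essay text...", "student_b.txt": "Second essay text..."}
print(check_batch(essays))
```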
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
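Python's standard library makes the "well-behaved crawler" side of this contract easy to see. The sketch below checks robots.txt before fetching, which is exactly the step a rogue crawler skips; the URLs are example values.

```python
# robots.txt is advisory: compliant crawlers ask it for permission, but
# nothing stops a rogue client from fetching the page anyway. The URLs
# below are example values.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# A polite crawler checks before fetching; a non-compliant one never calls this.
print(rp.can_fetch("MyCrawler", "https://example.com/private/report.html"))
```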
If you are a locksmith or a pizza shop, mobile search ads that drive conversion-oriented calls are highly valuable. However, for businesses with more complex sales funnels, desktop visitors have a substantially higher visitor value than mobile phone users. In August 2016, TripAdvisor executives stated that their visitor values on desktop and tablet devices were similar, but cell phone visitors were worth only 30% to a third as much. Smaller businesses likely see an even steeper click-value discount on smartphones and other small mobile devices, where typing (and thus converting) is hard to do.
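A quick worked example of that discount, using the TripAdvisor ratio cited above; the desktop dollar value is a hypothetical figure for illustration.

```python
# Worked example of the mobile discount described above. The $2.00 desktop
# visitor value is hypothetical; the 30% to one-third ratio comes from the
# TripAdvisor figures cited in the text.
desktop_value = 2.00          # hypothetical value per desktop/tablet visitor
mobile_ratio = (0.30, 1 / 3)  # phone visitors worth 30% to one-third as much

low, high = (desktop_value * r for r in mobile_ratio)
print(f"Phone visitor value: ${low:.2f} to ${high:.2f}")  # $0.60 to $0.67
```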
SEO platforms are leaning into this shift by emphasizing mobile-specific analytics. Desktop and mobile now show you different things for the same search. Mobile results will often pull key information into mobile-optimized "rich cards," whereas on desktop you'll see snippets. SEMrush splits its desktop and mobile indexes, even providing thumbnails of each page of search results depending on the device, and other vendors, including Moz, are beginning to do the same.

Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
Another example of when the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include nofollow on the links in the default code snippet.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
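One way to enforce this programmatically is to rewrite user-submitted HTML before rendering it. Here is a minimal sketch using the third-party beautifulsoup4 package; the spammy URL is, of course, made up.

```python
# Minimal sketch: add rel="nofollow" to every link inside user-generated
# comment HTML before rendering it, so untrusted links don't pass your
# page's reputation. Assumes the beautifulsoup4 package is installed.
from bs4 import BeautifulSoup

def nofollow_user_links(comment_html: str) -> str:
    soup = BeautifulSoup(comment_html, "html.parser")
    for link in soup.find_all("a", href=True):
        link["rel"] = "nofollow"
    return str(soup)

print(nofollow_user_links('Great post! <a href="http://spammy.example/">cheap pills</a>'))
# -> Great post! <a href="http://spammy.example/" rel="nofollow">cheap pills</a>
```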
Difficulty scores are the SEO market's answer to the patchwork state of all the data out there. All five tools we tested stood out because they offer some version of a difficulty metric: one holistic 1-100 score of how difficult it would be for your page to rank organically (without paying Google) on a particular keyword. Difficulty scores are inherently subjective, and each tool calculates them differently. In general, they incorporate PA and DA, along with other factors, including search volume on the keyword, how heavily paid search ads are influencing the results, and how strong the competition is in each spot on the current search results page.
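To make the idea concrete, here is a toy Python sketch of such a blend. The weights and the volume saturation point are invented for illustration; every vendor's actual formula is proprietary and will differ from this.

```python
# Illustrative only: a toy difficulty score blending the inputs mentioned
# above (authority of pages currently ranking, search volume, ad pressure).
# The weights are made up; real vendor formulas are proprietary.
def difficulty(avg_pa: float, avg_da: float, volume: int, ad_pressure: float) -> float:
    """Return a 1-100 difficulty estimate from 0-100 inputs (ad_pressure in 0-1)."""
    volume_factor = min(volume / 10_000, 1.0) * 100  # saturate at 10k searches/month
    score = (0.35 * avg_pa + 0.35 * avg_da
             + 0.15 * volume_factor + 0.15 * (ad_pressure * 100))
    return max(1.0, min(score, 100.0))

print(difficulty(avg_pa=62, avg_da=71, volume=8_000, ad_pressure=0.8))  # 70.55
```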
In addition to on-page SEO factors, there are off-page SEO factors. These factors include links from other websites, social media attention, and other marketing activities outside your own website. These off-page SEO factors can be rather difficult to influence. The most important of these off-page factors is the number and quality of links pointing towards your site. The more quality, relevant sites that link to your website, the higher your position in Google will be.
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
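For instance, here is a small sketch that emits schema.org Article markup as a JSON-LD script block, one common way structured data is added to a page; the article fields are placeholder values.

```python
# Sketch: emit schema.org structured data as a JSON-LD <script> block.
# The Article field values below are placeholders for illustration.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Research Keywords",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-06-01",
}

snippet = f'<script type="application/ld+json">{json.dumps(article, indent=2)}</script>'
print(snippet)  # paste into the page's <head> so search engines can parse it
```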
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and techniques of which search engines do not approve ("black hat"), such as spamdexing. The search engines attempt to minimize the effect of the latter. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]