The quality of your website content is the primary driver of your search engine rankings and user experience. Great content is still king: content created for your intended users attracts traffic, which builds your site's authority and relevance to user needs. But fine-tuning your content is not always about adding more. Often the better move is to let go of pages that no longer deliver value. This situation is real and happens frequently in search engine optimization (SEO): to optimize your website as a whole, it is often better to remove underperforming or non-performing pages. Prune that content away so the remaining pages can revitalize, show their value, and carry your website toward its SEO goals.
What is SEO Pruning?
The terms SEO pruning, content pruning, and SEO content pruning are used interchangeably; in context they all point to the same purpose for website SEO. When Google rolled out the Panda 4.0 and Penguin updates, the updates pushed webmasters to reduce the number of low-quality pages in Google's search index. Many began pruning content that no longer added value to the website — thin or poor content, bad link building, and keyword stuffing — that had become underperforming or non-performing. Deciding which indexed pages to prune or remove is easier said than done, and it can go wrong if mishandled.
In practice, SEO content pruning is a selective process of removing underperforming content, links, or page elements from your website, with the goal of improving its overall search engine visibility and directing new, healthy traffic to it.
Why is SEO Pruning essential for your website?
SEO pruning matters on two optimization fronts. First, it removes content that is low-quality or that threatens your site's ranking and visibility. Second, it lets you reorganize content to recover organic traffic: similar or duplicated pieces of information can be merged into a single, more accessible, more shareable page for search engines, readers, and customers.
In search engine optimization, quality matters more than quantity. Get rid of non-performing content that search engines can index and that drags your entire website down, and provide unique content for customers who get smarter every day about choosing your content over another site's. These are the reasons SEO pruning is one of the most effective SEO best practices: cutting off, taking away, or editing underperforming and non-performing pages improves the overall SEO health and reputation of your website.
Yes, it is better to maintain a small amount of great content than to pump as many pages into Google's index as possible. The tips below will guide you in pruning underperforming or non-performing content that could pull your entire website down once Google indexes it. Note that this is not a step-by-step guide: the list is in no particular order, you are free to add other considerations, and you can always tailor it to your website or business needs.
1. Identify pages with syndicated content.
Define your content syndication strategy so you can maximize the reach of your web content by having it republished on third-party websites or syndication platforms. Syndication works best when it operates alongside a robust content-optimization strategy that begins with great content. When your digital content — blog posts, articles, infographics, videos, and other relevant material — is syndicated, your website builds a win-win relationship: your content gains publication and exposure, while the third-party site gets free, relevant content in exchange. Keep in mind that syndicated content is, by definition, not unique — it is the same content copied across websites — so it must be managed carefully so that the copies reinforce, rather than compete with, your original pages.
Identify the pages whose content is syndicated, whether as summaries or full renditions of your recent content, under licensing, reuse, or republishing arrangements. Include every type of digital content — videos, infographics, and other multimedia, not just text. Once you have correctly identified your syndicated page content, monitor and moderate its syndication cycle: it can run for a long time and may spawn further loops of exposure, publicity, wider reach, and backlinks to your website. Make this the kick-off point for the rest of the pruning tactics discussed in the sections below.
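One common way to keep syndicated copies from competing with your originals — assuming the republishing partner is willing to add it, which not all are — is a cross-domain canonical tag on the partner's copy that points back to your page. A minimal sketch, with a placeholder URL:

```html
<!-- Placed in the <head> of the third-party site's copy of the article.
     The href is a placeholder; it should point at your original page. -->
<link rel="canonical" href="https://www.example.com/blog/original-article/">
```

If the partner cannot add a canonical tag, asking for a `noindex` meta tag or at least a plain link back to the original are common fallbacks.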
2. Temporarily Pruning Out of Stock Pages
You can temporarily prune out-of-stock pages that may return to your website later. These pages often reflect a low-to-no-supply, high-demand situation that can change again within a short, definite period. So you want a temporary measure that keeps the pages able to rank and deliver value to searchers and customers once they return. You are not pruning this content out of your website — you are only taking it out of Google's index for a limited time. Do this with `<meta name="robots" content="noindex, follow">` rather than a permanent removal. This robots meta tag, part of the robots exclusion protocol, tells web robots — most often search engines — what to index or skip on your site.
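As a minimal sketch (the product and title are placeholders), the tag belongs in the page's `<head>`:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Blue Widget – Out of Stock</title>
  <!-- noindex: drop this page from the search index for now.
       follow: still crawl the page and follow its links,
       so internal link value keeps flowing. -->
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <!-- page content -->
</body>
</html>
```

Remove the tag once the product is back in stock so the page can be re-indexed.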
You can also use product offer schemas to mark availability with values such as InStock, InStoreOnly, OutOfStock, and SoldOut. Google can display this information as part of your organic results while the page continues to rank. The schema.org ItemAvailability vocabulary supports this content-pruning tactic.
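A minimal JSON-LD sketch (the product name and price are placeholder values) marking an offer as out of stock with a schema.org ItemAvailability value:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
```

Embed this in the page inside a `<script type="application/ld+json">` tag, and switch `availability` back to `https://schema.org/InStock` when the product returns.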
3. Remove Review Spam
Before you remove any review spam from your pages, check whether the site's ratings and reviews make sense, then list the issues with each suspect review. Identifying the spam reduces anomalies in your page content. Check for, and filter out, spammy or pointless reviews. Tools such as Google Alerts, Moz Fresh Web Explorer, reputation-management software, WordPress plugins, and related applications can detect review spam, remove it, and filter out spammy reviews.
Google also updates its review-spam detection from time to time, so follow those updates for timely use — such as the algorithm changes rolled out in February this year, which aimed to filter potentially fake or computer-generated review spam on Google+ Local pages. Under that update, Google removes any "fake glowing testimonial" — reviews that are "too good to be true." A negative review, on the other hand, is removed only if it violates Google's guidelines, which are designed to preserve genuine reviews regardless of whether their comments are negative or positive.
This leaves many companies wondering what they can do, in terms of SEO content optimization, to make sure their Google+ reviews stay on the page. In short, Google works to remove only irrelevant and spammy reviews, so your other reviews should not be deleted or affected. Business owners can help by following a few simple content rules: make sure your valid reviews remain on the page, and handle those authentic reviews appropriately.
4. Disallow permanently pruned pages.
Disallow permanently pruned pages that still exist on your website, and make sure to remove only content that is weighing the site down — pages that underperform on search engine result pages. Use a robots.txt file, an implementation of the "robots exclusion protocol," which tells search engine crawlers which pages to crawl on your site — and, just as importantly, which pages not to crawl. Many websites can get by without a robots.txt file, since Google can usually find and index the important pages on its own and skip unimportant or duplicate versions — but why take chances? Putting a robots.txt file in place gives crawlers explicit instructions, "allowing" or "disallowing" access for all crawlers or for specific user agents.
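A minimal robots.txt sketch — the pruned paths are placeholders for your own permanently retired sections:

```
# Served at https://www.example.com/robots.txt
# Block all crawlers from permanently pruned sections.

User-agent: *
Disallow: /old-catalog/
Disallow: /2016-promo/

# Everything else remains crawlable by default.
```

One caveat: `Disallow` stops crawling, not indexing. URLs that are already indexed may need a `noindex` tag or a removal request first, since a blocked crawler can no longer see those directives on the page.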
5. Consolidate by redirecting
When you consolidate pages by redirecting one to another, use a suitable redirect — in most cases a 301, one of the most SEO-friendly redirects. A 301 tells crawlers that a page has been consolidated and moved permanently to another page. Always use a 301 whenever you move or change your domain; the consolidation transfers your existing indexing and Google rankings to the new page. Duplicate content significantly affects many of the factors that search engines and site visitors weigh, and can weaken every page on your website: when Google and other engines crawl the duplicates, they pick one page to index and give it the highest rankings, while the similar pages get low rankings in return — and the winner may not be the page you want to highlight. The best next step is to consolidate the content into the single page that performs best and 301-redirect the others to that high-ranking page.
Redirects are generally good for SEO, but a bad implementation can cost you page rankings and drive away site traffic. Consolidating pages by redirecting them is an essential tactic, especially when you change your URLs, because it preserves and concentrates the strength and quality of the pages. Note that a 401 "Unauthorized" response is not a redirect at all; serving it for moved content is harmful from an SEO standpoint, since search engines cannot follow it to the new page. Explore Google's webmaster documentation for a quick guide to redirects, and resubmit an updated sitemap and/or the main page for crawling and indexing.
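On an Apache server, a 301 consolidation can be as simple as two `mod_alias` rules in `.htaccess` — the paths below are placeholders for your own consolidated posts:

```apache
# .htaccess sketch (requires mod_alias; all paths are placeholders).
# Permanently redirect two merged posts to the best-performing page.
Redirect 301 /blog/old-guide-part-1/ /blog/complete-guide/
Redirect 301 /blog/old-guide-part-2/ /blog/complete-guide/
```

Nginx and most CMS redirect plugins offer the same behavior; the key design choice is the `301` status, which tells crawlers the move is permanent so ranking signals consolidate on the target.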
6. Identify low-quality pages that can be consolidated.
Identify low-quality pages — especially product-variant pages and other kinds of thin pages — that should be consolidated where possible. Identifying them helps you not only consolidate but also assess how to bring them in line with Google's guidelines. The primary quality factors to identify are the purpose of the page; the expertise and authority of the website; the amount of value-adding content on the page; and the information available about the content creator or website — the core page-quality ranking criteria in Google's eyes. In most cases you should also weigh other signals of page quality, such as searcher behavior and content spelling mistakes.
For product variants, identify each specific detail — price, colors, sub-categories, and other unique identifiers — to determine the quality factors of each page. You judge a product-variant page not by its general category but by each variant's own varieties, descriptions, specifications, and identifiers. Knowing which of your pages are low-quality gives you the chance to optimize them into high-quality pages.
7. Remove Comment Spam
Searching for and removing spammy or pointless comments from your pages can be tedious, but this content pruning prevents further harm to your website's overall reputation. One proactive approach is to block the sites the spam comes from — for example, via your browser's restricted-sites settings, where you can manually add each offending site, or via Google's built-in blocking options, which also let you suppress unwanted site notifications. You can also install browser apps, Chrome extensions, and plugins such as Akismet to filter and remove comment spam quickly and reliably.
Akismet, a WordPress plugin, checks every review, comment, and contact-form submission against its global database and tells you whether each one is ham or spam. Its discard feature lets your site block the worst spam outright, saving disk space, speeding up your website, and preventing malicious content from being published. It also keeps a status history for each comment — spammed or unspammed — for honest review, and shows the number of approved comments for each user.
8. Consolidate pagination pages
We recommend that you index all of the important paginated pages on your website. Indexing them helps users and bots discover the unique content on each relevant paginated page and gets it into Google. Use Google's URL Inspection Tool to see whether Google has selected a page as the canonical version. You can then consolidate pagination URLs using rel="next"/rel="prev" markup, or canonicalize them to page 1. The rel next/prev markup provides Google a hint for page discovery: even though Google no longer uses it for indexing purposes, it still helps with link discovery, does no harm to your rankings, is useful for accessibility, and can be used by some browsers for prefetching. Make sure your internal pagination links are set up correctly — this is vital for SEO, mainly because it helps Google and other search engines such as Bing understand page relations and discover pages on your site. Consolidate pagination pages properly by organizing your content into discrete pages that are marked up sequentially.
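A minimal sketch of the sequential markup, as it would appear in the `<head>` of page 2 of a paginated category (the URLs are placeholders):

```html
<!-- Page 2 of a paginated series: point to its neighbors. -->
<head>
  <link rel="prev" href="https://www.example.com/category/page/1/">
  <link rel="next" href="https://www.example.com/category/page/3/">
</head>
```

Page 1 carries only a `rel="next"` link, and the final page only a `rel="prev"` link, so the chain has a clear start and end.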
9. Remove or update pages that get no traffic
List the pages that are invisible to searches or that produce no organic traffic to your website. You can find these quickly with Google Analytics by identifying pages that get zero traffic. Aim to make at least three updates every week — or, if you have the resources to run the task more frequently, review and remove or update zero-traffic pages daily. For business websites especially, a fresh, up-to-date showcase of your products and services opens a gateway for inbound traffic. Make sure URLs with no performance at all are correctly identified before you improve or remove them. Handle this tactic with care: deleting or redirecting expired or zero-traffic pages affects your site's SEO, and if mishandled it can clutter and bloat the set of pages Google indexes and frustrate users. Done right, it lets you catalog and filter out the pages Google Analytics reports no traffic for, and is the fastest way to get rid of dead pages while improving your site's overall ranking and boosting organic search traffic.
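One low-effort way to build that list is to export the "All Pages" report from Google Analytics as CSV and filter it. A small sketch, assuming the export has `Page` and `Sessions` columns (column names vary by report, so match them to your actual export):

```python
import csv
import io

def zero_traffic_pages(report_csv, sessions_column="Sessions"):
    """Return page paths whose session count is zero in an exported
    Google Analytics 'All Pages' CSV report."""
    reader = csv.DictReader(io.StringIO(report_csv))
    pages = []
    for row in reader:
        # Some exports format numbers with thousands separators
        # (e.g. "1,234"), so strip commas before parsing.
        sessions = int(row[sessions_column].replace(",", "") or 0)
        if sessions == 0:
            pages.append(row["Page"])
    return pages

# Tiny inline report for illustration; paths are placeholders.
report = """Page,Sessions
/blog/evergreen-guide,1240
/blog/old-announcement,0
/products/discontinued,0
"""
print(zero_traffic_pages(report))  # ['/blog/old-announcement', '/products/discontinued']
```

The resulting list is your pruning candidate set — each URL still needs a manual look before you update, redirect, or remove it.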
Robust content pruning is needed for pages that are featured and deleted frequently — seasonal products, job listings, classified ads, events, and other content that expires quickly. Pruning cleans up and organizes the site, and typically involves 301 redirects when a page's content is resurrected or updated elsewhere. For pages with no traffic that need outright removal, you can use Webmaster Tools: on the dashboard homepage, click "Site configuration" in the left-hand menu, click "Crawler Access," select "Remove URL," then click "New Removal Request" and type the full URL of the page you want removed from search results. Removing pages that get no traffic, or that add little to no SEO value, is safe. Serve a 404 "Not Found" or 410 "Gone" status code so that Google recognizes the removal and eventually drops the page from its index; according to Google's published support guidance, de-indexing can take as long as 90 days.
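For expired listings you never intend to bring back, a 410 can be served with the same Apache `mod_alias` directive used for redirects — a sketch with placeholder paths:

```apache
# .htaccess sketch (requires mod_alias; all paths are placeholders).
# Return 410 Gone for permanently removed listings, which signals
# a deliberate removal more strongly than a plain 404.
Redirect gone /jobs/2019-summer-internship/
Redirect gone /events/expired-webinar/
```

Reserve 410 for pages that are gone for good; use the temporary noindex approach from tip 2 for anything that might return.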
10. Site Search your URLs
Performing a "site:" search on your URL can be the easiest way to start content pruning, particularly for identifying duplicate content issues on your website. Simply run a Google search for a keyword you rank for, restricted to your site, and observe the search engine results. From these results, you can detect pages from your own site competing for the same keyword — a telltale sign of duplicate content worth consolidating.
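For example (the domain and keyword are placeholders), a query like this lists every indexed page on your site that matches the phrase:

```
site:example.com "seo pruning"
```

If several of your own URLs appear for the same phrase, those pages are splitting the keyword's ranking signals among themselves and are candidates for the consolidate-and-301 treatment described in tip 5.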