The quality of your website content is the primary driver of your search engine rankings and user experience. Great content is still king: content created specifically for your intended users increases site traffic, which in turn improves your site's authority and relevance to user needs. To move in this direction, you need to fine-tune your website content, and often the best move is not to add more, but to let go of content that no longer gives value. This situation is real and happens frequently in search engine optimization (SEO): to optimize your website's content, it is often better to let go of underperforming or non-performing pages. So remove, prune, and take that content away, allowing the remaining pages to revitalize, show their value, and help you achieve your website's SEO goal.

What is SEO Pruning?

The terms SEO pruning, content pruning, and SEO content pruning are used interchangeably; whichever you use, they all point to the same purpose for website SEO. When Google rolled out the Panda 4.0 and Penguin updates, many sites saw the number of their pages in Google's search index shrink, and some webmasters began pruning low-quality content: thin or poor pages, bad link building, keyword-stuffed copy, and anything else that had become underperforming or non-performing. Deciding which indexed pages to prune or remove is easier said than done, and it can go wrong if mishandled.

When you carry out SEO content pruning, you can essentially define it as a selective process of removing content from your website, with the end goal of taking away unwanted elements, links, or underperforming pieces of content in order to improve overall search engine visibility and direct new, healthy traffic to your site.

Why is SEO Pruning essential for your website?

That is what SEO pruning is all about. It matters significantly because it cleans and organizes your website on two optimization fronts. First, pruning takes away content that is low quality or that threatens your ranking and visibility. Second, pruning lets you reorganize your content to capture organic traffic: similar or duplicated pieces of information can be simplified and merged into a single, more accessible, and more shareable piece of content for search engines, readers, and customers.

Quality, not quantity, matters most in search engine optimization. Get rid of non-performing content that search engines might index and that drags your entire website down, and provide unique content that entices customers, who get smarter every day about choosing your content over another website's. These are the reasons SEO pruning is one of the most effective SEO best practices. In short, SEO content pruning cuts off, takes away, or edits underperforming and non-performing pages to improve the overall SEO health and reputation of your website.

Yes, it is better to maintain a small amount of great content than to pump as many pages into Google's index as possible. Get familiar with and master the SEO pruning tips below; they will guide you in pruning the underperforming or non-performing content that can pull your entire website down once Google indexes it. Note that this is not a step-by-step guide: the list is in no particular order, you are free to add other considerations, and you can always tailor it to your website or business needs.

1. Identify pages with syndicated content.

It is best to define a content syndication strategy so you can optimize the potential of your web content by having it republished through a third-party website or syndication platform. Syndication works well when it operates symbiotically with a robust website content optimization strategy that begins with great content. When your digital content, such as blog posts, articles, infographics, and videos, is syndicated, your website builds a win-win relationship: your content gets published more widely, while the third-party site gets free, relevant content in exchange. Not all syndicated content is unique, so when your value-added content is copied across affiliated websites, take care that your own pages remain the distinctive source.

Identify the pages with syndicated content and make them available, either as summaries or as full renditions of your site's recent additions, for content licensing, reuse, or republishing arrangements. When you identify these pages, include every type of digital content, such as videos, infographics, and other multimedia, not just the usual text. Once you have correctly identified your syndicated content, monitor and moderate its syndication cycle; it can last a long time and may even spawn further loops of syndication, giving your content wider exposure, more publicity, and backlinks to your website. Make this your kick-off point for the rest of the SEO pruning tactics discussed in the following sections.

2. Temporarily Pruning Out of Stock Pages

You can temporarily prune out-of-stock web pages that may return to your website later. These pages often reflect a low-to-no-supply, high-demand situation that may change again drastically over a short period. Hence, you want a temporary measure that keeps the pages able to rank and to give value to searchers and customers later. You are not pruning this content out of your website; you are only taking it out of Google's index for a limited time. Temporarily prune such pages with <meta name="robots" content="noindex, follow">, since you do not want to remove them permanently. This robots meta tag, part of the robots exclusion protocol, tells web robots, most often search engines, what they may or may not index on your site.
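
As a minimal sketch, the tag sits in the page's head; the page URL here is a placeholder for your own out-of-stock page:

```html
<!-- On the out-of-stock page, e.g. /products/blue-widget/ (placeholder URL) -->
<head>
  <title>Blue Widget (currently out of stock)</title>
  <!-- noindex: drop this page from the index for now;
       follow: still let crawlers follow its links and pass equity -->
  <meta name="robots" content="noindex, follow">
</head>
```

Because the tag lives on the page itself, simply deleting it when stock returns lets the page be re-indexed on the next crawl.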

You can also use product offer schemas to set availability options such as InStock, InStoreOnly, OutOfStock, and SoldOut. Google can display this information as part of your organic results, so your page keeps ranking while it is out of stock. The schema.org ItemAvailability vocabulary helps you apply this content pruning tactic.
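
A minimal JSON-LD sketch of this markup (product name and price are placeholders), using schema.org's Offer and ItemAvailability values:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
```

When stock returns, switching the availability value back to "https://schema.org/InStock" is all that is needed.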

3. Remove Review Spam

Before you remove any review spam from your web pages, check whether the site's ratings and reviews make sense, then list the issues with each suspect review. Identifying this spam reduces anomalies in your page content. Check for, and filter out, spammy or pointless reviews. You can use tools such as Google Alerts, Moz's Fresh Web Explorer, reputation management software, WordPress plugins, and other related applications that detect and remove review spam and filter out spammy reviews.

Also, Google ships new review spam detection updates from time to time, so follow them closely, such as the algorithm changes rolled out in February this year. That update aimed to address review spam on Google+ Local pages by filtering potentially fake or computer-generated reviews. As part of it, Google removes any "fake glowing testimonial," or reviews that are "too good to be true." In the case of a negative review, Google will only remove it if it violates Google's guidelines, to ensure that genuine reviews are posted regardless of whether they contain negative or positive comments.

This leaves many companies wondering what they can do, in terms of SEO content optimization, to make sure their Google+ reviews remain on the page. In short, Google works to ensure that only irrelevant and spammy reviews are removed, so your other reviews will not be deleted or affected. Business owners can help by following a few simple content rules: make sure your valid reviews remain on the page, and handle those authentic reviews appropriately.

4. Disallow permanently pruned pages.

Disallow permanently pruned web pages that still exist on your website. Make sure to remove only content that is weighing your website down, meaning pages that underperform on search engine results pages. You can use a robots.txt file, which implements the robots exclusion protocol (or standard): it tells search engine crawlers and other web robots which pages to crawl on your site and, just as importantly, which pages not to crawl. Most websites can survive without a robots.txt file, since Google can usually find and index all of the important pages on a site and will often skip pages that are unimportant or duplicates of other pages, but why take chances? It is better practice to put a robots.txt file in place so web crawlers know exactly which content they can or cannot crawl. When you disallow permanently pruned pages this way, the file serves as explicit crawl instructions, "allowing" or "disallowing" any or all crawlers (user agents).
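
A minimal robots.txt sketch (the disallowed paths are placeholders for your own pruned sections). Note that a Disallow rule stops crawling; a URL that is already indexed may also need a noindex tag or a removal request before it disappears from results:

```text
# robots.txt, served from the site root at /robots.txt
User-agent: *
Disallow: /pruned-archive/
Disallow: /old-promo-page.html
```

Rules apply per user agent; the wildcard block above covers all crawlers at once.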

5. Consolidate by redirecting

When you consolidate web pages by redirecting one to another, use a suitable redirect; in most cases that means a 301 redirect, one of the most SEO-friendly options. A 301 redirect tells robot crawlers that a page has been consolidated and moved permanently to another page, and you should always use one whenever you move a page or change your domain. Such consolidation lets search engines transfer your existing index entries and Google rankings to the new page. Duplicate content, by contrast, affects many of the factors that search engines and site visitors consider; duplicates may rank for some queries and be ignored for others, which weakens all of the pages on your website. When Google and other search engines crawl your site, they pick one page to index and rank highest while the similar pages get low rankings in return, and the chosen page may not be the one you want to highlight. The best next step is to consolidate the content into the single page that performs best and use 301 redirects to point the other pages at it.

Redirects are basically good for SEO, but a bad implementation can cause trouble, losing page rankings and driving away site traffic. Note as well that consolidating pages by redirecting them to another page is an essential tactic, especially when you change your URLs, because it increases the strength and quality of the surviving pages. Be careful with status codes, though: a 401 response means "Unauthorized" and is not a redirect at all, so returning it for moved content is harmful from an SEO standpoint because search engines cannot follow it. Explore Google Webmasters for a quick guide on using redirects, and to submit an updated sitemap and/or the main page for crawling and indexing again.
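
As a sketch of the consolidation described above, assuming an Apache server and placeholder URLs, two merged posts can be 301-redirected to the surviving page in .htaccess:

```apache
# .htaccess (Apache mod_alias): permanently redirect merged posts
# to the single best-performing page (all URLs are placeholders)
Redirect 301 /old-post-a/ https://www.example.com/consolidated-guide/
Redirect 301 /old-post-b/ https://www.example.com/consolidated-guide/
```

Equivalent directives exist for other servers (for example, `return 301` in nginx); the key point is the permanent (301) status code, not the server.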

6. Identify low-quality pages that can be consolidated.

Identify low-quality pages, especially product variant pages and other kinds of thin pages, that should be consolidated where possible. Identifying them helps you not only consolidate them but also assess how to bring them in line with Google's guidelines. In Google's eyes, the main page quality criteria are the purpose of the page, the expertise and authority of the website, the amount of value-adding content on the page, and the information available about the content creator or website. In most cases you should also weigh other factors that indicate page quality, such as searcher behavior and spelling mistakes in the content, when checking compliance with Google's standards.

In the case of product variants, identify each specific detail, such as price, colors, sub-categories, and other unique identifiers, to determine the quality of each page. In other words, you do not judge a product variant page as low quality based on its general category, but on each variant's varieties, descriptions, specifications, and identifiers. Knowing that a page is low quality gives you the chance to optimize it and turn it into a high-quality page.

7. Remove Comment Spam

Searching for and removing spammy or pointless comments from your web pages can be tedious, but this kind of content pruning prevents further harm to your website's overall reputation. One proactive approach is to restrict offending sites via your browser's Internet Options: click the Restricted Sites button and, in the pop-up menu, manually type the names of the websites the comment spam came from to block them individually. You can also remove and prevent website notifications by following Google's block options available on the platform. Finally, you can install browser apps, Chrome extensions, and plugins like Akismet to quickly and reliably filter and remove comment spam.

Akismet, a WordPress plugin, checks every review, comment, and contact form submission against its global database and filters potential spam, telling you whether a blog comment or review is ham or spam. Its discard feature lets your website block spam outright, saving disk space, speeding up your site, and preventing it from publishing malicious content. It also provides a status history for an honest review of each comment, whether it was marked spam or not, and you can see the number of approved comments for each user.

8. Consolidate pagination pages

We recommend indexing all of the important paginated pages on your website. Indexing them consolidates your pagination so that users and bots can discover the unique content on each relevant paginated page and have it appear in Google. You can use Google's URL Inspection tool to see whether Google has selected a page as the canonical version. You can then consolidate pagination URLs by using rel=next/prev markup, or canonicalize them to page 1. The rel prev/next markup provides Google a hint for page discovery and site indexing; even though it may no longer be used for indexing purposes, it still gives you good odds for link discovery. Keeping rel="next" and rel="prev" link attributes in place will not harm your rankings, is useful for accessibility, and can be used by some browsers for prefetching. Consolidating pagination pages with this markup, together with your internal links, tells Google how your paginated pages relate to each other. Make sure to set up your internal pagination links correctly; this is vital for SEO, mainly because it helps Google and other search engines such as Bing better understand page relations and discover pages on your website. Organize your web page content into discrete pages that are marked up sequentially.
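
A minimal sketch of the rel prev/next markup, placed in the head of a middle page of a paginated series (the category URL is a placeholder):

```html
<!-- On page 2 of a paginated series -->
<link rel="prev" href="https://www.example.com/category/page/1/">
<link rel="next" href="https://www.example.com/category/page/3/">
```

The first page carries only rel="next" and the last page only rel="prev", so crawlers can walk the series in both directions.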

9. Remove or update pages that get no traffic

List the pages that are invisible to searches or that produce no organic traffic to your website. You can learn this quickly by using Google Analytics to identify pages that are getting zero traffic. Aim to make at least three updates every week; if you have the resources to execute the task more frequently, you can remove or update zero-traffic pages daily. For business websites especially, a fresh, updated showcase of your products and services opens a gateway of inbound traffic. Make sure URLs with absolutely no performance are correctly identified, then either improved or pruned. Handle this tactic carefully: deleting or redirecting expired or zero-traffic pages affects your website's SEO, and if mishandled it can clutter and bloat the set of pages Google indexes and even frustrate users. Done well, it lets you catalog and filter out the pages Google Analytics shows no activity for, giving you the fastest way to get rid of no-traffic pages while improving the overall ranking of your website and boosting your organic search traffic.
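
As a sketch of the cataloging step, assuming a CSV export from your analytics tool with "Page" and "Sessions" columns (these column names are placeholders; your export's headers may differ), a few lines of Python can pull out the zero-traffic URLs:

```python
import csv

def find_zero_traffic_pages(csv_path, url_col="Page", sessions_col="Sessions"):
    """Return the URLs whose session count is zero in an analytics CSV export."""
    zero_traffic = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Analytics exports often format numbers with thousands separators,
            # e.g. "1,204", so strip commas before parsing.
            sessions = int(row[sessions_col].replace(",", "") or 0)
            if sessions == 0:
                zero_traffic.append(row[url_col])
    return zero_traffic
```

The resulting list becomes your review queue: each URL is a candidate to update, redirect, or remove, not an automatic deletion.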

Robust content pruning is needed for pages that are published and deleted frequently, such as seasonal products, job listings, classified ads, events, and other content that expires quickly. Pruning cleans up and organizes the site and typically involves 301 redirects when a page's content is resurrected or updated elsewhere. For pages with no traffic that need outright removal, you can use Webmaster Tools: on the dashboard home page, click "Site configuration" in the left-hand menu, click "Crawler Access," then select "Remove URL." Click "New Removal Request" and type the full URL of the page you want removed from search results. Removing pages that get no traffic, or that contribute little to no SEO value, is safe. You may want to return a 404 "not found" or 410 "gone" code so that Google recognizes the removal and eventually drops the page from its index; according to guidance published in Google Support, de-indexing can take as long as 90 days.

10. Site Search your URLs

Performing a "site:" search on your URL can be the most comfortable way of content pruning, particularly for identifying duplicate content issues on your website. Simply run a Google search for a keyword you rank for and observe the results. If Google shows a non-user-friendly URL for your content, you likely have duplicate content on your website. URL parameters are one source of duplicate content issues, not only through the parameters themselves but also through the order in which they appear in the URL, which means the simpler the URL, the better; keep it short and straightforward. You can also address duplicate content with a URL inspection and these four steps: avoid creating duplicate content in the first place, redirect duplicate content to the canonical URL, add a canonical link element to the duplicate page, and add an HTML link from the duplicate page to the canonical page.
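
A minimal sketch of the canonical link element from the steps above (the product URL is a placeholder); it goes in the head of the duplicate, parameterized page:

```html
<!-- On the duplicate URL, e.g. /product/blue-widget/?color=blue&ref=home -->
<head>
  <link rel="canonical" href="https://www.example.com/product/blue-widget/">
</head>
```

This tells search engines which version of the page should receive the ranking signals, without removing the duplicate URL for users.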

To do a URL inspection, run Google's URL Inspection tool; it analyzes the URL based on its last crawl and shows the URL's current index status. The tool tells you whether the URL is on Google and searchable via search results, and whether the mobile version through AMP is valid. First, open the URL Inspection tool by clicking URL inspection in the navigation panel of the Search Console. Second, type the complete URL you want to inspect and read how to understand the inspection results. Third, and this is optional, run an indexability test on the live URL.

11. Identify pages that should be removed

Make sure to indicate which web pages need to be removed and identify exactly what content should go. Chances are good that if your website meets any of the criteria mentioned in the previous sections of these SEO pruning tips, it has low-quality, underperforming, thin, or shallow content. Since these pages are bad for your website's search engine optimization, identify them and mark them for removal. Pages with duplicate, automatically generated, affiliate, or scraped content, and doorway pages, are among those that should be listed for removal. You can use the Google Removals tool to temporarily block pages from Google search results. Remember that deleting a page (or post) from your website also removes one or more URLs; when visited, the old URL will usually return a "404 not found" error, which is not the best experience for Google or your users. So, once you have identified pages for removal, add them to your redirect list; if you want the content gone from your site entirely, returning a 410 header is the better idea. A spreadsheet is the best way to list all of the pages you have identified so you can monitor them appropriately. Listing these pages first gives you two content pruning options: redirect them, passing each page's equity to your new page, or remove them from the website altogether.

12. Get Internal Links From GSC

You need to obtain your website's internal links from Google Search Console (GSC). As an example of finding links, you can perform a Google search using the "link:" operator: [link:www.google.com] lists pages that link to Google's home page (make sure there is no space between "link:" and the URL). To get internal links from GSC, perform the following steps: on the GSC home page, click the site you want, then click Sitelinks under the Search Appearance tab. Then, in the "For this search result" box, complete the URL for which you don't want a specific sitelink URL to appear.

You can also use online internal link analyzer and checker tools, available via Google, Chrome extensions, apps, and plugins, to check the link structure and total number of links on a specific page of your website. There are other convenient ways to check the hidden pages of your website as well. Remember that every piece of content you create or delete has a corresponding URL or links, so you can use free online link checker tools to find broken links on your site; broken links can be hidden from browsers and search engines.

13. Prune retired pages.

Remove pages with outdated information, and those that are not getting, and never will get, traffic, after conducting a site audit. Keep only the content that contributes to your website's overall quality, and permanently prune out-of-stock pages that give absolutely no return, either in traffic or SEO value. Pruning retired pages is good for your site's SEO, as long as the page does not fulfill a specific function that requires keeping it on the website. Such retired pages have to be pruned to prevent them from becoming dead weight that drags your site down in the rankings. So check thoroughly which content elements should not be kept on your website. These include automatically playing sound or music, and splash pages, meaning welcome or entry pages that load before the actual home page does.

Content used for a short period (such as sales promotions) and pages that expire quickly should be retired as well, along with pop-ups, background images, oversized images, animated banner ads, and the like. This can be another tedious piece of content pruning work, since you must permanently prune the retired pages and, at the same time, prune their links. Avoiding this kind of content in the first place, while you are still creating your website, migrating it to a new domain, or doing SEO optimization such as this tactic, lessens the number of retired pages you will need to prune later. Link pruning also helps permanently remove bad links to your website; the latest Google Penguin update affects many websites, and link pruning is one method for analyzing and removing the links or URLs of retired pages.

14. Consolidate Product Variant Pages

You can combine product variant pages by collapsing several of them into a single product URL, a consolidated page that allows flexible product variant selection. You consolidate similar pages into one and can also resurrect them in the page you select as the original, newly updated page you want to keep. In Shopify, for instance, you can consolidate a maximum of five products, with up to 100 variants per consolidated product, and make it available for product variant mapping (available only on Oberlo Basic and Pro subscriptions); if the unified product has no variants, it is not available for variant mapping. This method of combining product variant pages is useful when multiple suppliers or manufacturers offer similar products with distinct variants: the variants can easily be consolidated into one product, giving all your customers one consolidated variant page per product. You can also consolidate a single variant when a supplier runs out of it by merging it with the same product from a different supplier who offers that variant; this gives you the flexibility to swap an out-of-stock variant for a new one without deleting the whole product variant page.

You can also consolidate similar pages by using a 301 redirect to point them at a single URL or page version and letting Google crawl and index that page. The surviving page should be the one that drives the highest SEO value or user engagement, shows the same content, or fulfills a similar role or functionality from a business and user experience perspective. Note that you should replace or remove all internal links pointing at the URLs you are retiring, and take those links out of the XML sitemap as well. Once you have done this, double-check that the content held by the new or resurrected page, the one the other URLs redirect to, really is the content of the final destination page. Finally, make sure that the consolidated pages contain no content that is too poor or thin, or not updated, relevant, or formatted well enough to satisfy the user's needs.

15. Get Traffic Data for Internal Pages

You can get traffic data for internal pages through a few online tools that make it easy to determine and measure internal page traffic; the most common are Google Analytics and Google Search Console (GSC). Tracking traffic to your website's internal pages is an SEO best practice that checks user flow. To do this in Google Analytics, open your account, go to Audience > Users Flow, and select Landing Page in the dimension settings to view the paths users take through your website. In Search Console, impressions are the number of times a URL from your site appeared in search results as users viewed them, which reflects the performance of your internal pages; these impressions do not include paid Google Ads results. Search Console also allows easy monitoring and resolution of server errors, site load issues, and security issues like hacking and malware, all of which can harm your internal page traffic.

So, getting data on your internal pages helps ensure smooth site maintenance, SEO optimization, strong user search performance, and healthy internal site traffic. It will also guide Google toward the pages you want to rank and index faster, and help you keep them up to date. Bring up the traffic data of your internal pages to increase user impressions.

16. 404 and 410 deleted pages

The 404 and 410 codes are useful when you permanently prune web pages: delete the page and return the appropriate response to users and search engines. Whenever you remove a page (or content) from your website, also de-link all of its URLs. Until search engines drop it, the old URL will return a "404 not found" error, which is not the best impression for Google or users. You could redirect the deleted page to another page; if you want the content totally gone from your site, a 410 code is the better idea. If you have no alternative page with that information, decide whether it is better to remove, keep, or improve the page. Once you are entirely sure you want it removed, send the proper HTTP header: a 410 "content deleted" response.

In Google's terms, a 410 response from the server means "gone." It is similar to a "404 Not Found" code, but it is used for resources that used to exist and deliberately no longer do. The difference between a 404 and a 410 header is simple: 404 means "content not found," while 410 means "content deleted" and is therefore more specific. If a URL returns a 410, Google knows for sure you removed the URL on purpose and should therefore remove that URL from its index promptly. Used with a temporary custom 410 page, the response gives search engine robots an accurate status and the knowledge that the old link should be dropped from their crawl index, preventing unnecessary traffic. You can also try available plugins such as 410 for WordPress for this purpose.
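
As a sketch for Apache servers (the paths are placeholders), mod_alias can return the 410 directly:

```apache
# .htaccess (Apache mod_alias): tell crawlers these URLs are gone for good
Redirect gone /retired-page/
Redirect gone /expired-promo/
```

The `gone` keyword makes the server answer those requests with HTTP 410 instead of 404, which is the deliberate-removal signal described above.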

17. Identify pages that can be improved.

Be certain to determine which URLs could be improved instead of pruned, to give search engines a better way to understand the information on your webpage and what it is all about. After a content audit based on your SEO goals for your website, determine which of your URLs can be improved rather than pruned, then map those URLs so you can properly separate the pages worth improving from the pages that need pruning, whether by redirection or deletion.

Check as well those URLs that link to scraped or copied content, since there may still be room to improve them in terms of design and quality. Any designer will tell you that a clean design helps make an impression because of the impact minimalist design makes: minimalist pages make the essential content (text, images, etc.) easier to identify, and effective use of white space makes a site look more engaging. There are web design tutorials you can benchmark against to identify pages with good design, quality, and other dynamics, and to improve them significantly for a better user experience. Once you have identified (listed or cataloged) the pages that can still be improved, it becomes much easier for search engine robots to recognize your pages' rich, high-value content and rank them well. Using schema for SEO is also handy for this purpose: when your pages use schema markup, it helps search engines return more informative results to users. Consider rich snippets, which help search engine result page (SERP) crawlers better understand the website's information and recognize the basic identity of the page.
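
As a minimal illustration of the schema markup mentioned above (headline, author, and date are placeholders), a JSON-LD Article block on an improved page might look like:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline for an improved page",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-05-01"
}
```

The block goes in a script tag of type "application/ld+json" in the page's head, where rich-snippet crawlers can read it without affecting what users see.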

18. Identify underperforming pages.

Identify underperforming pages by checking whether they have no inbound links, no shares, no traffic, no sales, no index coverage, poor rankings, and a low click-through rate (CTR). Pages that can still be improved are candidates for content optimization instead. Gathering traffic data and running a site audit are both excellent ways to find weak pages so you can prune them effectively. There are many content-pruning tactics you can adopt; start with the simplest: check each landing page for inbound links, shares, traffic, sales, index status, SERP ranking, CTR, calls to action (at the top or bottom of the page), outdated content, and broken links, among other signals. Include content such as product listing pages, product video tutorials, and online demonstration media, and make sure the results match the goals you set for that content. You can also use Google Webmaster Tools (now Search Console) to spot weak pages, Google Analytics to find poor performers, and server error logs to surface issues affecting your landing pages.

Use Google Search Console to view all of the pages on your website. It shows performance analysis (including graphs) with details about queries, pages, countries, devices, and search appearance. When you open the Pages report, you get a list of your site's pages along with each page's clicks and impressions, sorted from most clicks to least. With this information you can validate the status and performance of each page, which makes underperformers easy to spot: a page that shows up in search results but rarely gets clicked is a likely candidate. Google evaluates each page's CTR as part of its overall assessment of your site's quality. When you improve a poorly performing page, you raise its CTR, signaling to Google that your site is doing better and giving searchers the content they want. Explore SEO tools such as Google's own tools, Ahrefs, Moz, SEMrush, and others to identify underperforming pages and their backlinks (inbound links). Once you find and identify underperforming pages, fixing them can give your website a boost in organic search results.
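The "high impressions, low CTR" check described above can be automated against a Search Console export. The sketch below uses a made-up CSV in the shape of a typical Pages export (the URLs, thresholds, and column names are assumptions, not real data):

```python
import csv
import io

# Hypothetical Search Console "Pages" export; real exports use similar columns.
EXPORT = """Page,Clicks,Impressions,CTR,Position
/blog/seo-pruning,120,2400,5.0%,8.2
/blog/old-guide,3,5100,0.06%,9.1
/products,450,6000,7.5%,3.4
"""

def underperformers(csv_text, min_impressions=1000, max_ctr=1.0):
    """Flag pages that are shown often (impressions) but rarely clicked (CTR %)."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        impressions = int(row["Impressions"])
        ctr = float(row["CTR"].rstrip("%"))  # "0.06%" -> 0.06
        if impressions >= min_impressions and ctr <= max_ctr:
            flagged.append(row["Page"])
    return flagged

print(underperformers(EXPORT))  # -> ['/blog/old-guide']
```

Only `/blog/old-guide` is flagged: it is seen over 5,000 times but almost never clicked, exactly the profile of a page to improve or prune.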

19. Prune search result pages.

You can do a simple removal of search result pages from your own browsing by clearing your history in Chrome: go to the top-right corner, click "More," then "History," and on the left select "Clear browsing data." In the box that appears, choose from the drop-down menu how much history you want to delete; to clear all browsing history, select "the beginning of time," then click "Clear browsing data." For more thorough content pruning, look at the search result pages on your site that no longer get any traffic. Left unoptimized and unpruned, these results become obsolete, low-quality content; over time they pile up into a mass of pages that drags your site down in the search engine rankings.

You are likely to find many search result pages with little to no search traffic, backlinks, or social shares, pages that take forever to load, and pages that provide no value to the user. Consider removing these search result pages. Just note that when you prune a search result page, you have to update the internal links that pointed to its URL, and 301-redirect the URL to the most relevant page you have chosen for Google to index.
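Updating internal links after pruning can be scripted. The sketch below, using only Python's standard library, rewrites `href` attributes that point at pruned URLs so visitors and crawlers skip the extra 301 hop; the URL mapping is a hypothetical example:

```python
import re

# Hypothetical mapping of pruned URLs to their 301-redirect targets.
REDIRECTS = {
    "/search?q=widgets": "/products/widgets",
    "/search?q=old-topic": "/blog/seo-pruning",
}

def update_internal_links(html: str, redirects: dict) -> str:
    """Rewrite href attributes that point at pruned URLs to their new targets."""
    def swap(match):
        url = match.group(1)
        # Unmapped URLs are left untouched.
        return 'href="{}"'.format(redirects.get(url, url))
    return re.sub(r'href="([^"]+)"', swap, html)

page = '<a href="/search?q=widgets">Widgets</a> <a href="/about">About</a>'
print(update_internal_links(page, REDIRECTS))
# -> <a href="/products/widgets">Widgets</a> <a href="/about">About</a>
```

The server-side 301 itself is configured separately (for example in your web server or CMS); this script only cleans up the links inside your own pages.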