In the last article we covered beginner-level pagination tactics, so let’s see how things look at an advanced level. Once you have mastered the basics, it’s not hard to take an in-depth look at the whole process. Access to the server logs will show you exactly how Googlebot covers the content of your pagination pages before you start implementing the tactics mentioned earlier. Before applying any of them, we recommend choosing a series of pagination pages from the website and checking how many of those pages Googlebot actually crawls.

After that, start making queries in Google to identify how many of these pages have been indexed. This lets you measure the success of your implementation: come back after the changes have been made and see whether the crawl and indexation rates have improved.
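For example, assuming the pagination URLs follow a pattern like site.com/category/page2.html (a hypothetical pattern of our own), queries such as the following give a rough picture of what is in the index:

site:www.site.com inurl:page
site:www.site.com/category

The result counts are only estimates, but recording them before and after the changes gives you a simple baseline for measuring the indexation rate.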

Ajax and JavaScript scroll settings

We’ve all probably come across infinite scroll, mostly on e-commerce websites. This feature is chosen to enhance the user experience, and both Ajax and JavaScript should be implemented following the principle of Progressive Enhancement.

This way you ensure that the website works perfectly for visitors who don’t have JavaScript enabled, and you can still implement the pagination tactics mentioned above. The clear advantage is that Googlebot can access, crawl and index the content without any problem, while the website still offers advanced JavaScript navigation to its users. Keep the loading time of the page in mind here.
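As a minimal sketch of Progressive Enhancement (our own example markup; loadNextPage is a hypothetical helper), the page ships a plain, crawlable “next page” link, and JavaScript upgrades it into infinite scroll only when it is available:

<!-- Works without JavaScript and can be crawled by Googlebot -->
<a id="next-page" href="http://www.site.com/page2.html">Next page</a>

<script>
// With JavaScript enabled, intercept the click and append the next
// page's products via Ajax instead of performing a full navigation.
document.getElementById('next-page').addEventListener('click', function (e) {
  e.preventDefault();
  loadNextPage(this.href); // hypothetical helper that fetches and appends the products
});
</script>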

View-all page vs rel=”prev”/”next”

View-all page or rel=”prev”/”next”? Which of the two methods is preferable? Google states that it prefers the View-all page method for resolving pagination problems, but there are instances in which rel=”prev”/”next” is more appropriate in terms of relevancy signals.

Google states that, in terms of link authority, the two methods are similar, which leads to the following question: what about the other relevancy signals, such as unique URLs, titles and descriptions? When canonicalization is applied with the View-all method, Google knows how to identify the elements on the canonicalized pages and consolidates the link authority, including that of the first page, onto the View-all page.
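As a sketch (the URLs are our own examples), every component page in the series would carry a canonical tag pointing to the View-all version, so on http://www.site.com/page2.html you would have:

<link rel="canonical" href="http://www.site.com/view-all.html">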

On the other hand, when pages are linked together with the rel=”prev”/”next” method, they keep unique titles and URLs, so any of these pages can rank for relevant queries, and every page keeps its own relevancy signals, as opposed to the case in which all of them are canonicalized to the first page.

Even though we don’t know exactly how Google handles rel=”prev”/”next” in the index, we can be sure that when pagination pages (other than the first page) are returned in the SERP, the URL, title and other factors play a defining role in establishing their relevancy for any query.

Parameters and rel=”prev”/”next”

Sometimes, when you implement rel=”prev”/”next”, the pagination page URLs will contain certain parameters, such as a unique session ID, that do not change the content of the page. This can lead to duplicate content issues.

To avoid this, you can simply tell Googlebot not to crawl certain URLs using the “URL Parameters” feature in Google Webmaster Tools. But we can still keep the link authority of these URLs by using rel=”prev”/”next” together with canonical tags.

First, make sure that all of the pages in the rel=”prev”/”next” sequence use the same parameter. Second, every URL that contains a parameter must be canonicalized to its parameter-free version.
For example: we have three pagination pages accessed through the same SessionID:

site.com/page1.html?sessionID=10
site.com/page2.html?sessionID=10
site.com/page3.html?sessionID=10

The pages are linked together with rel=”prev”/”next” as in the example below: page1.html will point forward to page2.html, page2.html will point both back to page1.html and forward to page3.html, and page3.html will point back to page2.html.
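For example, the middle page, site.com/page2.html?sessionID=10, would carry both tags, with the session ID kept in the hrefs:

<link rel="prev" href="http://www.site.com/page1.html?sessionID=10">
<link rel="next" href="http://www.site.com/page3.html?sessionID=10">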
The pages that contain the session ID will have a canonical tag pointing to the versions without the session ID:

site.com/page1.html?sessionID=10 -> canonical to site.com/page1.html
site.com/page2.html?sessionID=10 -> canonical to site.com/page2.html
site.com/page3.html?sessionID=10 -> canonical to site.com/page3.html
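In HTML, the canonical tag on the first parameterized page would look like this:

<link rel="canonical" href="http://www.site.com/page1.html">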

Filtered content and rel=”prev”/”next”

Here is an example of a situation in which parameters filter the content within a pagination page sequence. In the following example, a parameter filters the product pages by brand:

Page 1: http://www.site.com/page1.html?brand=nike
The content of every page depends on this variable, so:
Page 1: http://www.site.com/page1.html?brand=adidas
Page 2: http://www.site.com/page2.html?brand=adidas
will retrieve a set of products completely different from:
Page 1: http://www.site.com/page1.html?brand=reebok
Page 2: http://www.site.com/page2.html?brand=reebok

If you are sure that these filtered pages bring extra value to the user and you want them present in the Google index, we advise you to create a separate rel=”prev”/”next” sequence for every brand filter.

We’ll have pagination pages linked with rel=”prev”/”next” for every brand:
page1.html?brand=nike
page2.html?brand=nike
page3.html?brand=nike
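For instance, the middle page of the Nike sequence would carry:

<link rel="prev" href="http://www.site.com/page1.html?brand=nike">
<link rel="next" href="http://www.site.com/page3.html?brand=nike">

and a parallel, completely independent sequence would exist for ?brand=adidas, ?brand=reebok and so on.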

Sorted content and rel=”prev”/”next”

The final kind of URL parameter that we’ll discuss is content sorting. Even though you’ll mainly find it on blogs or forums, it can also be present on e-commerce websites.
For example:

Page 1: http://www.site.com/page1.html?order=oldest
There can also be an option to view the newest products (items) first.
Page 1: http://www.site.com/page1.html?order=newest

There are, of course, debates in the SEO community regarding the best tactics for handling a situation like this. Some would recommend splitting the rel=”prev”/”next” sequences into one for “newest” and one for “oldest”, in which case Google would have to index two sets of pagination pages with the same content, which could lead to serious duplicate content problems.
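To illustrate the tactic described above (a sketch of that approach, not our recommendation), the split sequences would look like this:

page1.html?order=newest -> page2.html?order=newest -> page3.html?order=newest
page1.html?order=oldest -> page2.html?order=oldest -> page3.html?order=oldest

Both sequences contain exactly the same products, only in reverse order, which is where the duplication risk comes from.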

Our advice is to keep the navigation as simple as you can and eliminate the hassle both for users and for search engines. Even though many of these methods may seem complicated at first, studying and applying them to each separate case can make the SEO specialist’s job much easier.