When we talk about organic search, there is a way to divide the discipline into 3 types of optimization (other than the classic technical / content / popularity split!).
One can indeed distinguish on-site, on-page and off-site optimizations.
On-site SEO covers all the SEO optimizations carried out on the website itself, taken as a whole. It includes technical SEO, at the site level, as well as on-page SEO, which is the SEO specific to the pages of a website and their content.
On-site SEO is opposed to off-site SEO (optimizations outside the site itself).
What exactly does On-Site SEO consist of?
Here is a non-exhaustive list of important On-Site SEO criteria for your website. Note that they will not all have the same weight in your SEO.
- The tree structure / site structure
The structure of your site plays a determining role in your organic ranking. It is therefore essential to build a menu consistent with your site's identity and what it has to offer, but also aligned with users and their searches.
Thus, the first-level pages of your site (in particular the main tabs of your menu, which carry significant weight) must take into account the search volume of the topics you wish to highlight.
In any case, to design an SEO-friendly site structure, it is essential to carry out keyword research!
- The depth of the pages
This follows directly from the site structure you have set up: the deeper your pages sit in the tree structure, or the more hidden they are, the less SEO impact they will have.
Make sure your pages are accessible in fewer than 4 clicks from the home page!
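To illustrate, page depth can be estimated from a crawl of your internal links with a simple breadth-first search. The link graph below is a made-up example, not real crawl data:

```python
from collections import deque

def click_depth(links, home):
    """Breadth-first search: minimum number of clicks to reach each page from the home page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first visit = shortest click path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical internal link graph
links = {
    "/": ["/blog", "/services"],
    "/blog": ["/blog/seo-tips"],
    "/blog/seo-tips": ["/blog/old-post"],
    "/blog/old-post": ["/blog/very-old-post"],
}
depths = click_depth(links, "/")
too_deep = [page for page, d in depths.items() if d >= 4]
print(too_deep)  # → ['/blog/very-old-post']
```

Any page that only shows up 4+ clicks deep is a candidate for better internal linking.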
- The existence of zombie pages
What is a zombie page? It is a page that has little or no SEO positioning (and therefore brings in little organic traffic) and is also rarely visited directly on the site.
Why are these zombie pages penalizing for a site and its SEO? Well, search engines are likely to spend time on irrelevant pages when they could have used their crawl budget to explore the genuinely relevant pages of your site.
How to detect zombie pages? Take your Analytics data over one year and export the pages that receive the least traffic. Cross-reference them with the number of words they contain. Sort them carefully: you need to understand the usefulness of each page. If a page is useful, rework it and strengthen its internal linking to bring it back to life. If it is not useful, you can delete it, redirect it...
Otherwise, the RMTech tool makes it very easy to surface them!
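As a sketch of the manual method above, here is how you might cross-reference an Analytics export with word counts in Python. The column names and thresholds are hypothetical, adjust them to your own export:

```python
import csv
import io

# Hypothetical one-year Analytics export: page path, visits, word count
analytics_csv = """page,visits,word_count
/blog/great-guide,5400,2100
/blog/old-news,3,120
/legal/terms,8,900
/tag/misc,1,45
"""

def zombie_candidates(csv_text, max_visits=10, max_words=300):
    """Flag pages with almost no traffic AND thin content: likely zombie pages."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["page"] for r in rows
            if int(r["visits"]) <= max_visits and int(r["word_count"]) <= max_words]

print(zombie_candidates(analytics_csv))  # → ['/blog/old-news', '/tag/misc']
```

Note that /legal/terms is not flagged despite its low traffic: a substantive page can be useful even if rarely visited, which is why the manual sorting step still matters.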
- Loading time
Loading time, according to the SEO community, is said to be a lightweight ranking criterion. However, it will (surely) become more important, especially since everyone agrees that it is crucial from a user experience point of view.
To achieve a good loading time, make sure to:
- Minimize the size of your image files
- Configure your server caching
- Have a server adapted to the size of your site
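One quick, partial check on server caching is to inspect the Cache-Control header your server returns for static assets. A minimal sketch (the header values below are invented examples):

```python
def max_age(cache_control):
    """Extract the max-age value (in seconds) from a Cache-Control header, or None."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return int(directive.split("=", 1)[1])
    return None

# An image served with one year of browser caching
print(max_age("public, max-age=31536000"))  # → 31536000
# No caching directive configured at all
print(max_age("no-store"))                  # → None
```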
- The Https protocol
This too is a criterion that, according to SEO professionals, has little or no impact on websites.
However, there again, https becomes the norm and greatly reassures visitors: it therefore participates in the user experience and can, at its scale, be considered as an on-site criterion.
- 404 errors
To improve the crawl of search engine robots, your site should not contain 404 errors.
404 errors are problematic because they correspond to pages that no longer exist (and therefore return that error code) but are still indexed by search engines.
How to find your 404 errors? By crawling your site (with Screaming Frog, for example).
How to correct 404 errors? By setting up 301 redirects from the old pages to new pages on the same theme.
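The workflow above can be sketched as a small script: given crawl results, list the 404s and check that each one has an entry in a 301 redirect map. All URLs here are hypothetical:

```python
# Hypothetical crawl result: URL -> HTTP status code
crawl = {
    "/old-pricing": 404,
    "/pricing": 200,
    "/old-blog-post": 404,
    "/blog/new-post": 200,
}

# 301 redirect map: each dead page points to a live page on the same theme
redirects = {
    "/old-pricing": "/pricing",
    "/old-blog-post": "/blog/new-post",
}

not_found = [url for url, status in crawl.items() if status == 404]
unfixed = [url for url in not_found if url not in redirects]
print(not_found)  # → ['/old-pricing', '/old-blog-post']
print(unfixed)    # → [] (every 404 has a redirect planned)
```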
- Absence of mixed content
Mixed content occurs when one of your https pages contains a link to a page served over http. This is to be avoided, even if an automatic redirection takes place!
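As a sketch, a page's HTML can be scanned for http:// references with Python's standard html.parser (the page fragment below is invented):

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect http:// URLs referenced from a page assumed to be served over https."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)

# Hypothetical page fragment
html = '<img src="http://example.com/logo.png"><a href="https://example.com/ok">ok</a>'
finder = MixedContentFinder()
finder.feed(html)
print(finder.insecure)  # → ['http://example.com/logo.png']
```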
- Redirection chain
It may happen that your site contains an internal link to a page that no longer exists and has been redirected. You then have 301 redirect codes within the structure of your site: the link from page A leads to page B, which in turn leads to page C.
To optimize the crawl by Google's robots, it is preferable to update the link pointing at the old, redirected URL so that it points directly to the new URL.
You can find redirect chains using Screaming Frog or a Semrush audit.
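The flattening step can be sketched in a few lines: given a redirect map extracted from a crawl, resolve each old URL to its final destination so internal links can point there directly:

```python
def flatten(redirects):
    """Resolve each redirect to its final destination, guarding against redirect loops."""
    flat = {}
    for start in redirects:
        seen, url = set(), start
        while url in redirects and url not in seen:  # follow the chain
            seen.add(url)
            url = redirects[url]
        flat[start] = url
    return flat

# Page A redirects to B, which redirects to C
chain = {"/a": "/b", "/b": "/c"}
print(flatten(chain))  # → {'/a': '/c', '/b': '/c'}
```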
- robots.txt file
Is your robots.txt file compliant? Does it correctly block the resources of your website that should not be crawled? Does it include a link to the sitemap?
Checking that your robots.txt file works properly is important, and you will have to verify it as part of your on-site optimization process!
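You can test your rules locally with Python's standard urllib.robotparser. The robots.txt content below is a made-up example (note that site_maps() requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content
robots_txt = """\
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked resource vs. crawlable page
print(rp.can_fetch("*", "https://example.com/admin/login"))  # → False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # → True
# The sitemap link declared in the file
print(rp.site_maps())  # → ['https://example.com/sitemap.xml']
```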
- The sitemap.xml file
The sitemap.xml file can be considered a directory of the pages of your website - at least the pages you want search engines to crawl first.
It will thus encourage Google and the other search engines to crawl these pages in particular. This has the advantage of speeding up the indexing process!
Of course, the sitemap.xml won't prevent search engines from exploring pages that are not listed there!
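As a sketch, a minimal sitemap.xml can be generated with Python's standard library (the URLs below are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml listing the pages to be crawled in priority."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/blog"])
print(sitemap)
```

A real sitemap can also carry optional tags such as lastmod per URL; this sketch keeps only the required loc element.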
- Mobile compatibility
There is no doubt that, from a user experience point of view, the compatibility of your site with smartphone browsing is very important. And this criterion can now be considered a real SEO criterion!
An on-site criterion, therefore, not to be neglected in the long term.
- The absence of duplicate pages
In SEO, each page of your site must have a very specific purpose and therefore satisfy a user search intention.
To help both Internet users and search engines understand your site, its pages and their topics, it is important that several web pages do not share the same content. Duplicated content can indeed cause cannibalisation between different pages, which will not be favourable to their organic ranking.
How to detect duplicate pages? Here too, the Semrush audit in particular will easily surface them!
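If you want a quick first pass before running an audit tool, a rough sketch is to hash each page's normalized content and group identical hashes (the page contents below are invented):

```python
import hashlib

def find_duplicates(pages):
    """Group pages whose whitespace- and case-normalized content is identical."""
    by_hash = {}
    for url, content in pages.items():
        normalized = " ".join(content.split()).lower()
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        by_hash.setdefault(digest, []).append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical page contents
pages = {
    "/red-shoes": "Our red shoes are great.",
    "/shoes-red": "Our  red shoes are GREAT.",
    "/blue-shoes": "Our blue shoes are great.",
}
print(find_duplicates(pages))  # → [['/red-shoes', '/shoes-red']]
```

This only catches exact duplicates; near-duplicates (same text with minor edits) need fuzzier comparison, which is where audit tools help.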
- And of course, the most important part: On-Page SEO!
Title tag, meta description, H1, Hn tags, content, blog posts, structured data... We tell you more in our article dedicated to On-Page SEO!
So there it is: On-site SEO = the technical part of a site + On-page SEO. Of course, these elements are only some of the ways to optimize a site on-site.
We could also have mentioned the management of canonical tags, the implementation of self-referencing canonical tags, the proper handling of hreflang tags for a site targeting several languages, or the taking into account of Google's new UX metrics (the Core Web Vitals: LCP, FID, CLS)...
Yes, SEO is great 😇