
Technical SEO

Use a suitable domain

Domain

To say it right away: the domain has no decisive influence on SEO.

Nevertheless, you should choose the domain carefully with the user, and thus potential customers, in mind. It should relate to the company. This can be achieved, for example, through your own brand name, which creates a sense of familiarity, or through thematic relevance to the company. Both can also be combined!

Ideally, the domain should be short and easy to remember. A short domain also looks nicer on business cards or other printed matter.
So-called keyword domains, which contain certain keywords, no longer play a role in the ranking. The search engines generate the keywords from the content of the website.

Top Level Domain (TLD)

In terms of SEO, it hardly matters which TLD you use. The classic TLDs (.com, .org, etc.) are treated in the same way as the generic TLDs (e.g. .club, .shop, etc.). If the TLD contains a keyword, this is also irrelevant for the search engines.

However, it looks different with the country-specific TLDs! A Top Level Domain such as ".de" is rated higher in Germany than in South Africa. Regional domains such as ".berlin", on the other hand, are rated in the same way as .com domains.

 

Use SSL certificate

If you use an SSL certificate for your website, the data transmitted between the browser and the server is encrypted. This makes the website more secure, which in turn creates trust among website visitors.

Some browsers now warn when a website is insecure. There is then a high risk that website visitors will immediately leave again. Search engines like Google classify a website as insecure, which results in a poorer ranking. An SSL certificate is therefore highly recommended.
There is also the fact that, for data-protection reasons, data in contact forms may only be transmitted in encrypted form!

A secure website has the prefix https (Hypertext Transfer Protocol Secure) in the URL. If the website is insecure, the "s" is missing and the prefix is just http.

There are different SSL certificates, e.g. for 1 domain, for 1 domain and its subdomains, for multi domains as well as certificates with different validations or covered damage amounts, and much more.

Nowadays, an SSL certificate uses the TLS protocol (Transport Layer Security), which is the successor to the SSL protocol (Secure Sockets Layer).
Since the abbreviation "SSL" is better known than "TLS", hosters still use the term "SSL", sometimes also "SSL/TLS".

The current TLS version is 1.3. The outdated protocols SSL 2.0 and SSL 3.0 are insecure and must no longer be used! TLS 1.0 and 1.1 are also no longer supported.
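On an Apache web server, the permitted protocol versions can be restricted in the TLS configuration. A minimal sketch, assuming Apache 2.4 with mod_ssl and a sufficiently new OpenSSL; the vhost itself is hypothetical:

```apache
# Hypothetical vhost snippet: allow only the modern TLS versions
<VirtualHost *:443>
    SSLEngine On
    SSLProtocol -all +TLSv1.2 +TLSv1.3
</VirtualHost>
```

With shared hosting, the hoster sets this for you; the snippet only illustrates what "outdated protocols disabled" means in practice.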

 

Optimize URL structure

The next thing to do is to optimize the URL structure. A meaningful URL gives visitors a first impression of what to expect on a page, e.g. the contact details under the URL "www.joominator.de/de/kontakt". Keywords can also be used here. However, you shouldn't overdo it with the keywords, and you should keep the URLs as short as possible.

In Joomla you can optimize the URL structure by activating the "Search Engine Friendly URLs"! Joomla offers several variants here.

First go to the following page in the backend of Joomla:
"System" -> Setup: "Global Configuration" - TAB "Site"
or alternatively via the dashboard!

Technical SEO in Joomla

Without Search Engine Friendly URLs

Without any activation, neither the website visitor nor a search engine can see what is hidden behind the URL. In Joomla it looks like this, for example:
https://example.org/index.php?option=com_content&view=category&layout=blog&id=3&Itemid=102
 

Search Engine Friendly URLs

In the first step to optimize the URL, set the option “Search Engine Friendly URLs” to “Yes”. This means that the alias of the menu item is also used in the URL. The URL is now much shorter:
https://example.org/index.php/blog
 

URL Rewriting

In the second step, the "/index.php" is removed from the URL. To do this, "Use URL Rewriting" is set to "Yes"!

Important: The option "Use URL Rewriting" only works if you also rename the htaccess.txt file supplied with Joomla to .htaccess! This file contains rules with which the server rewrites the URLs internally, invisibly to the visitor.
If this renaming is not carried out, the subpages (menu items) can no longer be reached after activating "Use URL Rewriting" (404 error). The correct spelling is important so that the web server can process the file: a dot in front of the file name and no file extension!
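The core of these instructions is a small set of mod_rewrite rules. A simplified sketch of what Joomla's htaccess.txt does (the real file contains additional security rules):

```apache
RewriteEngine On
# If the requested path is not an existing file or directory ...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# ... pass the request internally to index.php
RewriteRule .* index.php [L]
```

Joomla's router then resolves the friendly URL to the correct component and view.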

The renaming is done via FTP or a web interface (e.g. WebFTP) of the host. In the first case, FTP programs such as FileZilla are suitable; always use the current version! Since the .htaccess begins with a dot, it may no longer be visible after renaming. However, this can usually be changed in the program used, for example by activating "Show hidden files"!
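If you have SSH access to the web space instead, the renaming is a one-liner; the path to the Joomla root is of course only an assumption:

```shell
cd /var/www/joomla        # hypothetical Joomla root directory
mv htaccess.txt .htaccess # rename: leading dot, no file extension
ls -a                     # ".htaccess" only shows up when hidden files are listed
```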

It is possible that a .htaccess already exists. Instructions such as the PHP version to be used, 301 redirects, cache instructions or HTTP security headers can also be entered there. In this case, the .htaccess must be supplemented accordingly!

Note: The previous description applies to an Apache web server. If an IIS7 web server is used, the "web.config.txt" has to be renamed to "web.config" instead of "htaccess.txt". The following applies to both server types: the URL rewrite module must of course be installed on the server so that rewriting can work at all.

With "URL Rewriting" and active .htaccess, the URL now has the following structure:
https://example.org/blog
 

Add Suffix to URL

If this option is activated ("Yes"), a file extension suitable for the document type is appended to the URL, typically ".html". The URL could then look like this, for example: https://example.org/faq.html
This has no advantage in terms of SEO. On the contrary: it is advisable to leave this option deactivated ("No")!
 

Unicode Aliases

In principle, umlauts and other special characters can also be used in the URLs of a Joomla website. All you need to do is activate the "Unicode Aliases" option ("Yes")! A URL could then look like this, for example: https://example.org/häufige-fragen

If this option remains deactivated ("No"), Joomla rewrites the "ä" as "ae". The same URL would then look like this: https://example.org/haeufige-fragen

According to Google, these two variants are treated equally. With international, multilingual websites, however, there may be problems with the links. It is therefore advisable not to use Unicode aliases!
 

Sitename in Page Titles

This option has no effect on the URL structure. Still, it's interesting because it can automatically add the website name to the page title displayed in the browser. The website name can be added before or after the page title. Examples:
"example.org - Contact"
"Contact - example.org"

Note: The website name that is entered in the configuration under "Site Name" is used!

 

Domain with or without "www"?

The question arises again and again whether one should use a domain with or without www. From an SEO point of view, it does not matter; both variants are treated the same by search engines. It is only important that you choose one variant and then use it consistently. This is then your preferred domain.

Important: You should always forward the unused variant to the preferred domain using a 301 redirect. This enables the search engine bot to index the website better, as it does not have to choose between two variants. Furthermore, so-called internal "duplicate content" is avoided. In the chapter "OnPage Optimization" you will find further information on the topic "Duplicate Content"!
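In the .htaccess, such a 301 redirect can be set up with mod_rewrite. A sketch that forwards the non-www variant to the www variant; example.org stands in for your own domain:

```apache
RewriteEngine On
# Permanently (301) redirect requests without "www" to the preferred www domain
RewriteCond %{HTTP_HOST} ^example\.org$ [NC]
RewriteRule ^(.*)$ https://www.example.org/$1 [R=301,L]
```

If you prefer the variant without www, simply swap the two host names.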

 

Reduce Loading Time / Increase Page Speed

The page speed of a website is a very important criterion for an optimal ranking. You should fundamentally examine how quickly a website loads on a PC, a tablet and a smartphone. Search engines increasingly use the page speed of the mobile website as the primary criterion.

Tip: To test the page speed, tools like "Google PageSpeed Insights", "Pingdom Tools" and others can be used!

There are a number of ways you can optimize the loading speed:

Choose a suitable hoster and tariff

A hoster should use modern, fast and secure technology!
Depending on the tariff, a different number of customers are hosted on one web server. More customers usually mean more websites and therefore more hits, which worsens performance.
If you build your website with Joomla, you should look for a Joomla-compatible hoster. The same applies, of course, to other CMSs.

Use http/2

If available, the newer network protocol http/2 should be used instead of http/1.1. This is usually activated automatically by the hoster. If a browser does not support this protocol, it automatically loads the website via http/1.1; backward compatibility is therefore guaranteed.
Thanks to so-called multiplexing, http/2 can transmit the data in parallel over a single connection, while http/1.1 requires several TCP connections for CSS, JS and image files. Furthermore, http/2 transmits the header data in compressed form (HPACK). A number of other advantages will not be discussed at this point.
In contrast to http/1.1, the http/2 protocol can significantly increase the loading speed of a website. Most browsers now support it.
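If you administer the server yourself, http/2 can be enabled in Apache 2.4 with a single directive, assuming mod_http2 is loaded; with most hosters this is already done for you:

```apache
# Prefer h2 (http/2 over TLS), fall back to http/1.1 for older clients
Protocols h2 http/1.1
```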

Activate cache

There are various ways of caching data so that data does not have to be reloaded every time a website is called up. A cache is nothing more than a buffer memory. A distinction is made between different types of caches:

  • Joomla Cache: Joomla itself offers various options for caching data: conservative caching, progressive caching and page caching. Which method you should use depends on the respective website. When a URL is called up again, Joomla loads the data directly from the Joomla cache. If you use the page cache, even entire pages are cached. This means that they do not have to be regenerated from the database every time. The use of the page cache is not recommended for interactive websites (e.g. also with a shopping cart function), as the content changes and must be displayed immediately.
  • Server Cache: A server cache, such as the OPCache, means that PHP scripts do not have to be recompiled each time they are called. These are loaded into the memory and always fetched from it as long as they have not changed. This increases the performance. Caching can be implemented using RAM or files.
  • Browser Cache: Copies of resources that have already been retrieved from the web are stored locally in the browser's cache. These can be, for example, CSS, JS or image files. If the required resources are already in the browser's cache when a URL is called up, they are loaded directly from it. This significantly reduces the time it takes to download all parts of a website.
    The browser cache can even be controlled via certain instructions.
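Such browser-cache instructions can be set for Apache in the .htaccess via mod_expires, for example. A sketch with hypothetical lifetimes; sensible values depend on how often the resources actually change:

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Images rarely change: cache for one month
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    # CSS and JS change more often: cache for one week
    ExpiresByType text/css        "access plus 1 week"
    ExpiresByType text/javascript "access plus 1 week"
</IfModule>
```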

Minimize the number of http requests

When the browser sends a request to the web server, this is known as an http request. The "GET" and "POST" methods are mostly used here. Depending on the network protocol and browser used, several requests can usually be processed in parallel. Nevertheless, it is advisable to minimize the number of requests. This can be achieved, for example, by combining several CSS or JS files into one file. This means that only one request is sent to the server instead of one for each file. The number of requests is an important ranking factor for search engines. For Joomla, CSS and JS files can be combined and minimized at the same time using the JCH Optimize tool, see next point!

Optimization of CSS and JS

A website usually has to load a large number of CSS and JS files, some of which can also be quite large. This is at the expense of performance. There are several ways to minimize the files:

  • Avoid inline styles
  • Use shorthand notation (e.g. #ccc instead of #cccccc)
  • Remove comments
  • Remove spaces
  • Write everything in one line

Most editors and various other tools offer a function for this (compression) that does all of this automatically. As already mentioned, there is, for example, the extension "JCH Optimize" for Joomla.
It is best to add ".min" to the name of optimized files so that it is easier to distinguish between optimized and non-optimized files.

Tip: A formatted and uncompressed file is much easier to edit because of the better overview. You should only optimize it after processing!
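Some of the listed steps can be sketched in a few lines of code. This is only an illustration of the principle; real minifiers such as the one in "JCH Optimize" handle far more edge cases:

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minification: remove comments and superfluous whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # strip spaces around punctuation
    return css.strip()

print(minify_css("/* header */\nh1 {\n  color: #cccccc;\n}"))
# → h1{color:#cccccc;}
```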

Optimize the loading time of the images

If you use huge image files on your website, you significantly increase its loading time. You should reduce the image size to the size actually displayed!

Let's assume you take a photo with a size of 3000 x 2000 pixels. On the website, however, it should be displayed at a size of only 600 x 400 pixels. Due to the file size, the website takes significantly longer to load the large image than the small one. You should therefore scale the image to the ideal size beforehand using an image editing program or an online tool and only then insert it into the website. If several non-optimized images are used on a page, loading is slowed down considerably and scrolling may become jerky for the visitor.
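The effect of scaling is easy to estimate: the number of pixels, and thus roughly the amount of image data, shrinks quadratically with the edge lengths.

```python
# 3000 x 2000 photo vs. the 600 x 400 pixels actually displayed
original = 3000 * 2000    # 6,000,000 pixels
displayed = 600 * 400     #   240,000 pixels
print(original // displayed)  # → 25: the unscaled image carries ~25x more pixel data
```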

The file size of an image can also be influenced by the image format (png, jpg, etc.) and the compression used.

But what is the ideal image size?

Websites are displayed on different devices (PC, tablet, smartphone) with different resolutions. You should adjust images to the maximum displayed size. Finally, you should definitely check the quality of the images on the various devices!

Use Lazy Load for images

You can rarely do without images on a website. As a rule, however, they take up the largest volume of data on a website. However, there is a possibility to delay the loading of images (Lazy Load).

If, for example, several images have been inserted at the end of a website that can only be viewed after scrolling down, these images do not necessarily have to be loaded first. "Lazy Load" ensures that only the main content is initially loaded and displayed. That happens relatively quickly. Only when the user scrolls down and the images come into view are they reloaded. "Lazy Load" is a very effective method of reducing the loading time of a website, especially for long pages with many images.

There are plugins for Joomla 3 that can reload images later. Some of these are already integrated in frameworks or editors.
In Joomla 4, "Lazy Load" was already integrated into the Joomla core, so that no additional extensions need to be installed.

If you insert an image via the TinyMCE editor, proceed as follows:

  • Click the button "CMS Content" and then select "Media"!
  • Select the image you want to insert!
  • Indicate whether the image should be loaded with lazyload or not!
  • Finally click on "Insert Media"!

The parameter loading="lazy" is now automatically added to the image.

If you want to undo the lazy loading of an image, simply remove loading="lazy" in the source code! Conversely, if an image that was inserted without lazy loading should be loaded with a delay afterwards, simply add the parameter to the <img> tag or insert the image again!
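In the HTML source, the result looks like this; file name and dimensions are of course only placeholders:

```html
<!-- Image as inserted via "CMS Content" -> "Media" with Lazy Load enabled -->
<img src="/images/sample.jpg" alt="Sample image" width="600" height="400" loading="lazy">
```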

Lazy Load

Note: "Lazy Load" cannot be activated via "Insert/Edit Image"!

Important: "Lazy Load" also has a disadvantage: search engine crawlers may not capture lazily loaded content during indexing. It is therefore advisable to use "Lazy Load" only for less relevant images in order to reduce the data transfer!

Analyze Core Web Vitals

Google analyzes the so-called "Core Web Vitals" as an important ranking criterion! You should therefore take a closer look at them. However, since this topic is very extensive, a separate section is dedicated to it:

 Core Web Vitals

Possibly use CDN

CDN stands for "Content Delivery Network": a network of servers (proxy or edge servers) at various locations around the world. If you use a CDN for your website, the data is automatically loaded from the server that is closest to the visitor, so content reaches website visitors much faster. Using a CDN also improves security; however, this will not be discussed at this point. Whether a CDN makes sense depends on the website itself.

The implementation of CDN is now much cheaper than it was 20 years ago. There are a number of providers: KeyCDN, Cloudflare, Amazon CloudFront, Azure (Microsoft) and a few more.

However, using a CDN is not entirely trivial and requires some knowledge. Some CDN providers have very good documentation.

 

Use clean code

Valid source code is important for the ranking of a website: no user likes to use a website with errors, and no search engine recommends faulty websites. Furthermore, a website should not contain more code than necessary, as this lengthens the loading process.

 

Create an XML Sitemap

A website usually consists of a large number of pages (URLs). Ideally, these are also well linked to one another. This allows the search engine bots to find and index these pages relatively easily. Nevertheless, the search engine operators recommend creating a so-called XML sitemap.

An XML sitemap contains all pages (URLs) of a website. This means that even pages (URLs) that are poorly linked or not linked at all can be crawled and indexed. The XML sitemap is typically located in the root directory of a website and is read regularly by search engines. You can also submit it, for example, in the "Google Search Console" (GSC). In addition, the link to the sitemap can be entered in the robots.txt.

An XML sitemap is a text file in XML (Extensible Markup Language) format.

The XML sitemap is not to be confused with the HTML sitemap, which shows the website visitor the pages (URLs) directly in the browser and can thus improve user guidance! The HTML sitemap is hardly used any more these days.
You can now generate your own XML sitemaps for images, videos and news.

It is important to note the structure of an XML sitemap so that the web crawlers can read the URLs correctly. If you submit this to the GSC, for example, the sitemap is automatically checked.

A typical XML sitemap has the following structure:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
 
 <url>
   <loc>https://www.example.org/</loc>
   <lastmod>2021-12-01T18:35:48+00:00</lastmod>
 </url>
 
 <url>
   .....
 </url>

</urlset>

Within <url>...</url> the individual elements for the respective URL are specified. Here the elements mean the following:

  • <loc> : URL
  • <lastmod> : The date and time the URL was last modified
  • <changefreq> : Approximate frequency of changes to the URL, e.g. always, hourly, daily, weekly, monthly, yearly, never
  • <priority> : Importance of a URL within the domain (0.0 - 1.0). Default: 0.5. However, priority has no relevance for Google.

 
Note: If you want to create an XML sitemap for a multilingual website, there are additional elements, as you have to specify all the corresponding URLs of the other languages for each URL. Due to the complexity, this cannot be elaborated on here.

Tip: There are XML sitemap generators that search a website and automatically write the URLs to the file in the correct form. Some of these generators can be set very precisely in order to find URLs that are not well linked. The XML sitemap generators are particularly recommended for large, complex or multilingual websites!
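The principle of such a generator can be sketched in a few lines with Python's standard library; the URL list is of course hypothetical:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap from (loc, lastmod) pairs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + tostring(urlset, encoding="unicode")

print(build_sitemap([("https://www.example.org/", "2021-12-01")]))
```

A real generator would additionally crawl the site to collect the URLs; here they are passed in directly.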

 

Customize robots.txt

With the help of the "robots.txt" file, you can tell a search engine which pages it should not crawl (Disallow). Exceptions can be defined (Allow), and different instructions can be given for different search engines. The robots.txt is basically a text file with a specific structure and is located in the root directory of a website. When using Joomla, it usually only needs to be adjusted slightly. If the robots.txt is not optimally adapted to a website, it can in the worst case even prevent a crawler from crawling a page (URL) and adding it to its search engine's index. You should therefore always check the robots.txt carefully and correct it if necessary!

Note: The search engine bots do not necessarily have to adhere to the information in the robots.txt. The reputable search engines such as Google, Bing and many others do that, however.

Example using the robots.txt from Joomla:

User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/

Due to the specification User-agent: *, all search engines are addressed.
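Allow exceptions and bot-specific rules are written as separate groups. A hypothetical addition ("SomeBot" is an invented name):

```
# Block one specific bot completely
User-agent: SomeBot
Disallow: /

# All other bots: keep /tmp/ blocked, but allow one subdirectory again
User-agent: *
Disallow: /tmp/
Allow: /tmp/public/
```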

 

Protect the website from certain types of access

It can happen that a high number of accesses also generates a high level of "background noise". This is particularly disadvantageous if many hits on the website are of no use at all. High "background noise" can cause performance to deteriorate, which in turn could lead to a poorer ranking. If this is the case, there are several ways to prevent it:

It is very effective to block the affected IP addresses or user agents with appropriate entries in the .htaccess. These accesses are then blocked directly by the server before the CMS, such as Joomla, and its database are even accessed. Note that the user agents can change at any time, so a regular check is recommended; the corresponding log files must be evaluated for this, which can be very time-consuming.

Tip: With regard to user agents, it can make sense to first add a corresponding entry to the robots.txt, since most bots already adhere to this instruction. Only in stubborn cases should you implement the block via .htaccess. In any case, you should define an exception in the .htaccess so that access to the robots.txt itself is always possible.
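Such a block can be sketched in the .htaccess with mod_rewrite; "SomeBot" is again only a placeholder for the user agent actually observed in the log files:

```apache
RewriteEngine On
# Always leave robots.txt reachable
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Refuse (403) all other requests from the unwanted user agent
RewriteCond %{HTTP_USER_AGENT} SomeBot [NC]
RewriteRule .* - [F,L]
```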

 

Website must be crawlable and indexable

In order for a search engine crawler (bot) to read and analyze a website, it must be able to access all of its content (links, images, scripts, etc.). That is the prerequisite for a page being indexed. Only then is it included in the search engine's index and can be displayed in the search results (SERPs).

There are a number of factors that can greatly affect the crawling and indexing of a page (URL):

  • Server errors
  • Blocking in robots.txt or .htaccess
  • Problems with redirects or 404
  • Bad page and URL structure
  • Parts of a website, such as scripts, CSS files etc., are not loaded
  • Website was hacked or is unsafe
  • Website is on a blacklist

 
There are several online tools you can use to check your website for problems like these! It also makes sense to register with the "Google Search Console" (GSC), for example! The website is then regularly analyzed, and problems are automatically reported by email.
