Successful Homepage Optimization for Google
What are the most effective SEO levers and components of OnPage optimization?
Good planning and implementation are essential in SEO: ideally, these factors are already taken into account when the website is created, but even in retrospect, small changes can often achieve great results. It is often claimed that good OnPage optimization is very complex. Yet with patience, perseverance and our SEO tips, improving your search result positions can succeed. Here you will learn about the most effective levers for optimizing your homepage for Google:
What is OnPage Optimization?
Besides OffPage optimization, OnPage SEO is the second pillar of successful search engine optimization. The aim is to improve search engine rankings in the long term through technical, content-related and structural adjustments to your own website, and to optimize the relevance, structure and crawlability of your homepage for Google. OnPage optimization is also the prerequisite for successful OffPage search engine optimization, as building backlinks to pages that are not optimized accordingly is not very efficient. Users and Google should be provided with a website that precisely meets their respective search intentions and optimally satisfies their needs. Perhaps the visitor does not want information at all, but is already informed and wants to buy a specific product directly. Or the user wants to contact your company and is searching for your contact details, such as a telephone number, a contact form or an e-mail address. There are thus different motives, which are reflected in the search behaviour of your visitors. They are divided into the following types of search queries:
- informational,
- transactional and
- navigational search queries, which also include brand queries.
This distinction should be kept in mind during SEO keyword research. To understand the intention of the users, you have to put yourself in their perspective. In practice, however, this is difficult to impossible, because there is rarely enough time and resources to explore the intentions of a large group of people for every single keyword. It therefore pays off for your SEO to take a close look at the organic search results in Google.
Exactly how to identify and analyze the search behavior of your visitors would go beyond the scope of this article, but we will cover it in a separate article at a later date.
1. Meaningful and intuitive website structure and internal linking
Each of us wants to get from A to B quickly and without detours. This applies not only to walking or driving, but even more so to the Internet. Users should be shown the fastest and easiest way to the desired product or service. The same applies to the Googlebot, which moves from link to link on your homepage. Internal linking is therefore an important SEO component for optimizing your homepage successfully for Google. The individual pages should be logically linked to each other so that users and crawlers can find their way around easily. If a page sits in a poor structure, it is difficult for the Googlebot to find it and to include or update it accordingly. An intuitive and meaningful navigation structure is an absolute MUST, because otherwise users are quickly confused and leave the website in no time.
There are some approaches to this:
Vertical and horizontal navigation structure:
Over time, two types of navigation have become established: in vertical navigation, the navigation bar sits in the left-hand area; in horizontal navigation, it sits at the top. The disadvantage of horizontal navigation is that the available space is quickly exhausted, while the vertical variant can be extended as required.
In addition, many websites also have a footer navigation at the bottom of the page, in which all or the most important items of the main menu are listed again.
Primary and secondary navigation
This combines vertical and horizontal navigation. The primary navigation contains the classic main menu items such as “Company”, “Services” and “Contact”. When the user hovers over or taps these items, subordinate items appear in a secondary navigation. This variant is similar to the classic drop-down menu. For the sake of clarity, no more than two navigation levels should be linked together.
Example: primary and secondary navigation
Burger navigation
The tasty-sounding name goes back to the three stacked lines: some may see in them a patty lying between two halves of a bun. This is a variant of hidden navigation, i.e. the menu items only become visible through an explicit user action, e.g. by tapping the burger icon. This type of menu is particularly popular with mobile users, while desktop users often experience usability problems, because the small icon can quickly disappear among the other elements and is therefore difficult to find.
SEO-TIP: No matter which kind of navigation you decide on, additionally using a breadcrumb navigation is a good idea. It offers users even more orientation: they know at any time where they are in the hierarchy of a domain. This structure not only contributes to easy, intuitive usability, but also supports search engine optimization.
SEO-TIP: For better orientation and clearer topic structure in online shops, we recommend working with superordinate categories and subpages.
2. Make contents accessible with internal linking
Optimizing internal linking is a very powerful SEO tool and is among the most important ranking factors in the Google algorithm. The Googlebot (crawler) calls up individual pages and follows the links on those pages. In this way it gets from page to page, examines the content and stores information on Google's servers. Since no spam filter is known for internal links – in contrast to external links – the anchor text can be chosen to be keyword-relevant. This in no way contradicts the Google guidelines; it is even recommended on the Google blog. Although internal linking is a significant factor in the OnPage optimization of homepages, most homepages do not exploit this possibility sufficiently or consistently. Note that no page that should rank well in the search results should be more than three clicks away from the start page. For smaller sites, it should ideally be only one click. The easiest way to achieve this is to link to the most important product or service pages in the footer. The subpages should also link back to the start page, e.g. from the logo. In addition, only one incoming anchor text should be used per subpage, and only one subpage per anchor text. A link with the anchor text “Products” should therefore point to the product page of the website and not to any other page.
The strategy is crucial for search engine optimization. Depending on which keywords and which pages (e.g. category, overview or product detail pages) are supposed to rank, the strategy – and especially the internal linking of the website and the selection of anchor texts – must be adapted. For example, if certain products are supposed to rank, they should be linked with the correct anchor text in the footer, or the most important products should be presented on the start page. The products should then also be accessible within a maximum of two clicks. If necessary, it is also advisable to create a product sitemap.
TOP TIP: With SEO tools such as Screaming Frog (which can even be used free of charge for websites with fewer than 500 URLs), you can see the average click depth of a website. If you find that a large part of your content is only reachable with more than 3-4 clicks on average, you should optimize the internal linking of your homepage.
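The click-depth idea can also be sketched in a few lines of code: a breadth-first search over the internal link graph yields the number of clicks from the start page to every subpage. The link graph below is a made-up example; in practice you would build it from a crawl export.

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
# In practice, build this from a crawl (e.g. a Screaming Frog export).
links = {
    "/": ["/products", "/about", "/contact"],
    "/products": ["/products/chairs", "/products/tables"],
    "/products/chairs": ["/products/chairs/green-chair"],
    "/products/tables": [],
    "/about": [],
    "/contact": [],
    "/products/chairs/green-chair": [],
}

def click_depth(graph, start="/"):
    """Breadth-first search: clicks needed from the start page to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depth(links)
# Pages deeper than three clicks are candidates for better internal linking.
deep_pages = sorted(p for p, d in depths.items() if d > 3)
```

In this toy graph the deepest product sits three clicks from the start page, so nothing lands in `deep_pages`; on a real site, anything that does is a candidate for an extra footer or category link.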
3. SSL certificate as a signal of trust for users and as a ranking factor for Google
As early as 2014, Google announced that the SSL certificate would be included in the ranking. At the latest since the GDPR (DSGVO) came into force in May 2018, all homepages that transmit user data or have contact forms must have switched to HTTPS. The certificate ensures that a secure connection is established between the web server and the browser, and that confidential user data cannot be intercepted and maliciously processed by third parties. Users can recognize this by the lock icon on the far left of the browser's address bar. There are three security levels with different requirements; if you click on the lock icon, you get more information about which certificate is in use. For search engines, however, this distinction has so far made no difference.
4. Metadata: Optimize Titles and Descriptions for higher click rates
Above all, the targeted search terms and synonyms should appear in meaningful meta descriptions and title tags. In the past, the description was a good SEO opportunity to influence positioning directly. For years now, however, Google has not used the meta description as a ranking factor unless it stands out negatively: if it is stuffed with too many manipulative keywords, it is considered a spam attempt. If it is attractively written and contains the search term, it boosts the click rate on the result.
SEO-TIP: For some time now, Google has been constantly changing the descriptions: the displayed length keeps changing, and Google increasingly takes content directly from the page. Featured snippets appear as a direct answer in front of the organic search results – at “Position 0”! On the one hand, they give the decisive advantage in voice search; on the other hand, they have a very positive effect on the CTR of the page used for the answer. The other nine results, by contrast, lose a lot of clicks.
TOP-TIP: More tips for descriptions and how best to generate featured snippets can be found in our current blog post.
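A simple pre-check of titles and descriptions can be automated. The character limits below are rough approximations (Google actually truncates by pixel width, and the displayed length keeps changing), so treat them as illustrative thresholds, not official values.

```python
# Approximate display limits; Google truncates by pixel width, so these
# character counts are only rough rules of thumb, not official values.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

def check_snippet(title: str, description: str, keyword: str) -> list:
    """Return a list of hints for a title/description pair."""
    hints = []
    if len(title) > TITLE_MAX:
        hints.append("title may be truncated")
    if len(description) > DESCRIPTION_MAX:
        hints.append("description may be truncated")
    if keyword.lower() not in description.lower():
        hints.append("description does not contain the target keyword")
    return hints

hints = check_snippet(
    "Buy green tables online",
    "Solid wood green tables in many sizes, with fast delivery.",
    "green tables",
)
```

Run over a full URL export, a check like this quickly surfaces pages whose snippets are overlong or miss their target keyword.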
5. Check crawling and indexing in the Search Console

You can see whether errors occur during crawling and indexing in the Search Console. Google's URL Inspection tool provides information about the indexed and live versions of a URL.
Google is known to have problems with Single Page Applications (SPAs). In this type of web application, content on the page is constantly reloaded without the URL changing. If something is not loaded initially, it usually remains invisible to the bot.
6. Sitemap for smooth indexing
Once the homepage is finished, it is recommended – especially for large homepages – to create an XML sitemap. It helps with smooth indexing and is particularly recommended for new sites that are hardly linked from external sites, as they may otherwise not be found by crawlers at all. Google provides more information about sitemaps in its documentation.
SEO-TIP: From the XML sitemap, the crawler recognizes the structure of a homepage more quickly and also finds the subpages of a domain more easily. To ensure that all pages that are supposed to be indexed really are indexed, providing a sitemap is highly recommended!
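For illustration, a minimal XML sitemap can be generated with a few lines of Python's standard library; the URLs below are placeholders for your own pages.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (loc only, no lastmod/priority)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs; list every page you want indexed.
sitemap_xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products",
])
```

The resulting file is typically saved as `sitemap.xml` in the web root and submitted via the Search Console.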
7. Crawling control and robots.txt
To control crawling, it is important to understand how search engines work. Search engines like Google or Bing do not start searching only when users submit a query; they use an algorithmic process that determines when and how many pages of a homepage are accessed. After each crawl, the Google index is fed with new websites, updated pages and new content, while outdated links are corrected. This crawl budget is not the same for all websites, but depends on a number of factors. In addition to links, metadata tells the crawler which pages should not be indexed.
The small text file “robots.txt” is a piece of technology you should understand if you do search engine optimization. With robots.txt you steer the search engine crawlers on your own website. Or, put more precisely: in robots.txt you can ask the search engine bots to leave certain areas of the website alone. However, there is no guarantee that the search engines will comply with these prohibitions. Google almost always respects the instruction not to crawl a URL; Bing does not always do so. If links point to the URLs, they may still end up in the Google index. It is very important that no errors occur when creating robots.txt, because mistakes can make important pages or entire websites inaccessible: the URLs are not crawled at all and therefore cannot appear in the search engines' index. With this crawling control, the work of the search engine bots can be steered and the crawl budget used optimally.
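A small example: the hypothetical robots.txt below asks all crawlers to stay out of an internal area, and Python's standard `urllib.robotparser` shows how a well-behaved bot would interpret it.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt; the paths are placeholders, and compliance
# by crawlers is voluntary, as noted in the text above.
robots_txt = """\
User-agent: *
Disallow: /internal/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant bot may fetch regular pages but not the internal area.
allowed = parser.can_fetch("Googlebot", "https://www.example.com/products")
blocked = parser.can_fetch("Googlebot", "https://www.example.com/internal/report")
```

Testing rules this way before deploying them helps avoid the situation described above, where a typo in robots.txt locks the crawler out of important pages.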
The meta tag “noindex” tells a search engine robot that the visited page should not be included in the index. SEOs and webmasters can thus actively influence which URLs should be indexed and which should not. This is supplemented by the attributes “follow” and “nofollow”. To check whether the meta tag is read and observed, a site query for the page in question can help. The meta tag in conjunction with the attribute “nofollow” can be used, for example, for duplicate category pages, copyrighted content or paginated pages.
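Whether a page carries such a robots meta tag can also be checked programmatically; here is a small sketch with Python's built-in HTML parser over an illustrative document.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of a <meta name="robots"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")

# Illustrative page that asks bots not to index it but to follow its links.
html_doc = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html_doc)
is_noindex = parser.robots is not None and "noindex" in parser.robots
```

Run over a crawl of your own site, a check like this reveals pages that carry an accidental “noindex”.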
SEO-TIP: Every human web user, search engine crawler and other crawling tool leaves characteristic traces, but errors often occur during crawling. These crawling errors are displayed in the Search Console, Googlebot activity is shown there in rough form, and robots.txt can be checked. For a detailed analysis, take a closer look at the log files. A log file is a file that records events on computer systems or networks. In this way, the behavior of crawlers can be analyzed, and you can see which pages have been crawled disproportionately often and where problems exist.
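As a sketch of such a log-file analysis: counting which URLs Googlebot requests most often takes only a few lines. The log lines below are fabricated samples in the common Apache combined format; real field layouts vary by server configuration, and a simple user-agent match is only a heuristic (genuine Googlebot verification requires a reverse DNS lookup).

```python
from collections import Counter

# Fabricated sample lines in combined log format.
log_lines = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0200] "GET /products HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:13:55:40 +0200] "GET /products HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '192.0.2.7 - - [10/Oct/2023:13:56:02 +0200] "GET /contact HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]

googlebot_hits = Counter()
for line in log_lines:
    if "Googlebot" in line:
        # The request path is the second token inside the first quoted field.
        path = line.split('"')[1].split()[1]
        googlebot_hits[path] += 1
```

Aggregated over a real log, the counter shows which pages consume crawl budget and which are ignored entirely.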
SEO-TIP: As banal as it may sound: check whether your pages are released for search engines at all. In some CMS systems, a single checkbox – ticked or forgotten – is enough for a page to be removed from the index or never indexed at all (keyword “noindex”). The so-called site query on Google can help here.
8. Fast homepages are preferred by Google
Google wants to provide its users with the most relevant search results possible and a great user experience. This includes speed. Google prefers fast websites so that surfers with a slow Internet connection are not forced to wait. For mobile devices in particular, fast loading times are even more important.
SEO-TIP: It should be noted, however, that loading time plays a rather subordinate role in the search engine optimization of your homepage for Google, and a further improvement (from fast to very fast) does not send any additional positive signals to Google. On large sites, however, long loading times have a negative effect on the crawl budget, and especially for smartphone users, fast loading times create a positive user experience. If the loading time exceeds 3 seconds and the desired destination is not reached quickly enough, the majority of visitors leave the website again. Long loading times in an online shop can therefore act as true “sales killers”: if product images load too slowly or the purchasing process drags on, online shoppers usually do not wait long and abandon the purchase. In this way, conversions from potential customers are given away.
9. Can Google read image content and does it need unique images and videos?
Google also states that visitors should be offered an “outstanding user experience” through alt texts and that the site should be accessible, so that, for example, people with visual impairments can have the image descriptions read aloud to them. This technique will become increasingly important for search engine optimization in the future, because the alt text enables screen readers and voice browsers to read out the informational content of an image. In addition, this effort is also rewarded by Google, especially if the description matches the search intention.
Alt texts, i.e. the alternative descriptions, should relate to the surrounding text and of course to the content of the image, because Google can only “see” to a limited extent what is in the images. If, for example, a green table can be seen in a picture, but the image data and the surrounding text keep mentioning a blue chair, Google is just as unconvinced as your website visitors. The Cloud Vision API can be used to test how much of the image content Google can recognize.
SEO-TIP: By showing up at the top of Google Image Search, you can attract more visitors to the site. In addition, videos increase the time spent on the site and address several senses at the same time! Although Google is not yet able to interpret the content of videos and images optimally, good images and video SEO can help to increase traffic.
SEO-TIP: As a rule, the main keyword of the subpage should be included in the alt tag if it matches the content of the image.
TOP-INFO: Our tests have also shown that Google makes no difference between your own photos and stock graphics from platforms such as Shutterstock, iStock and Co. Nor does it prefer your own – usually very elaborate – videos over embedded YouTube videos. For website visitors, however, it does make a difference: images that have already been used on thousands of pages quickly seem boring or uninspired, so it is better to produce your own material if the resources allow!
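Missing alt texts can be found automatically; the sketch below scans some illustrative markup with Python's built-in HTML parser and lists every image without an alt attribute.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

# Illustrative markup: one image with alt text, one without.
html_doc = (
    '<img src="green-table.jpg" alt="Green wooden dining table">'
    '<img src="decoration.jpg">'
)
checker = AltTextChecker()
checker.feed(html_doc)
```

Pointed at your real templates or rendered pages, the `missing_alt` list gives a simple to-do list for accessibility and image SEO.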
10. Optimize mobile display and roll-out of the mobile index
These days, no site operator can afford to present visitors with a website that is not optimized for mobile. If you have not done so yet, it is high time!
Mobile First: The change of era has taken place. Until now, the desktop version of a URL was used by Google as the main basis for evaluating websites and determining rankings; since the roll-out of the Mobile-First index, the mobile version of a page serves this purpose. This means there is still only one index, but the basis on which pages are evaluated changes. All websites with a separate mobile version should therefore be converted to responsive design. This makes it easier for Google to interpret the same content for both views.
Optimization for mobile devices has developed into an important ranking factor. Therefore, it is now particularly important for optimal search engine optimization to check the performance of your website in the mobile search results. This means that the monitoring should be switched to mobile or additionally introduced.
While many business-to-business companies are still visited by more desktop users than mobile users, end consumers predominantly browse the web on their mobile phones.
SEO-TIP: Use one of the most powerful SEO tools: in the Search Console you get helpful hints about problems with the mobile-friendliness of a website. These should be checked and fixed regularly.
SEO-TIP: When redesigning a website, we recommend that you first design the mobile version and then the desktop version or view.
- Despite Google's statement that normal and “hidden” content are rated equally from the Mobile-First index onwards, our tests showed a different result. Google can partially read “hidden” content in tabs or accordions (which make sense on smartphones), but in our tests it was not indexed. Even content that was already indexed was sometimes removed from the index after later being moved into tabs.
11. Content is King, but SEO is Queen
The “Content is King” mantra is repeated ad nauseam everywhere. But is it really true?
Many bloggers who reject search engine optimization out of ignorance want to make you believe that good content is enough and that good content spreads on its own. This is rarely the case! You need unique content, good marketing AND a homepage that is optimized for Google. Google still loves unique content, but it is similar to chess: without the king, your homepage is nothing, but you also need other pieces such as the queen or a knight to win. Here are some tips, not found in every SEO guidebook, on how you can add value to your content:
SEO-TIP: Google loves articles that are up to date and content that is continuously updated with new information. Old articles should also be updated or deleted over time.
SEO-TIP: We recommend several hundred words per article, so that all keywords can be worked in without keyword spamming. An exact word count is difficult to specify, because Google's algorithms interact in complex ways and do not simply prefer a text of at least length X. Rather, Google respects and rewards unique, high-quality, compelling and well-written content. The primary goal is therefore to create “the best content on a particular topic”. While a product text should have several hundred words, a blog post can have several thousand.
Paragraphs and bullets as well as other formatting elements also help to improve readability for users and search engines.
SEO-TIP: Good search engine optimization of a homepage for Google should be invisible. It should simply ensure that your homepage is found and visited and that the actions desired by the operator are carried out there. So why does SEO leave a bad aftertaste with many people? SEO is often associated with trashy, keyword-stuffed texts and websites. This should be avoided at all costs: write texts that are easily readable for Google AND users alike.
SEO-TIP: Anchor links (jump marks) can also be used within the texts, so that users reach the desired product, service or information even faster – without detours.
SEO-TIP: Interactions! Ask your visitors what they would like to read, for example through comments or surveys. You will also get valuable tips and suggestions on how to improve your content.
SEO-TIP: Videos and pictures should be included. They loosen up the text for the user and improve readability. In addition, images are also rewarded by Google.
12. BONUS: Stand out in search results by using structured data
With structured data, information can be prepared for Google, because Google is simply overwhelmed by some data. The major search engines Google, Bing, Yahoo and Yandex founded the Schema.org project to create a uniform standard for marking up structured data. The formats for Schema.org markup are RDFa, Microdata and JSON-LD.
Example: A long number on a page could be the actual telephone number, but it could, for example, also be mistaken for the VAT identification number. By marking up the information with structured data according to schema.org, the assignment succeeds unambiguously.
Google then has no more difficulty with the data, and as a bonus there are attractive ways to stand out visually in the search results and thus positively influence the click-through rate of your own result.
BONUS-TIP: Attract attention in the SERPs with rich snippets: Schema.org markup is indispensable for rich snippet optimization. More traffic and higher click rates thanks to cleanly formatted and useful listings in the search results: structured data can be used to enable a number of additional elements in the SERPs (rich snippets), such as the display of customer ratings in the form of stars, which above all brings a visual advantage and increases the click rate (CTR).
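As an illustration of the JSON-LD format mentioned above, the snippet below builds a minimal schema.org `Organization` block with a telephone number; the name, URL and number are placeholders.

```python
import json

# Placeholder data for a schema.org Organization.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example GmbH",
    "url": "https://www.example.com",
    "telephone": "+49-30-1234567",
}

# JSON-LD is embedded in the page head inside a script tag of this type.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(organization, indent=2)
    + "</script>"
)
```

The resulting tag is pasted into the page's `<head>`; Google's Rich Results Test can then be used to verify that the markup is read correctly.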