If you need a complete technical SEO tutorial, you came to the right place! Technical SEO is an essential skill no matter what niche your business is in. It's up to you to carefully read everything in this tutorial and apply the tips to your everyday business.

The tutorial is intended both for beginners and for SEO experts who want to expand their knowledge, so if you find something you already know, feel free to skip that section. So let's begin.

What does technical SEO mean?

When someone mentions the word "technical," everyone starts panicking, but it is a simple term: improving the technical aspects of your site. Doing so provides a better user experience and can boost your site's ranking dramatically.

You don't have to be a coder to start working on technical SEO. Still, you will need to read and understand code if you want to work on all of the technical aspects, such as server optimization.

But are these technical corrections worth your time? Even if your website has excellent content that ranks well, technical issues will keep users from focusing on that content and will irritate them.

Technical SEO is one central pillar of every successful website. Its major elements are crawling, rendering, indexing, and site architecture. As Google's algorithms have developed, the technical aspects of SEO have changed as well.

In this article, we'll cover the most critical technical SEO aspects from a beginner's perspective and give you specific steps to fix common problems. To sum it up, technical SEO is everything you do apart from on-page and off-page SEO: making your site load faster on any device, compressing images, optimizing your website for mobile, and so on.
The importance of technical SEO

Before we begin, let us cover the importance of technical SEO in a few sentences. Here are a few reasons that highlight it:

1. Website speed: this is perhaps the most critical aspect of technical SEO, because no one wants to visit a slow, ineffective website.

2. Mobile-friendly approach: In 2021, one might not carry a laptop or tablet everywhere, but a smartphone is a must-have. Even at home, people use their phones daily, and because of that mobility they tend to visit websites from their smartphones. A website that is not mobile-friendly will be ranked lower by the search engine or, worse, not shown on the SERP for the user's query at all.

3. SSL certificate: A working SSL certificate signals to users that the website they are visiting is secure. The certificate is required for a site to move from HTTP to HTTPS, which provides encrypted communication between the browser and the server. This way, site owners can protect sensitive information such as login IDs, passwords, and e-mail addresses from attackers.

4. Better crawling: with good technical SEO, crawlers can follow your link structure faster while crawling your website, and you can tell them specifically which pages to crawl and which to skip.

Consequences of bad technical SEO

There are clear consequences of neglecting technical SEO factors:

If your site takes more than 3 seconds to load, 32% of your visitors will leave.
Your website's conversion rate will drop by 4.42% for each additional second of load time.
If your loading time exceeds 3 seconds, traffic will fall by 40%.
Google doesn't want to serve non-secure content to its visitors, so it will not rank a non-secure website well.
Conversions will be much lower than on well-optimized sites.
Crawling will be slower.

Going forward, we shall discuss all of these topics in detail!

Slow loading of the webpage

If your website loads slowly, you will lose many visitors. As an SEO professional, you need to know that a slow website can harm you in many ways. One of the best examples is Walmart's improvement in sales after increasing their site speed.

Their analysis found that visitors who converted had received pages that loaded twice as fast as the pages served to those who didn't convert. In other words, page speed directly affects clients' purchasing decisions. At the end of their speed optimization, the report looked like this:

For every one second of improvement, conversions increased by up to 2%.
For every 100 ms of improvement, revenue grew by up to 1%.

Website speed

The three critical factors of a speed test are:

Time to first byte (TTFB)
Fully loaded page
First meaningful text

It's clear from the name that a fully loaded page is one where every element on the page (text, images, videos, animations, etc.) has finished loading. The page will only load fast when all of its elements, together, can load fast.

TTFB (Time to First Byte)

As the name suggests, TTFB is the time the browser takes to receive the first byte of information from the server, usually expressed in milliseconds. Put differently, it is the time the server needs to receive and process the request plus the time it needs to start delivering the data. Google calls this 'waiting' in its own reports.

It happens in 3 steps:

1. Requesting the server: when a visitor enters your website, the web browser makes an HTTP request to the web server to fetch data. There can be a delay here depending on the server's distance from the requesting location.

2. Server processing: the server receives the request and processes it.

3. Responding: the data, starting with the first byte, is sent to the browser.
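The three steps above can be measured roughly from code. Here is a minimal sketch in Python, using only the standard library; the URL in the comment is a placeholder. Note that the elapsed time includes DNS and connection overhead, so it slightly overstates pure TTFB.

```python
import time
from urllib.request import urlopen

def measure_ttfb_ms(url: str) -> float:
    """Rough TTFB estimate: elapsed time from issuing the request
    until the first byte of the response body can be read."""
    start = time.perf_counter()
    with urlopen(url) as response:
        response.read(1)  # pull just the first byte
    return (time.perf_counter() - start) * 1000.0

# Example with a placeholder URL:
# print(f"TTFB: {measure_ttfb_ms('https://example.com/'):.0f} ms")
```

Browser devtools and tools like PageSpeed Insights report TTFB more precisely, but a small script like this is handy for quick before/after comparisons on your own server.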
A slow TTFB has two clear disadvantages:

1. The number of visitors to your site will be low.
2. Search engines will rank your website poorly because of the low speed.

So, what TTFB is good, then? According to Google, a website shouldn't have a TTFB of more than 200 milliseconds. A long TTFB will not only hurt your website but also increase pogo-sticking: users return to the search results irritated because your site took too long to load, and visit your competitors' pages instead.

There can be many reasons for a slow TTFB. Some of them include:

1. A large number of heavy elements that decrease your website speed.
2. Harmful or oversized JavaScript: JavaScript elements usually take much longer to load. To fix this, optimize your JavaScript code and minimize how much of it your website uses.
3. Oversized images: images that are not in modern formats load poorly.
4. A large number of HTTP requests: often a result of poor coding. When you open a page, the browser sends HTTP requests to access the website's content, and these requests can be slow if your website is poorly coded.
5. Rich content: videos, audio, and heavy animations can decrease your website speed dramatically. You can reduce or remove them as you see fit.
6. Poor maintenance: it's essential to manage and audit your website's code regularly to keep the site working properly.

Fully loaded page

As the phrase says, a fully loaded page means that every element of your page has loaded fully and efficiently. Otherwise, your website will rank poorly.

First meaningful text

The first meaningful text, a.k.a. first meaningful paint, is the point at which the primary content has loaded. As discussed above for TTFB, users usually wait only 1-2 seconds before they move on to another website.
After all, they are spoiled for choice, with so much content out there. However, there are a few clever ways to keep users comfortable while the page loads. You can create small distractions for visitors during loading, such as a small blurred image that makes them feel the actual picture is loading and keeps the page from appearing blank. You can also use thumbnails as placeholders while the full image is still loading.

Let us now look at how to improve website speed!

Page and file size: your website may contain many heavy files, including videos, high-resolution images, graphics, and animations, which will make it slow. The same can happen when the code on your site is poorly optimized. Also, when a page loads, the browser sends an HTTP request for every element on it, so if there is a large number of these elements, you need to reduce or optimize them. Otherwise, your visitors will get irritated and jump to another website.

Minify code: if you use the core web technologies HTML, JavaScript, and CSS, the simplest optimization you can do is to minify your code. Minification strips whitespace and other unnecessary characters, reducing file sizes and improving loading speed.

Hosting service: if you want your website to rank well, you must choose a fast and reliable hosting provider. There are three types of hosting:

· Shared hosting: the cheapest option and, in 90% of cases, a reliable choice.
· Dedicated server: the best choice for large companies and corporations. You get a dedicated machine that hosts only your website.
· VPS hosting: a hybrid of shared and dedicated hosting. You get your own virtual server, but you share the physical machine's resources with other websites.

Browser caching: browser caching is a good practice if you want to improve your website's speed.
It allows copies of your website's files to be stored in the visitor's browser, reducing the work the server has to do when the browser requests the same content again. TTFB decreases because the server uses fewer resources while the page loads. The first time a client requests content, the server delivers it in full; after that, cached data can be served, which makes delivery much faster. This is why a page takes more time on the first visit and less time on every subsequent one.

CDN (Content Delivery Network): a CDN dramatically reduces the delivery time of heavyweight elements like pictures and videos by storing them on the server nearest to the visitor, which delivers them on demand. It helps cut down the number of requests made to your origin server while loading your page, reducing bandwidth usage by up to 60%.

Page speed test: after applying all of the previous tips, it's time to test your website using online tools. There are many tools for testing website speed, but the most popular and easiest one is Google PageSpeed Insights. If you want a second opinion, you can use GTmetrix, which is also an excellent tool.

Sitemaps: sitemaps are XML files whose primary purpose is to help Google's crawlers better "understand" the structure of your website and its content. XML sitemaps are generally safe, but they can also be abused in certain SEO scams that we won't cover in this article. An XML sitemap becomes even more important for websites that have little internal linking or few backlinks, and it is also useful if you add new pages regularly. There is also an HTML sitemap, which is designed for your human readers. Having an XML sitemap is always a significant advantage because it lists the URLs of your pages in an organized, systematic manner.
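For illustration, here is what a minimal XML sitemap looks like (the domain, paths, and dates are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>
```

Each <url> entry lists one page, and <lastmod> tells crawlers when it last changed. You then submit the sitemap's URL to Google, for example through Search Console.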
You can also edit your XML sitemap, and if you don't want Google to index some of your pages, you can use the meta robots tag.

Robots.txt

The robots.txt file is the file we use when we don't want crawlers/spiders to crawl specific pages or posts. To be precise, these instructions are honored by legitimate web crawlers, not by malicious bots that invade your site.

To better understand how crawlers work, we need to know what a crawl budget is. The crawl budget is the limit that determines how many of your pages/posts a crawler will crawl. If you go above that limit, the budget is exhausted and Google will not index your website entirely. There are a few situations where you need to be concerned about your crawl budget:

· When you have a big website: many pages can make things difficult for the crawler.
· Extra budget for new pages: every additional page requires more crawl budget if you want it indexed properly.
· Redirects: every redirect costs you some amount of crawl budget.

To summarize, you should have crawlers visit only the pages that are important for your website and not waste the budget on less important ones.

To understand robots.txt files, we must know their elements:

User-agent: a user agent is a name for any client on the internet. For bots, it tells you what type of crawler is crawling your website. Inside the robots.txt file, you can provide different instructions for different bot user agents.

Disallow: a command inside a robots.txt file that specifies which pages you don't want crawled. If you leave it blank, you indicate that you want all pages to be crawled. You would typically list pages with sensitive information, such as login/register forms or other backend pages.

Noindex and nofollow: noindex and nofollow are tricky for beginners, but don't worry; we'll try to simplify things. When you use a disallow rule, it doesn't necessarily mean that your page will not be indexed. Confused?
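Before untangling that, here is a small example of how these elements fit together in a robots.txt file (the paths and domain are illustrative):

```
# Applies to all crawlers
User-agent: *
Disallow: /login/
Disallow: /admin/

# A rule for one specific crawler
User-agent: Googlebot-Image
Disallow: /private-images/

# Pointing crawlers at your sitemap is also common
Sitemap: https://www.example.com/sitemap.xml
```

A blank Disallow line (or none at all) under a User-agent means that agent may crawl everything.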
It sounds a bit confusing, so we will explain further! Let's go through the directives one by one:

Disallow: a simple instruction telling bots not to crawl the listed pages; it cannot, however, keep those pages from being indexed.

Noindex: a meta tag in the HTML code (which can also be returned as an HTTP response header) that keeps your page from being indexed.

Nofollow: asks bots not to follow the links on a page, so no ranking credit is passed through them.

Noindex meta tag: the noindex meta tag sits in the head section of your HTML code and keeps the page from being indexed. As a result, the page will not appear in the search results. If you apply a noindex tag to a page, that page shouldn't also be listed under Disallow in the robots.txt file. If you don't want your page to be indexed, use:

<meta name="robots" content="noindex">

But there's a tricky part! You could also try to include noindex rules in your robots.txt file, applying the same effect to dozens of pages in one go. Unfortunately, Google no longer supports this method (support for noindex in robots.txt was dropped in 2019).

If you disallow a page, it will not get crawled. However, imagine that you have an internal link to that page's URL from another page on your website, or that an external site links to it: the search engine may find the page this way and end up indexing the URL anyway. Also, with the disallow directive, Google might rank your webpage lower.

So, if you want a guarantee that specific pages are not indexed and therefore won't show up in the search results, don't rely on the disallow rule in robots.txt. Adding the noindex meta tag to the HTML of your page, on the other hand, lets search engines credit the page for its links and content while still keeping it out of the index.

For a brand-new website, the disallow directive can be acceptable, since for a while the site won't have any links pointing at it anyway.
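The HTTP-response variant of noindex mentioned above uses the X-Robots-Tag header, which is handy for non-HTML files such as PDFs, where there is no head section to put a meta tag in. A sketch of such a response:

```
HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, nofollow
```

The directive names match the meta tag's values; the header is typically set in your server configuration.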
So, as we discussed: if you want search engines to crawl your page but you don't want it showing up in the search results, use the noindex tag in the head section, then apply a nofollow attribute to any link you don't want to vouch for. Search engines will follow the good links and leave the bad ones alone.

Page experience

Page experience is a set of signals, including the Core Web Vitals, that measure the user experience of browsing your website. Some of the most critical metrics cover performance, interactivity, and visual stability.

Mobile-friendliness

Google launched mobile-first indexing on March 26, 2018, after a year and a half of careful testing and experiments, and rolled it out to all websites from July 1, 2019. Before its introduction, crawling and indexing were based on the desktop version of a website: if the content was good on desktop, the website was considered good. It later became clear that not every website that works perfectly on desktop runs well on mobile phones. This led to mobile-first indexing, where the crawler first checks the mobile version's content and indexes the site accordingly, flipping the situation drastically. The change was made to provide a better experience to people using mobile phones.

So, if you want your website to be mobile-friendly, it should adapt to a variety of smartphones. If you want it to rank highly, it should reach its customers from any device without sacrificing the user experience.

How to make the website mobile-friendly?

Mobile optimization begins with choosing the correct layout for your website. Not every element that works on your desktop is suitable for mobile too. Let us understand these factors one by one:

Highly responsive design: every website that is optimized for desktop should be optimized for smartphones too.
By optimization, we mean a content layout that can easily reflow according to the phone's screen size.

Dynamic design: when you make your website mobile-optimized, it needs to change according to the smartphone it is viewed on. This process of change is called dynamic design, and the only way to implement it is by changing your HTML/CSS/JS code.

Keeping different URLs: this is when you work on a subdomain, creating a separate page with a different URL for mobile visitors. It can cause technical SEO issues.

How to test mobile-friendliness?

If you want to check the mobile-friendliness of your website, you can use the Mobile-Friendly Test, Google's official tool. Some tips:

· Make your font large on both the desktop and mobile versions.
· Leave enough white space between the different elements of the website. This prevents users from accidentally tapping one element when they wanted another.
· Design your mobile version to work in both the vertical and horizontal orientations of the smartphone screen.
· Mobile phone users want instant results, because a phone is a handy tool that can be used anywhere. That is why you should keep the most important information at the top of the page: users might not scroll down.

Safe browsing

Your site must be completely free of malicious code, which can spread viruses across your visitors' devices. A malicious website can deceive users into believing its content is harmless, engaging them in activities that look safe but are deceptive. As a website owner, keep a few things in mind:

Never cheat your visitors: never tell your visitors to download files if you're not sure whether they are malicious.

Don't mislead with advertising: don't make your website out to be something it's not! If visiting it will install malware or any other software onto a user's device, clearly say so.
Don't trick visitors into coming to your site only to be made fools of.

Don't take others' brand names: endorse a brand's name only if you have permission. Don't simply use someone else's business name to benefit from it.

HTTPS

If you want to rank in Google's search results, your website must have a security (SSL) certificate providing the HTTPS (hypertext transfer protocol secure) scheme. HTTPS allows safe communication across any given network.

To understand HTTPS better, let's first define HTTP. HTTP (hypertext transfer protocol) is the application-layer protocol used in distributed systems like the World Wide Web for client-server communication. HTTP and HTTPS are the same protocol, but HTTPS adds SSL/TLS encryption.

What are the benefits of HTTPS? In its early days, the protocol was mainly used to protect passwords, but today HTTPS is a must if you want your website to rank. Whatever you browse or submit over HTTPS is encrypted in transit, so other users on the network cannot read it. All websites using HTTPS have an SSL certificate, and when you visit a website, the browser automatically checks whether that certificate is valid. Also, with HTTPS, the service provider cannot trace your activity inside the website.

Core Web Vitals

Core Web Vitals are a set of Google's most important page-experience ranking factors. They consist of three metrics used to gauge a page's experience:

Largest contentful paint

The largest contentful paint (LCP) is the time the website takes to render its largest content element after a user has clicked the website's link. A good LCP is around 1.5-2.5 s after clicking on the page. Remember, TTFB and LCP are very different: TTFB is the lower-level page-speed measure, and it directly affects LCP. TTFB is the time required to deliver the first byte of a web page.
LCP is the time needed for a user to actually see something valuable on the page.

First input delay

First input delay (FID) measures interactivity: the delay between a visitor's first interaction with your website and the moment the browser can respond to it. Even after the LCP element has loaded, you can't be entirely sure users can already interact with your website. Interactions include entering an e-mail address, clicking a call-to-action button, filling in a form, and so on. FID doesn't matter equally for all websites: if your page only contains text, the only interaction is scrolling, so FID is not essential there; but for a website that asks for user actions, FID is very important. A good FID score is no more than 100 ms. Between 100 and 300 ms, it needs to get better; above 300 ms, it is poor and needs work.

Cumulative layout shift (CLS)

Cumulative layout shift measures the visual stability of a web page while it loads. Some pages shift around a bit before they settle, and this instability at loading time varies from page to page. If the elements of your page don't load predictably, you will have a bad CLS score. CLS is a unitless score: up to 0.1 is good, between 0.1 and 0.25 calls for improvement, and above 0.25 can destroy your page experience.

No unnecessary elements

As we previously mentioned, more than 50% of Google's searches come from mobile phones, so website owners and Google needed to provide users with a better mobile experience. Elements like large banners, unnecessary pop-ups, and overlays are major culprits in ruining a page's experience: they distract from the content of the website. Unnecessary elements can also be an overlay with a dark background that hides the actual content, dialogue boxes asking for login credentials, GDPR cookie pop-ups, and so on. Google has decided to penalize such websites by decreasing their rank.
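One simple, widely used fix for the layout shifts described under CLS: give images explicit dimensions so the browser can reserve their space before they download. A minimal sketch (the file name is a placeholder):

```html
<!-- width/height let the browser reserve the image's box up front,
     so surrounding content doesn't jump when the file arrives -->
<img src="hero.webp" width="1200" height="630" alt="Product hero image">
```

The same idea applies to ads, embeds, and iframes: reserve their slots in the layout so late-loading content doesn't push the page around.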