
SEO Technical Error: 13 Common Errors And Effective Ways To Fix

Over the course of my SEO work, I have run countless technical audits on project websites. The GTV SEO technical team and I have compiled an overview of the technical SEO errors that SEOers face most often.

In your opinion:

  • What are the most common mistakes SEOers make?
  • Which factors have the greatest impact when fixed well?

In today’s article, I will walk you through all of these technical errors, and show you how to identify and fix them. I hope it helps you learn SEO and do SEO more effectively.

So, let’s dive right in ^^

Common on-page technical errors

Website speed

According to Google, website speed is a ranking factor on search engines. It directly affects how long users stay on your site.

Ideally, a website should load within two to three seconds. Users will not wait for your site to load when there are countless other websites on the Internet. A site that loads too slowly can hurt revenue. Why?

Simply put, users will not stay on your website long enough to learn about your products or services. Obviously, the likelihood of a purchase or service sign-up is then low.

Common mistakes in optimizing website speed

  • Images are not properly optimized.
  • Web code is not written to standard.
  • Too many plugins are installed.
  • JavaScript and CSS are heavy.

Even the GTV SEO website needs speed optimization

How to find errors that slow down your website

  • Test your website with Google PageSpeed Insights, GTMetrix, or Pingdom

How to optimize website loading speed?

  • Hire a dedicated specialist with experience in this area
  • Set up a staging domain so performance work does not disrupt the live site
  • If possible, make sure you have upgraded to PHP 7 if you use WordPress or another PHP CMS. This has a big impact on website speed.
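To make the audit concrete, here is a minimal sketch of a resource-budget check. The budgets and file names are purely illustrative assumptions, not figures from this article:

```python
# Hypothetical sketch: flag page resources that blow a size budget.
BUDGETS = {"image": 200_000, "js": 150_000, "css": 50_000}  # bytes, illustrative

def oversized(resources):
    """Return resources whose size exceeds the budget for their type."""
    return [
        (name, kind, size)
        for name, kind, size in resources
        if size > BUDGETS.get(kind, float("inf"))
    ]

page = [
    ("hero.jpg", "image", 850_000),   # unoptimized image
    ("logo.png", "image", 18_000),
    ("bundle.js", "js", 400_000),     # heavy JavaScript
    ("style.css", "css", 30_000),
]
for name, kind, size in oversized(page):
    print(f"{name}: {size:,} bytes exceeds the {kind} budget")
```

In a real audit, the resource list would come from your crawler or browser devtools rather than being hard-coded.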

Unoptimized user experience (UX) on mobile

The truth about mobile UX optimization

During the ranking process, the first step of Google’s algorithm is to review and evaluate your website via its mobile version.

That does not mean you should neglect or strip down the user experience on desktop. Google wants to ensure that, whatever changes occur, users of a responsive site are not affected too much.

How to check your mobile version?

  • Use the Mobile-Friendly Test tool to see whether your website is compatible with mobile viewers.
  • Check whether Googlebot Smartphone can crawl your website. Note that the smartphone crawler does not render every web form.
  • Does your website adapt to different devices? If your site does not work on mobile, find a fix right away.
  • Is there any unusable content on your site? Check whether all content loads normally. In general, make sure you fully test every page on a mobile device.

How to fix website errors on mobile phones?

  • Understand how mobile page loads affect your server.
  • Focus on building pages on mobile phones that are impressive at first glance.

Google prefers responsive sites and treats them as its preferred option for serving mobile pages. If you currently run a separate mobile subdomain, such as m.yourdomain.com, consider the potential impact of the extra crawling on your server.

  • Choose a template update that suits the website interface; a plugin alone is not enough. Ask your web developers to help you find such a template.
  • Set breakpoints for multiple mobile screen sizes; the usual fixed baseline width is 320px, matching the iPhone screen.
  • Test on iPhone as well as smartphones running Android.
  • If any content relies on Flash or other proprietary systems that do not work in mobile browsers, consider switching to HTML5 for easier display on mobile. Google Web Designer lets you recreate a Flash file in HTML.
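The breakpoint advice above can be sketched as markup: a viewport meta tag plus a media query down to 320px. The class name here is a hypothetical illustration, not from the article:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Illustrative breakpoint: narrow phones down to 320px wide */
  @media (max-width: 320px) {
    .sidebar { display: none; }  /* hypothetical class, hidden on tiny screens */
  }
</style>
```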

URL structure problem

As your site grows, it is easy to lose track of the URL structure and hierarchy. Poor structure makes it difficult for both users and bots to navigate. This will negatively impact your rankings.

  • Problems with the site and hierarchy structure
  • Not using a proper directory and sub-directory structure
  • URLs with special characters or capital letters, or that are not useful for user searches.

How to find out about URL structure errors?

  • 404 errors, 302 redirects, problems with XML sitemaps are all signs that a website needs a review of its structure.
  • Conduct full information collection on the website (using SiteBulb, DeepCrawl or Screaming Frog) and manually review quality issues
  • Check the Google Search Console report (Crawl > Crawl Errors)
  • Test users – ask people to find content on your website or make test purchases – use UX testing service to record their experience

How to fix URL structure error?

  • Plan your site hierarchy – I encourage you to use a parent-child directory structure
  • Ensure that all content is placed in appropriate directories or subfolders
  • Make sure your URL is easy to read and makes sense
  • Delete or merge any content that ranks for the same keyword
  • Try to limit the number of subfolders to no more than three levels
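The rules above (readable lowercase URLs, no special characters, at most three subfolder levels) can be sketched as a simple checker. The function and its rules are an illustrative sketch, not a complete audit:

```python
import re
from urllib.parse import urlparse

def url_issues(url):
    """Flag common URL-structure problems (rules are illustrative)."""
    issues = []
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    if len(segments) > 3:
        issues.append("more than three subfolder levels")
    if any(c.isupper() for c in path):
        issues.append("contains capital letters")
    if re.search(r"[^a-zA-Z0-9/\-_.]", path):
        issues.append("contains special characters")
    return issues

# A messy URL trips all three rules; a clean one trips none.
print(url_issues("https://example.com/Blog/2019/news/seo/post%20one"))
print(url_issues("https://example.com/blog/seo/post-one"))
```

Running this over a Screaming Frog URL export gives a quick shortlist of pages whose structure needs attention.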

It may sound a bit confusing, but I guarantee that an effective on-page and URL optimization strategy will serve you well in 2019 and the years to come.

The site contains too much thin content

The truth is:

Google only wants to rank for pages with in-depth content, providing lots of valuable information useful to users.

So, do not focus too much on writing content purely for SEO purposes; write for users, not just for Google.

A page that has too much poor-quality content can negatively affect your SEO for a number of reasons:

  • Content that does not meet user needs can reduce conversion rates and reach customers.
  • Google’s algorithms place great emphasis on the quality of the content, the reliability and the relevance of the page
  • Too much poor-quality content will reduce the crawl rate of search engines, index rates and web traffic.
  • Tip: instead of writing a separate piece of content for each keyword, group related keywords on the same topic into one more detailed article.

An example of Screaming Frog analysis on thin content pages

How to find thin content errors?

  • Scan through the website to find pages with fewer than 500 words.
  • Check Google Search Console for manual notifications from Google.
  • Keywords used in your content are not ranking, or rankings show signs of decline
  • Check the bounce rate and the time users spend on the website – the higher the bounce rate, the more likely it is that there is poor quality.

How to fix thin content error?

  • Combine multiple keywords on the same topic into one article instead of writing a separate article for each (choose the number yourself, but I think around 5-6 keywords is fine)
  • Focus on the pages whose content is most likely to engage users – add video or audio, infographics, or images – and if you cannot produce these yourself, look for help on Upwork, Fiverr, or PPH.
  • Put user needs first: find out what they want, then create content that meets that need.
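A thin-content scan like the 500-word check above can be sketched in a few lines of Python. The sample page is invented for illustration; real pages would come from a crawl:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.parts = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def word_count(html):
    """Count words of visible text in an HTML page."""
    p = TextExtractor()
    p.feed(html)
    return len(re.findall(r"\w+", " ".join(p.parts)))

page = "<html><body><h1>Short post</h1><p>Only a few words here.</p></body></html>"
print(word_count(page), "words; thin?", word_count(page) < 500)
```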

Technical error: unoptimized meta descriptions

The meta description is one of the factors that determines the rate at which users click on your article.

There are 2 cases:

  • If you do not write a meta description, Google will automatically pull arbitrary content from the article to fill in the one you have not optimized.
  • If you write one that is too long, it will not display in full on the search results page.

Of course, you want to optimize your meta description as much as possible.

  • A meta description must summarize the main content of the article. (This part is a bit tricky, isn’t it? It’s easy to write long, hard to write short.)
  • Keep it to about 120 characters so it fits both desktop and mobile. Faster, right?
  • A meta description does not necessarily need to be crammed with target keywords.

How to recognize meta description errors?

  • Use Screaming Frog to check meta description character counts and find articles missing meta descriptions across the whole website.
  • Check whether each meta description is stuffed with too many keywords, or, at the other extreme, is too plain.

Since you will have a list of every article’s meta description plus its target keyword, one glance is enough to spot the problems ^^

How to fix meta description quickly?

  • Write the full meta description before publishing the article.
  • Add meta description for all missing articles.

Note: Meta description of each article must contain a maximum of 120 characters.
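A quick way to apply the 120-character rule is a small script like the sketch below. The regex-based parsing is a simplification for illustration; real pages are better handled with a proper HTML parser:

```python
import re

MAX_LEN = 120  # the limit recommended in this article

def check_meta_description(html):
    """Return (status, length) for a page's meta description -- a simple sketch."""
    m = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.IGNORECASE | re.DOTALL,
    )
    if not m:
        return ("missing", 0)
    desc = m.group(1).strip()
    if len(desc) > MAX_LEN:
        return ("too long", len(desc))
    return ("ok", len(desc))

html = '<head><meta name="description" content="13 common technical SEO errors and how to fix them."></head>'
print(check_meta_description(html))
```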

Error when not optimizing H1 / Title

It can be said that the H1/Title is the most important content for attracting users to your website. The title displays right on the Google search results page, and the H1 sits in the “prime location” – the first line of the article.

Surely you understand the importance of H1 / Title already?

Some common errors

  • H1/Title is too long (H1 > 70 characters, title > 65 characters), or does not contain the main keyword and LSI keywords.
  • H1 and Title overlap
  • Missing H1 or H1 is not placed at the beginning of the article.
  • The title of some blog posts on the website is duplicated.

How to detect H1 / Title errors?

In fact, you can use the versatile tool Screaming Frog to check the majority of onpage errors, such as H1 / Title suboptimal errors.

Or use the query “allintitle: title name” to check whether your existing titles are identical to the titles of other articles on the web.

Ex: Type allintitle: Detail 7 simple steps to write attractive SEO-standard 2019 articles to check whether that title is duplicated or not.

The formula to check whether a title is duplicated or not

How to fix H1 / Title errors?

Corrections depend on which mistakes you are making when optimizing the H1 and title. It is actually very simple!

  • Based on the Screaming Frog report, you can see which articles are missing an H1 or have an H1 identical to the title, and adjust them.
  • Insert the main keyword + LSI keywords into the H1 and title.
  • Mind the character limits for the H1 and the article title.
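The checks above (missing H1, H1 duplicating the title, character limits) can be sketched as follows. The limits follow this article; the sample HTML is invented, and real audits would use Screaming Frog:

```python
import re

H1_MAX, TITLE_MAX = 70, 65  # limits suggested in the article

def audit_headings(html):
    """Sketch of an H1/title audit on a single page's HTML."""
    issues = []
    titles = re.findall(r"<title>(.*?)</title>", html, re.S | re.I)
    h1s = re.findall(r"<h1[^>]*>(.*?)</h1>", html, re.S | re.I)
    if not h1s:
        issues.append("missing H1")
    if titles and h1s and titles[0].strip() == h1s[0].strip():
        issues.append("H1 duplicates title")
    if titles and len(titles[0]) > TITLE_MAX:
        issues.append("title too long")
    if h1s and len(h1s[0]) > H1_MAX:
        issues.append("H1 too long")
    return issues

html = "<title>SEO Technical Errors</title><h1>SEO Technical Errors</h1>"
print(audit_headings(html))
```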

Too many pieces of unrelated content

In addition to fixing thin pages, you also need to make sure the content on them is interlinked. Irrelevant pages not only fail to help users but also dilute the pages that are performing best on your website.

Moreover, who wouldn’t want their website to serve Google only its best content, to increase reliability, authority, and SEO power?

A few common cases

  • Creating pages that get little engagement
  • Letting search engines crawl pages not meant for SEO.

How to find these errors?

  • Review your content strategies. Focus on creating more quality pages instead of trying to create lots of pages.
  • Check the statistics collected from Google and see which pages are being indexed and crawled

How to fix irrelevant content errors?

  • You do not need to be too rigid about targets when planning content. Instead of forcing out 6 posts as planned, focus more on adding value to the content.
  • For pages you do not want Google to rank, block them in your robots.txt file (or, better, noindex them). This way, Google only sees the best side of your website.

Not taking advantage of internal links to build a linking network

Internal links help distribute link equity across a website. Pages with little or irrelevant content usually have fewer cross-links than pages with plenty of quality content.

Cross-linked posts help Google better understand your site. In technical SEO terms, the value these links bring is helping you build a hierarchical website while improving keyword rankings: one keyword moving up can pull other keywords up with it.

How to find these errors?

  • For the pages you want to top, check which internal pages link to them. You can use Google Analytics tool to check the internal links of the page.
  • Use Screaming Frog to conduct data collection inlinks.
  • Ask yourself whether you actively link to other pages on your website.
  • Do you add internal nofollow links through a plugin that applies to all links? Check the link markup in the browser by viewing the source code.
  • Check whether you reuse the same small set of anchor texts and links across your site.

How to fix these errors?

  • For the pages you want to rank, add links from relevant content that already exists on other pages of the site; then insert those internal links into the posts.
  • Use data from the Screaming Frog crawl to find more opportunities for building internal links.
  • Do not cram in links and anchor keywords. Keep it natural and follow a sensible sequence
  • Check the nofollow link rules in any plugin you are using to manage links

Common off-page technical errors

Not managing 404 errors closely

This is an error that e-commerce sites often encounter.

Specifically, when a product is discontinued or expires, its page is easily forgotten and ends up returning a 404 error.

Error 404: “Something is wrong. The page you are looking for is not available.”

While 404 errors may limit crawling, don’t worry too much: by themselves, they will not adversely affect the SEO process.

However, 404 pages will actually have problems when they:

  • Receive large amounts of internal traffic or organic search traffic.
  • Have external links pointing to them.
  • Have internal links pointing to them.
  • Make up a large share of the pages on the website.
  • Are shared on social networks or on other websites.

The way to fix the above problems is …

You should set up a 301 redirect from the deleted page to another relevant page on your website.

This helps preserve some of the page’s SEO equity and ensures users can navigate seamlessly.

A large number of 404 errors on a website

How to find 404 pages?

  • Conduct a crawl of the entire website (via SiteBulb, DeepCrawl or Screaming Frog) to find 404 pages.
  • Check the Google Search Console report (formerly Google Webmaster Tools)

How to fix 404 errors?

  • Analyze a list of 404 errors on your website.
  • Cross-check the URLs in Google Analytics to see which pages still receive traffic.
  • Cross-check URLs with Google Search Console to see which pages receive links from external websites.
  • For high value pages, identify an existing page on your site that is most relevant to the deleted page.
  • Set 301 server-side redirects from the 404 page to the current page you specified. If you plan to use the 4XX page, make sure that the site actually works so that it does not affect the user experience.
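The cross-checking steps above can be sketched as a small script. All the data here is hypothetical; in practice it would come from your crawler, Google Analytics, and Search Console exports:

```python
# Hypothetical data: a crawler's 404 list, GA sessions, GSC backlink counts.
not_found = ["/old-product", "/retired-page", "/typo-page"]
traffic   = {"/old-product": 1200, "/typo-page": 3}   # sessions
backlinks = {"/old-product": 14, "/retired-page": 2}  # external links

def redirect_candidates(pages, traffic, backlinks):
    """Rank 404 pages: those with traffic or backlinks deserve a 301 first."""
    scored = [
        (page, traffic.get(page, 0), backlinks.get(page, 0))
        for page in pages
    ]
    # keep pages with any value, sort by traffic then backlinks, highest first
    keep = [row for row in scored if row[1] or row[2]]
    return sorted(keep, key=lambda r: (r[1], r[2]), reverse=True)

for page, hits, links in redirect_candidates(not_found, traffic, backlinks):
    print(f"301 {page}  (traffic={hits}, backlinks={links})")
```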

Problems when migrating a website (redirects)

When you create a new website, new pages, or a new design, the technical problems that arise need to be solved quickly.

Some common mistakes

  • Using 302 (temporary) redirects instead of 301 (permanent) redirects. Google has recently said that a 302 redirect can pass SEO value, but based on the internal data we have collected, I feel that 301 redirects are still safer.

Many of you use a 302 to redirect HTTP to HTTPS – please check and fix this, otherwise the SEO of your whole site will suffer!

  • Setting up HTTPS incorrectly on the website. In particular, not redirecting your site’s HTTP version to HTTPS can cause duplicate-page problems
  • Not setting up 301 redirects from the old site to the new site. This problem usually occurs when plugins are used for 301 redirects; 301 redirects should always be set up through the website’s cPanel.
  • Leaving old tags from the staging domain on the live site (canonical tags, NOINDEX tags, … – tags that block pages on the staging domain from being indexed)
  • Staging domains getting indexed: the opposite of the case above. This happens when you fail to place tags on staging domains (or subdomains) to noindex them from the SERPs.
  • Creating a “redirect chain” while cleaning up the old website – in other words, failing to identify the pages that have already been redirected.
  • Not standardizing www vs non-www in the .htaccess file. If two or more versions exist when Google indexes your website, duplicate pages are likely to be indexed as well.

How to identify these errors

  • Conduct a full crawl on the website (using SiteBulb, DeepCrawl or Screaming Frog) to get the necessary data.

How to fix errors when moving the website

  • Check three times to make sure your 301 redirects are set up correctly.
  • Then check that your 301 and 302 redirects point to the correct pages.
  • Check canonical tags the same way and make sure they are placed in the correct position.
  • If you have to choose between canonicalizing a page and 301-redirecting it, the 301 redirect is clearly safer and more effective
  • Check your code to make sure you have removed all the NOINDEX tags. Do not skip the plugin options, because website developers may have coded NOINDEX into the templates.
  • Update the robots.txt file
  • Check and update the .htaccess file
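For an Apache server, the HTTPS and www rules above might look like the following .htaccess sketch. The domain is a placeholder, and you should test rules like these on staging before deploying:

```apache
# Illustrative .htaccess rules (example.com is a placeholder)
RewriteEngine On

# Force HTTPS with a permanent 301, not a temporary 302
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Canonicalize non-www to www so only one version gets indexed
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```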

Error of XML sitemap

The XML sitemap lists the URLs on your website that you want search engines to crawl and index. It also lets you add information about each page, such as:

  • When was it last updated?
  • How often does it change?
  • How important is it relative to the other URLs on the website?
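A minimal sitemap entry carrying those three fields might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/seo-technical-errors/</loc>
    <lastmod>2019-06-01</lastmod>      <!-- last update -->
    <changefreq>monthly</changefreq>   <!-- how often it changes -->
    <priority>0.8</priority>           <!-- relative importance -->
  </url>
</urlset>
```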

Although Google admits it ignores much of this information, proper optimization is still important, especially on large, structured websites.

Sitemaps are especially beneficial on sites where:

  • Some parts of the website cannot be reached through the browsable interface
  • Webmasters use rich content that search engines cannot process, such as Ajax, Silverlight, or Flash
  • The site is so large that web crawlers skip some recently updated content
  • A large number of pages are isolated or poorly linked together
  • Crawl budget is being wasted on unimportant pages – if so, apply noindex to them immediately

How to find these errors?

  • Submit your sitemap to Google Search Console.
  • If you implement SEO on Bing, remember to use Bing webmaster tools when submitting sitemaps!
  • Check sitemap errors through: Crawl -> Sitemaps -> Sitemap errors
  • Check the log files to see when the last time your sitemap was accessed

How to fix those errors?

  • Make sure your XML sitemap is connected to Google Search Console
  • Conduct server log analysis to understand how often Google crawls your sitemap. (There is a lot more I will recommend about server log files later.)
  • Google will show you the problems, with examples, so you can fix them accordingly.
  • If you are using a plugin to create your sitemap, make sure it is up to date and that the file it creates works correctly.
  • If you do not want to use Excel to check your server logs, you can use log analysis tools such as Logz.io, Graylog, SEOlyzer, or Loggly to see how your XML sitemaps are being used.

Error in robots.txt file

The robots.txt file governs how search engines access your site.

Many people believe that the robots.txt file is the main culprit blocking a website from being indexed.

However, most robots.txt problems actually come from not updating the file when migrating a website, or from entering the wrong syntax.

How to know when a robots.txt error occurs?

  • Check your website statistics
  • Check the Google Search Console report (Crawl > robots.txt Tester)

How to fix errors in robots.txt file?

  • Check the Google Search Console report. This will help validate your file
  • Make sure the pages/directories that you DO NOT want crawled are listed in the robots.txt file
  • Make sure you are not blocking any important resources (JS, CSS, etc.)
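A small, well-formed robots.txt might look like the sketch below. Every path here is illustrative; your own blocked and allowed directories will differ:

```
User-agent: *
# Block sections you do NOT want crawled (paths are illustrative)
Disallow: /staging/
Disallow: /cart/
# Do not block important resources such as JS and CSS
Allow: /wp-content/themes/
Allow: /wp-includes/js/
Sitemap: https://example.com/sitemap.xml
```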

Misuse of canonical tags

The canonical tag is part of the HTML that helps search engines decode duplicate content. If there are two similar pages, you can use this tag to notify search engines which pages you want to display on search results.

If your website runs on a CMS like WordPress or Shopify, you only need to use the plugin (preferably Yoast) to install canonical tags.

I have seen websites misuse canonical tags in ways like:

  • Use Canonical tags to point to unrelated pages.
  • Use Canonical tags to point to 404 pages.
  • Combine canonical tags together
  • E-commerce and “faceted navigation”
  • Letting the CMS create two versions of the same page.

It is important to tell search engines which version of a page is the preferred one; failing to do so can significantly hurt indexing and website ranking.

How to identify canonical tag errors?

  • Conduct data collection of entire website through DeepCrawl
  • Compare the “canonical link element” with the original URL to see which pages use canonical tags pointing to other pages

How to fix canonical tag errors

  • Check the pages to determine if canonical tags are pointing to the wrong page
  • In addition, you should also check the entire content to know more about similar content pages or find out if there are other pages that need canonical tags.
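For reference, a canonical tag is a single line in the page's <head>; the URL here is a placeholder:

```html
<!-- On the duplicate version (e.g. a filtered or printer-friendly page),
     point to the preferred URL; example.com is a placeholder -->
<link rel="canonical" href="https://example.com/red-shoes/">
```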

See also: Some reasons you are not top 3 google and solutions

The misuse of robots tags

Similar to the robots.txt file, robots tags can also be used in the page’s head code. This creates more potential problems, especially when robots directives exist both at the file level and on individual pages. In some cases, I have seen multiple robots tags appear on the same page.

This makes Google more prone to confusion and can become a barrier that prevents well-optimized pages from getting a chance to rank.

How to identify a problematic robots tag?

  • Check the source code in the browser to see if any robots tags have been added more than once.
  • Check the syntax and make sure you don’t confuse nofollow links with nofollow robot tags

How to fix robots tag errors?

  • Use Yoast SEO to manage robots tag behavior effectively.
  • You can also use plugins to control robot behavior
  • Make sure you review the templates where the robots tags are placed, following these steps:

Appearance > Themes > Editor > header.php

  • Add Nofollow directives to the robots.txt file so you do not have to hunt from one file to the next
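To keep the two kinds of “nofollow” straight, here is what each looks like in HTML (the URL is a placeholder):

```html
<!-- Page-level robots tag in <head>: one per page, with clear directives -->
<meta name="robots" content="noindex, follow">

<!-- Not to be confused with a nofollow on a single link: -->
<a href="https://example.com/untrusted/" rel="nofollow">example link</a>
```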

Not managing your crawl budget well

Google cannot crawl all the content on the internet at the same time. To save time, Googlebot allocates a crawl budget to each website depending on certain factors.

A site with higher authority gets a larger crawl budget, which means more of its content is crawled and indexed than on lower-authority websites (sites with fewer pages and fewer visits).

How to identify these errors?

  • Review your crawl stats in Google Search Console:

Go to Search Console > enter your domain name > select Crawl > select Crawl Stats

  • Use server logs to find out where Googlebot spends the most time on your site. This will tell you whether Googlebot is crawling the right pages.

How to fix these errors?

  • Minimize the number of errors on your site.
  • Noindex pages that you do not want Google to see.
  • Reduce redirect chains by finding all links that point to redirected pages and updating them to point to the final landing page.
  • Fix some of the other issues I discussed above to help increase your crawl budget or focus crawl budget on the right content in the long run.
  • For e-commerce websites in particular, handle the URL parameters used for faceted navigation carefully: if a parameter does not actually change the content on the page, there is no need for Google to crawl every variation.
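The redirect-chain cleanup above can be sketched as a small script. The redirect mapping is hypothetical; in practice it would come from a Screaming Frog redirect export:

```python
# Hypothetical redirect mapping from a crawl export.
redirects = {
    "/old-a": "/old-b",   # chain: /old-a -> /old-b -> /final
    "/old-b": "/final",
    "/old-c": "/final",
}

def final_target(url, redirects, max_hops=10):
    """Follow a redirect mapping to its final destination, avoiding loops."""
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

# Flatten: every source should point one hop straight to the final page.
flattened = {src: final_target(src, redirects) for src in redirects}
print(flattened)
```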

Through research and discussion, I have realized that many people, including senior SEOers, still make these 7 basic mistakes when implementing SEO. These mistakes can leave your website stuck forever on pages 2, 3, … and bring no conversion value for the business. Curious?

You think you are not one of them? Then take a glance!

But I am sure you will come back to this article whenever you run into problems. So why not read it now and avoid them? “Prevention is better than cure”, right? Read on!

7 Common mistakes in implementing SEO

Conclusion

In this article, I have introduced the common technical errors that every SEOer runs into at some point during the SEO process, together with ways to detect them and effective fixes for each case.

Hopefully, after reading this article, you will be aware of these technical errors, avoid unnecessary risks to your website and, above all, fix problems promptly and properly to keep your website running and growing.

Good luck!

Reference source and quoted image: From the Future

P/s: Perhaps you need a system of SEO knowledge from A to Z to start your own journey of conquering SEO in the shortest time. That is why GTV SEO is pleased to present the SEO Mastermind Online Course. Check it out now!

 

