How to Debug JavaScript SEO Issues in Chrome
In this guide, I explain how to use a very simple manual technique to efficiently diagnose rendering issues on a website.
In this case study, I identified why Google was reporting URLs blocked in the robots.txt file in the “Valid with warnings” report and explain how I resolved the problem.
It has been an interesting year and I wanted to make sure I captured some quick tips for anyone starting out.
A 2020-21 business report and the lessons learned in my first year of being self-employed.
In this case study, I found one of my own blog posts had not been indexed by Google. So, naturally, being a technical SEO specialist, I investigated it.
In this crawl budget experiment, I blocked two resources, one returning a 404 and the other a 502, which Googlebot spent a lot of time crawling each day.
I created a way to replicate Googlebot to debug crawling issues, which I call the Chromebot technique.
I did a test in April 2019 to see if rel=“next” and rel=“prev” tags were used in crawling to discover new content.
I did a quick test in 2018 to find out if an HTML canonical tag and the rel=“canonical” HTTP header are analysed at the same speed in Google’s index.
I am happy to announce that I am now a Freelance Technical SEO Consultant.