In the process of building our most recent Thought Space redesign, we attempted a new trick to SEO a one-page website. I planned on having panels on the single-page home screen. I made separate HTML files for each panel, complete with our site's header and footer, and linked those files together like a normal site (home, contact, etc…). The single page's nav scrolled to other parts of the screen instead of linking out like the other files did. Essentially, we had a normal site structure as well as a single page that combined all of the other pages into a series of panels. No content was changed at all between the versions, simply the structure.
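A minimal sketch of the dual structure described above (file and section names are illustrative, not taken from the actual site):

```html
<!-- index.html: single-page version; the nav scrolls to panels on the same page -->
<nav>
  <a href="#home">Home</a>
  <a href="#contact">Contact</a>
</nav>
<section id="home"><!-- home panel content --></section>
<section id="contact"><!-- contact panel content --></section>

<!-- contact.html (and each other per-panel file): nav links out like a normal site -->
<nav>
  <a href="index.html">Home</a>
  <a href="contact.html">Contact</a>
</nav>
```

Both versions carry the same content; only the way the navigation resolves differs.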
Unfortunately, things did not work as well as planned. We soon realized that Google was indexing our main single-scroller page (which ultimately led to this blog post). All hope was not lost on the previous work, as it still serves as a no-JS version of the site. However, this led me to one strong realization.
For example, my main page got indexed because I placed a redirect on page load, which required no user interaction to trigger. Googlebot happily followed along straight to the single-page version and indexed it instead. The entire plan for sitelinks with Google had been shattered. I also happened to notice that some of the content on my home page wasn't being indexed by Google. After looking into it further, the obvious became apparent:
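The load-time redirect in question can be as small as the sketch below (the target file name is illustrative). Because it fires as soon as the page loads, with no click or other trigger needed, Googlebot executes it just like a browser would:

```html
<!-- index.html (multi-page, no-JS version) -->
<script>
  // Runs immediately on page load; Googlebot follows it too,
  // so the crawler ends up at the single-page version instead.
  window.location.replace('single.html');
</script>
```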
Interactions involving clicks, hovers, or other user actions are not processed by Googlebot.
Comments On This Post
You could also perform the feature-based forwarding in the opposite way: Set a delayed meta-redirect for non-JS users, and employ JS to scrub the redirect from the page on-load.
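A sketch of that suggestion, assuming a meta refresh with a short delay (the element id, delay, and target file name are all illustrative):

```html
<head>
  <!-- Fires for non-JS visitors after 3 seconds -->
  <meta id="nojs-redirect" http-equiv="refresh" content="3; url=nojs.html">
  <script>
    // JS-capable clients strip the redirect before it can fire.
    document.getElementById('nojs-redirect').remove();
  </script>
</head>
```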
Good suggestion! I never thought of it that way. Regardless, this method still probably would not spoof Googlebot, as the detection would occur and be processed on page load, with no triggers required to set it off. It appears this method of page-redirection spoofing may never work.
Great article Jareth! Maybe it's worth repeating that experiment from time to time? Did you also check whether Googlebot executes Ajax?
I definitely agree, but I’ve been too busy to retest this. Feel free to test it again and post your findings!
And regarding Ajax, I did not test it; however, there is a good bit of documentation on how to set up Ajax so that it can be properly read by Google. Try searching around a bit and I'm sure you'll turn something up.
Ok, we will see… Maybe I will set up some test cases myself and get back to you to share my findings 😉 Regarding Ajax: many online shops use Ajax for their navigation, and from an SEO perspective there are often just too many links. Sometimes only part of the links are written in the code, too; the rest are loaded in via Ajax. If Googlebot is processing Ajax, it will probably count the whole number of links, and that might affect the "flow of link juice" to important pages… That's why I want to find out whether Googlebot executes Ajax ;)