I am just wondering: is there a way to hide only specific parts of a page from crawlers?
I know I can set "display: none;" on an element and then reveal it to the user with JS. That used to work, but crawlers are (or will be) smarter — Google's in particular — and can now execute JS.
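For reference, the hide-then-reveal trick described above looks roughly like this. This is a minimal sketch with an illustrative element name (`secret`); the object literal stands in for a real DOM element, which in a browser you would fetch with `document.getElementById("secret")`:

```javascript
// Stand-in for a DOM element that ships hidden via an inline style.
// In real markup: <div id="secret" style="display: none;">...</div>
const secret = { id: "secret", style: { display: "none" } };

// Script run on page load that flips the element visible for visitors.
// A JS-executing crawler runs this exact code too, which is why the
// content is not actually hidden from modern bots.
function revealOnLoad(el) {
  el.style.display = "block";
}

revealOnLoad(secret);
```

The point of the sketch is that nothing here distinguishes a human visitor from a crawler: whoever executes the script sees the content.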
So, is there any way to hide some parts of a page from robots, but not from users, going forward? Not the whole page, just parts.
PS #1: I am just curious; I have no reason to do it (yet?).
PS #2: Maybe it's possible with AJAX? But then again, there is no reason why content loaded that way would stay hidden from a smart, JS-executing crawler.
Thank you all,
Fundamentally, there is no significant difference between a robot and a user agent operated by a person.
Robots can execute JS. Robots can bypass CAPTCHAs (through image analysis, by feeding them to humans who solve them for money, or by various other techniques). Robots can add time delays to look less like machines. And so on.
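To see why this matters, here is a sketch of the obvious server-side alternative: only render the "hidden" section when the request's User-Agent does not look like a bot. The function names and the pattern list are illustrative (the Googlebot token is a real UA substring, but the list is nowhere near exhaustive), which is precisely why this approach is fragile — any bot that sends an ordinary browser User-Agent gets the content:

```javascript
// Illustrative, incomplete list of bot User-Agent patterns.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /crawler/i, /spider/i];

function looksLikeBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Render the page, omitting the sensitive section for suspected bots.
function renderPage(userAgent) {
  const parts = ["<p>Public content</p>"];
  if (!looksLikeBot(userAgent)) {
    parts.push("<p>Content withheld from crawlers</p>");
  }
  return parts.join("\n");
}
```

Since a robot is free to send any User-Agent string it likes, this kind of filtering only keeps out bots that choose to identify themselves.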
This recipe can be found in its original form on Stack Overflow.