So, for example, you could simply go to the ‘Response Codes’ tab, click the ‘No Response’ filter, press Ctrl+A to highlight all the URLs, then right click and re-spider to crawl them all again. You can also use the Ctrl and Shift keys to highlight multiple URLs to re-spider. This is particularly useful if you’ve had connection time-outs or unexpected responses, or you wish to update some URLs while editing the site itself.

Re-Spider URLs – You can now re-spider URLs via the right-click menu.

X-Robots-Tag & Canonical HTTP Headers – The SEO Spider now offers support for both the X-Robots-Tag and canonical HTTP headers.

User Agent Switcher – The SEO Spider now has a user-agent switcher with inbuilt preset user agents for Googlebot, Bingbot, Yahoo! Slurp, various browsers and more. This feature is only available to licensed users, and we ask everyone to use it responsibly. We also have a custom user-agent option which allows you to specify your own user agent. A further option narrows the default crawl by only crawling the URLs that match a supplied regex, which is particularly useful for larger sites, or sites with less intuitive URL structures.
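To illustrate what a user-agent switcher does under the hood, here is a minimal Python sketch. This is not Screaming Frog's code: the function name and example URL are placeholders, and the Googlebot string is simply one commonly published example of that crawler's user agent.

```python
import urllib.request

# A preset user-agent string, as a switcher might store for "Googlebot".
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def build_request(url, user_agent=GOOGLEBOT_UA):
    """Build a request that identifies itself with the given user agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

# Switching user agents is just a matter of sending a different header value.
req = build_request("http://example.com/")
```

The crawler then issues `req` as normal; the target server sees whichever user-agent string was selected.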
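The regex-based narrowing described above can be sketched in a few lines of Python. This is an illustrative example of the idea, not the SEO Spider's implementation; the URLs and pattern are invented for the demo.

```python
import re

def matches_include(url, pattern):
    """Return True if the URL matches the include regex."""
    return re.search(pattern, url) is not None

urls = [
    "http://example.com/blog/post-1",
    "http://example.com/shop/item-2",
    "http://example.com/blog/post-3",
]

# Narrow a crawl of a large site to just the blog section.
included = [u for u in urls if matches_include(u, r"/blog/")]
```

On a large site, filtering discovered URLs this way before queueing them keeps the crawl focused on the section you actually care about.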