Updating multiple web pages


The Facebook crawler needs to be able to access your content in order to scrape and share it correctly. If you require a login or otherwise restrict access to your content, you'll need to whitelist the crawler. You should also exempt it from any DDoS-protection mechanisms that might block it. The crawler resolves each link to a canonical URL, so different versions of the same content are treated as one, even if they're hosted on separate subdomains or are accessible over both http:// and https://. If needed, the crawler will follow a chain of redirects to resolve the canonical URL.
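One common way to let the crawler through while keeping other bots out is a robots.txt policy keyed on Facebook's documented crawler user agents, `facebookexternalhit` and `Facebot`. This is a minimal sketch: note that robots.txt only expresses crawl policy, so a genuine login wall would additionally need a user-agent or IP allowlist enforced at the server.

```
# robots.txt — allow Facebook's crawler user agents,
# disallow everything else.
User-agent: facebookexternalhit
Allow: /

User-agent: Facebot
Allow: /

User-agent: *
Disallow: /
```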

The first time someone shares a link, the Facebook crawler will scrape the HTML at that URL to gather, cache and display info about the content on Facebook like a title, description, and thumbnail image.
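The metadata the crawler looks for is expressed as Open Graph `<meta property="og:*">` tags in the page's `<head>`. As a rough sketch of what that extraction looks like (the HTML here is a made-up example page, and this parser is illustrative, not Facebook's actual implementation), the standard-library `html.parser` is enough to pull those tags out:

```python
from html.parser import HTMLParser

class OGParser(HTMLParser):
    """Collects Open Graph <meta property="og:*"> tags into a dict."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            prop = d.get("property", "")
            if prop.startswith("og:") and "content" in d:
                self.og[prop] = d["content"]

# Hypothetical page head with the tags the crawler uses for a preview.
html = """
<html><head>
  <meta property="og:title" content="Example Article" />
  <meta property="og:description" content="A short summary." />
  <meta property="og:image" content="https://example.com/thumb.png" />
</head><body></body></html>
"""

parser = OGParser()
parser.feed(html)
print(parser.og["og:title"])        # Example Article
print(parser.og["og:image"])        # https://example.com/thumb.png
```

The title, description, and image gathered this way are what show up as the link preview when the URL is shared.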


If the crawler isn't able to access and scrape the page, Facebook will not be able to display the content.

This ensures that all actions such as likes and shares aggregate at the same URL rather than being spread across multiple versions of a page.
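You can point the crawler at the version you want to count as canonical by declaring it in the page itself. A minimal sketch, assuming a hypothetical article served from several hostnames:

```
<!-- Every variant of the page (http/https, www, mirrors) declares
     the same canonical URL, so likes and shares aggregate there. -->
<meta property="og:url" content="https://example.com/articles/42" />
<link rel="canonical" href="https://example.com/articles/42" />
```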
