
How search engines read content

How search engines index webpages

We tend to assume that if we create a page, people will simply find it and read it. That is not always true. There are two things to remember: people scan-read on screens, and search engines depend on written content.

How it works:

  • Web crawlers (also known as bots, robots or spiders) are software designed to follow links, gather information and send that information somewhere.
  • Googlebot is the web crawler used by Google. The information it gathers is used to update the Google index.
  • The Google index is where webpages are compared and ranked. For your webpages to be found in Google, they must be visible to Googlebot.
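The two mechanics above, following links and respecting visibility, can be sketched in a few lines of Python using only the standard library. This is a simplified illustration, not how Googlebot actually works: the `LinkExtractor` class, the example HTML and the sample robots.txt rules are all hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL.

    This is the core of what a crawler does: read a page's HTML and
    discover the links it should follow next.
    """
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links ("/about") against the page URL.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# A crawler discovering links on a (hypothetical) page:
page = '<p>See <a href="/about">about</a> and <a href="https://example.org/">here</a>.</p>'
print(extract_links(page, "https://example.com/"))
# → ['https://example.com/about', 'https://example.org/']

# Visibility: before fetching a page, a well-behaved crawler checks the
# site's robots.txt. Pages disallowed there are invisible to that crawler.
rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
])
print(rp.can_fetch("Googlebot", "https://example.com/private/report"))  # → False
print(rp.can_fetch("Googlebot", "https://example.com/about"))           # → True
```

The second half shows why "visible to Googlebot" matters: a page blocked by robots.txt (or hidden behind logins, scripts or images with no text) cannot be gathered, so it never reaches the index.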

There are many ways to optimise content so that it is both searchable and right for your audience.

To find out more, please contact the Digital Marketing Team.