A common way content gets blocked from search engine spiders is by placing text within an image. Designers often put text that needs a special font into an image, because the image guarantees the text looks the same to every visitor. Unfortunately, search engine spiders cannot (easily) read text contained in an image. As an alternative, you can use a font from Google's font library. Using a library font allows the words to live in actual text that Google can easily access, while still ensuring visitors see the content in the desired font. Beyond the font library, CSS and jQuery give you additional ways to style and animate the text, offering even more design choices than an image can provide.
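As a minimal sketch of this approach (the "Lobster" family, class name, and heading text here are just placeholders), you load the font from Google's library and apply it with CSS:

```html
<!-- Load the font from Google's font library -->
<link href="https://fonts.googleapis.com/css2?family=Lobster&display=swap" rel="stylesheet">

<style>
  /* Apply the font; the words stay in the HTML where spiders can read them */
  h1.fancy { font-family: 'Lobster', cursive; }
</style>

<h1 class="fancy">Our Summer Sale Starts Today</h1>
```

The heading is ordinary HTML text, so search engines index it like any other copy, while visitors still see it rendered in the styled font.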
Another way content can be blocked from robots is within videos. While videos can be a great means of communicating with visitors, robots aren't able to watch and understand them. The solution is similar: provide the content in a transcript or plain-text format that robots can more easily access. Along with helping bots reach this material, the transcript also helps human visitors who don't want to watch the video (or who did watch it but want to refer back to it in the future).
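One simple way to do this, sketched below (the file path and transcript text are placeholders), is to place the transcript in plain HTML directly beneath the video:

```html
<!-- The video itself; robots can't watch this -->
<video controls src="/media/product-demo.mp4"></video>

<!-- A plain-text transcript that both robots and humans can read -->
<section id="transcript">
  <h2>Transcript</h2>
  <p>Hi, I'm Jane. In this video, I'll walk you through setting up your account...</p>
</section>
```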
A final consideration when thinking about crawlability is hidden content. For example, it is common practice on websites to place some content behind tabs. Done poorly, this technique can hide your content from Google and hurt your rankings. Google's general rule of thumb is that if the content is hidden but still available to humans (for instance, by clicking on a tab), that is acceptable behavior. If it is hidden with no way for a human to un-hide it (or if the means of doing so are buried where no human is likely to find them), then you run the risk of robots never finding the content, as well as the risk of humans being unable to see it.
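To illustrate the acceptable pattern, here's a sketch of a tab setup (plain JavaScript for brevity, though jQuery works just as well) where all of the content stays in the page source and only its visibility toggles, so both robots and humans can reach it:

```html
<button class="tab" data-panel="overview">Overview</button>
<button class="tab" data-panel="specs">Specs</button>

<!-- Both panels are in the HTML, so crawlers can read them -->
<div id="overview" class="panel">Full overview text lives here...</div>
<div id="specs" class="panel" hidden>Full specs text lives here...</div>

<script>
  // Show the clicked tab's panel and hide the others
  document.querySelectorAll('.tab').forEach(function (tab) {
    tab.addEventListener('click', function () {
      document.querySelectorAll('.panel').forEach(function (panel) {
        panel.hidden = (panel.id !== tab.dataset.panel);
      });
    });
  });
</script>
```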
In Google Search Console, you can use the "Fetch As Google" tool to see your website as Google sees it. In Search Console, click Crawl, then click Fetch As Google.
Input your page's URL and click Fetch. Once the fetch completes, you can verify that Google finds the content you expect it to find. Helpfully, you can also use the dropdown to fetch as a desktop or mobile device, since your mobile design might hide or position some elements differently than your desktop design.
Bing offers a similar ability to fetch a page on your website as Bingbot. After logging in to Bing Webmaster Tools, go to Diagnostics & Tools, then click Fetch as Bingbot.
After inputting your URL in the search box, you will see a copy of the page's code as Bing sees it. You can search through this code to confirm that the most important content you need Bing to find is included in the output.
Want help improving your website’s technical SEO factors? Contact us today to discuss how we can help review and improve your current technical structure.