
Can Google Read Text in Images?

We believe that, yes, Google is currently at least trying to read the text in images.

Some point to social media statements from Google employees as evidence that Google can't read text in images, but it's easy for those employees to obscure what's happening behind the scenes with carefully worded replies.

As the leader in internet search, Google has every incentive to parse the text hidden in images. Indeed, it has already begun interpreting the content of images with or without text. Google Translate, for example, reads real-world text captured by a mobile device's camera and instantly produces a translated result.

Such technology could trickle easily into reading text included in images in various languages and fonts (and it could be already happening). We can’t say definitively, but we can draw conclusions by building a picture of what we know Google can do.

Google Guesses What’s in Images Using Several Methods

Performing simple Google Images searches on images with and without text produces revealing results. By plugging in a variety of images from your website, you can start to form hypotheses based on Google Images output.

Context

The search giant seems to be adept at pulling important information from the text surrounding an image. We uploaded an image from our B2B Video Marketing for Service-Based Businesses blog post. The colored block was intended to represent the dimensionality and variety of video types for marketing.

Google gave us the result below:

google image search result for b2b video marketing abstract box

The engine correctly guessed that the block represented video – but how did it gather that information from a colored block? We suspect that Google read the text around the image and pulled the keyword video from the text.

Simple Shapes

We also plugged in another custom-made image from the same blog post. The 3D gold stars were aligned so that the tips of their horizontal points touched, and the shapes were set against a flat background with a slight glow and shadow.

google image search result for b2b video marketing gold stars

Google still managed to understand the basic shape. While it’s possible to attribute Google’s understanding to the information provided in the image name, google-five-star-rating-3d.png, we believe that Google took a harder look at the shapes and colors it could recognize. It was able to produce visually similar images that included near matches of color, number, and shape:

google image search result for visually similar gold stars


We looked deeper into the provided similar images. Some results did not include the word star in the image name, and most did not include defining keywords such as gold or 3D. Google's image recognition technology appears to draw on more than context and metadata to deliver results to searchers.

Metadata

Metadata still matters a great deal in Google Images, however. In the world of SEO, image metadata helps Google come to the right conclusions about what's contained in images. It's still good practice to include alt text on all images and to name your images with concise clues about their content.
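As a sketch of that naming practice, here's a small hypothetical Python helper (the function name and behavior are our own illustration, not a Google requirement) that turns a plain-language description of an image into the kind of concise, hyphenated filename described above:

```python
import re

def seo_image_name(description, extension="png"):
    """Turn a plain-language image description into a concise,
    lowercase, hyphen-separated filename (an illustrative convention,
    not an official Google rule)."""
    # Lowercase, then collapse every run of non-alphanumeric
    # characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{extension}"

print(seo_image_name("Google Five Star Rating 3D"))
# google-five-star-rating-3d.png
```

Pairing a filename like this with matching alt text gives Google two consistent clues about the image's content.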

For an experiment in how image metadata affects SEO, we tried uploading our Influencer Marketing Facts and Statistics infographic to Google Images.

google image search result for influencer marketing facts and statistics infographic

The image name and its alt text both include phrases such as influencer marketing facts and statistics. The text in the infographic, however, only mentions stats. Though Google is smart enough to understand that stats and statistics are related, it isn’t likely that the image search pulled influencer marketing facts from the title text of the infographic that reads Essential INFLUENCER MARKETING Stats.

Instead, it probably relied mostly on metadata to produce its result. But that reliance on metadata in Google Images still doesn't tell us how Google values the content of a post containing an infographic in standard search results.

Reading Infographics: Leverage Marketing Internal Study

To find out more about how Google ranks images with text, we put our own infographics to the test. We monitored the ranking changes of two infographics over several months to determine the efficacy of putting information into a different visual medium.

influencer marketing facts and statistics infographic
social media for ecommerce facts and statistics infographic
As of this blog's publication, the influencer marketing infographic ranks 3rd for its target keywords, while the social media infographic ranks 23rd for its.

We included text versions of the information included in the infographic with the influencer marketing post, but not with the social media post. The social media infographic was also designed with more advanced graphical styling. It included graphs and charts as well.

We determined that the difference in target-keyword ranking between the two infographics was most likely attributable to:

  • The simplicity of infographic design
  • The inability of Google to pull meaningful data from charts and graphs made for users
  • The inability of Google’s potential text parsing technology to read text disrupted by multiple colors or graphics
  • The overall contrast of background and text colors

However, both infographics are still young in SEO terms. They continue to climb in ranking, but the gap between them is significant enough to theorize about.

Google Is Likely Trying to Read Text in Images

The search giant has been experimenting with convolutional neural networks for years. These are artificial neural networks that learn from input data, loosely mimicking the way a thinking creature learns and responds.

The same networks have the potential to learn to read text in images much as we do. Convolutional neural networks already power natural language processing and image and video recognition, capabilities modeled on the way humans perceive and react.
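Google hasn't published the internals of its image pipeline, so the following is only an illustrative sketch: a minimal pure-Python 2D convolution, the core operation a convolutional network repeats (with learned rather than hand-picked kernels) to detect visual features such as edges and letter strokes.

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image (no padding) and sum the
    element-wise products at each position."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out

# A tiny "image": dark left half (0), bright right half (1).
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]

# A hand-picked vertical-edge kernel: it responds wherever
# brightness changes from left to right.
kernel = [
    [-1, 1],
    [-1, 1],
]

edges = convolve2d(image, kernel)
print(edges)  # [[0, 2, 0], [0, 2, 0]] - strongest response at the edge
```

A real network stacks many such learned filters with nonlinearities in between; the edge responses this toy example produces are the kind of low-level feature from which character shapes can eventually be recognized.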

googlebot reading text in image of infographic

The goal of search is to understand searcher intent and, in response, use the power of computation to deliver instantaneous results. It’s a matter of course that reading text in images will, at least someday, be part of that process. And though there are skeptics who have concluded that Google isn’t actively reading text in images, we beg to differ.

SEO is hard, but Leverage Marketing has it on lock. If you don’t want to deal with the rigors of SEO, talk to our digital marketing team today about offloading the burden.

Eric Ysasi

Online Content Specialist at Leverage Marketing
Eric is a content specialist and copywriter at Leverage Marketing in Austin, TX. Following four years as a Public Affairs specialist in the United States Air Force, Eric received his B.A. in English and Modern Languages, then taught English in Kikonai, Japan. Pursuing his love of language, he began a career in inbound marketing and copywriting. Outside of the office, Eric hikes, bikes, skateboards, reads, watches movies, and plays guitar and piano.