Compression of Images on a Page

Approach 1: progressive loading of content. The entire page depended on a single API call that returned a relatively large payload along with the content. Initially, we came up with the idea of splitting this into multiple API calls to reduce the payload size, which would also keep the DOM tree minimal on the first load. Since Googlebot and the user would end up seeing different content, our SEO (search engine optimization) team said no to this plan.

Pros: reduced initial page load time, better user experience.
Cons: possibility of a negative SEO impact (cloaking).
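As a rough sketch of what that split might have looked like (the endpoint paths, element ids, and payload shape below are hypothetical, not our actual API):

```typescript
// Sketch of the split-API-call idea: fetch only what the minimal first-load
// DOM needs, then fetch the heavier remainder after the first render.
// Endpoints, ids, and the PageContent shape are illustrative assumptions.
interface PageContent {
  html: string;
}

async function loadPage(): Promise<void> {
  // First call: just enough content for a minimal initial DOM tree.
  const critical = await fetch("/api/page/critical");
  const criticalContent = (await critical.json()) as PageContent;
  document.querySelector("#main")!.innerHTML = criticalContent.html;

  // Second call: the large remainder of the payload, loaded afterwards.
  const rest = await fetch("/api/page/remaining");
  const restContent = (await rest.json()) as PageContent;
  document.querySelector("#secondary")!.innerHTML = restContent.html;
}

loadPage();
```

The objection above applies to exactly this pattern: what the crawler renders on the first pass and what the user eventually sees can diverge.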

Approach 2: load the heavy content only after the user clicks a call-to-action button. This would solve the cloaking problem, as the content would be the same for the bot and the user until the user clicks the call-to-action button. This, however, leads to a very poor user experience, so that was also a no.

Pros: reduced initial page load time, no risk of SEO penalties for cloaking.
Cons: poor UX.
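For completeness, a minimal sketch of that click-triggered variant (again, the endpoint and element ids are assumptions for illustration, not what we shipped):

```typescript
// Bot and user see the same initial markup; the heavy content is fetched
// only when the user explicitly asks for it via the call-to-action button.
const cta = document.querySelector<HTMLButtonElement>("#show-more");

cta?.addEventListener("click", async () => {
  const response = await fetch("/api/page/remaining");
  const { html } = (await response.json()) as { html: string };
  document.querySelector("#secondary")!.innerHTML = html;
});
```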

Approach 3: optimize everything, almost everything. The sure method of performance optimization was what we had to go back to, and it worked for us:

Reduced the API response payload size.
Applied the highest possible lossless compression to the images on the page.
Removed all unwanted lines and optimized the CSS.
Optimized our JavaScript code and delayed the loading of some of it.
Cached API responses (although this had no impact from a crawling perspective).
Lazy-loaded all the images and gradually loaded the embedded Google Maps (a sketch of this follows below).
Enabled gzip.

The page now loads just fine in less than 5 seconds; only the images load a bit later. The initial load of this page is well within the limits of the "less than 5 seconds" test. We crossed our fingers and submitted it to Google Webmaster Tools to be picked up and re-crawled, and Googlebot crawled it within a few days! How did we manage the meta title and description?
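As for the lazy loading mentioned in the list above, here is a minimal sketch of one way to do it with an IntersectionObserver; the data-src convention and the selectors are assumptions for illustration, not our exact implementation:

```typescript
// Keep the real URL in a data-src attribute and swap it in only when the
// image or map iframe approaches the viewport.
const lazyElements = document.querySelectorAll<HTMLImageElement | HTMLIFrameElement>(
  "img[data-src], iframe[data-src]"
);

const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const el = entry.target as HTMLImageElement | HTMLIFrameElement;
      el.src = el.dataset.src!; // start the real download now
      obs.unobserve(el);        // each element only needs to be handled once
    }
  },
  { rootMargin: "200px" }       // begin loading shortly before the element scrolls into view
);

lazyElements.forEach((el) => observer.observe(el));
```

Modern browsers also support a native loading="lazy" attribute on img and iframe elements, which covers many of these cases without any script at all.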
