August 18 2023

Kleecks, iSmartFrame and Core Web Vitals-Optimizing CDNs: How They Cheat Google's Tests and Why They're of Little Use for SEO

Let's look at how certain tricks can improve a Google PageSpeed Insights score while still failing the Core Web Vitals tests.

In the dynamic world of the web, where every detail can make a difference in ranking and user experience, Google has made a notable qualitative leap in its evaluation methodology. PageSpeed Insights has for years been the reference tool for developers and SEOs evaluating the performance of their sites, and its evolution led to the advent of the Core Web Vitals. This suite of metrics not only measures the efficiency and speed of a website, but goes further, focusing on the actual user experience. By analyzing crucial aspects such as loading time, interactivity and visual stability, the Core Web Vitals offer a more in-depth and complete picture of how a site is perceived by users, highlighting the importance of optimizations that go beyond simple loading times.

Core Web Vitals: The Transition from Vanity Metrics to Crucial Tools

The days when boasting a high score on PageSpeed Insights was all you needed are over. Now, Google requires a deeper understanding of a site's actual performance. The Core Web Vitals have become the emblem of this evolution, marking a clear distinction between what is purely cosmetic and what is fundamental to the user experience.

  1. Core Web Vitals LAB: This is the set of tests Google conducts in the laboratory. These tests, while rigorous and detailed, are simulations of a site's performance. They don't necessarily reflect the end-user experience, but they are valuable tools for developers, working like a compass that indicates which direction to move in during the site design and optimization phases. However, it is crucial to understand that while they are indicative, they do not represent the concrete reality of how a site is perceived by users.
  2. Core Web Vitals CRUX (Chrome User Experience Report): Here we enter the heart of the user experience. These metrics are based on real-world data collected from Google Chrome users who have opted in to usage reporting. Every time such a user opens a web page, the browser sends Google detailed information about page loading, its interactivity and visual stability. Google analyzes this data over a rolling 28-day window and establishes whether or not the site meets the Core Web Vitals thresholds, for both desktop and mobile.

While LABS tests offer a "theoretical" view of a site's performance, CRUX data provides a "practical" representation, based on real experiences. The latter have become of vital importance in determining the visibility of a site in the Google SERP. In other words, a site may score excellent on LABS tests, but if it falls short of CRUX metrics, its position in search results could suffer severely.
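To make the CRUX side concrete: the 28-day field data Google relies on can be queried directly via the Chrome UX Report (CrUX) API. The snippet below is a minimal sketch, assuming you have an API key available in a CRUX_API_KEY environment variable and that the endpoint and response fields match the public CrUX API documentation at the time of writing; www.example.com is a placeholder origin.

```python
# Minimal sketch: query the Chrome UX Report (CrUX) API for field data.
# Endpoint and field names follow the public CrUX API docs; the API key
# location and the origin below are placeholders.
import json
import os
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
API_KEY = os.environ["CRUX_API_KEY"]  # hypothetical environment variable


def query_crux(origin: str, form_factor: str = "PHONE") -> dict:
    """Fetch the rolling 28-day field metrics CrUX holds for an origin."""
    payload = json.dumps({"origin": origin, "formFactor": form_factor}).encode()
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={API_KEY}",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    record = query_crux("https://www.example.com")["record"]
    for metric, data in record["metrics"].items():
        # p75 is the value Google compares against its "good" thresholds
        print(metric, "p75 =", data["percentiles"]["p75"])
```

If the 75th-percentile values returned here fall within Google's "good" thresholds, the origin passes the Core Web Vitals assessment regardless of what its LABS score says.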

We covered this topic in depth in a dedicated post: Core Web Vitals and CRUX data.

Simply put, a site with a score of 100 on PageSpeed Insights does not automatically pass the Core Web Vitals, just as a site with a score of 60 is not necessarily destined to fail them. For example, one of our clients with an average mobile score of just 50 on PageSpeed Insights still managed to pass the mobile Core Web Vitals with flying colours. This site ranks as the 375th most visited site in Italy, with at least 9.2 million unique visitors per month and over 15 million page views, according to data from Similarweb.com and the site's own Analytics. This shows that the user experience, as evaluated by the Core Web Vitals, can be excellent even if your PageSpeed Insights score isn't perfect.

PageSpeed Insights report for Il Corriere della Città

Similarweb data for Il Corriere della Città

The evolution in the pursuit of optimal web performance

In today's digital context, where the speed and efficiency of a website can make the difference between a customer won and a customer lost, attention to performance optimization has become crucial. The renewed importance attributed to the Core Web Vitals has boosted the optimization-solutions industry, bringing to light a set of tools designed to help websites achieve peak performance. These tools not only help improve loading times, but also optimize the interactivity and visual stability of pages, ensuring a superior user experience.

Kleecks CDN and iSmartFrame: The Seeming Magic in Performance Optimization

Among the many options available to developers and website owners, Kleecks CDN and iSmartFrame stand out as two recognized leaders in providing performance-oriented solutions.

  1. The philosophy behind CDNs: CDNs, or Content Delivery Networks, represent a network of servers distributed in various geographical points, with the aim of serving content to visitors more quickly and efficiently. The main objective of these networks is to minimize the distance between the visitor and the source of web content, ensuring a reduced loading time and a smooth user experience.
  2. Kleecks CDN and iSmartFrame at work: Both of these solutions, while each having its own specific characteristics, exploit the potential of CDNs and operate as a reverse proxy. In this role, they act as intermediaries between the end user and the site's origin server (a minimal sketch of the reverse-proxy idea follows this list).

    The magic happens when they intercept a website's source code and optimize it on the fly, performing advanced technical operations such as:

    • Minification: stripping whitespace, comments and other redundant characters from JS and CSS files, reducing their size and making loading faster.
    • Image conversion: swapping heavy image formats for lighter, faster ones such as WebP, without compromising quality.
    • Cache and latency reduction: Thanks to caching mechanisms, frequently requested content is stored and served more quickly, minimizing user waiting times.
    • Much more.
  3. A gift to developers: The beauty of these solutions lies in their nature as PaaS, or Platform as a Service, offerings. Instead of manually handling optimization complexities, developers can rely on these platforms to do the heavy lifting, allowing them to focus on other project challenges without having to dig into the application code and fix performance problems by hand.
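To make the reverse-proxy pattern described above concrete, here is a deliberately minimal sketch of the idea: an intermediary fetches pages from the origin, can rewrite or cache them, and serves the result to the visitor. This is purely illustrative and is not how Kleecks or iSmartFrame are actually implemented; the origin hostname, port and caching logic are placeholders.

```python
# Toy reverse proxy: fetch from the origin, cache naively, serve to the
# visitor. Real services add minification, image re-encoding, header
# handling, cache invalidation and edge distribution on top of this idea.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORIGIN = "https://origin.example.com"   # placeholder origin server
CACHE: dict[str, bytes] = {}            # naive in-memory cache


class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path not in CACHE:
            with urllib.request.urlopen(ORIGIN + self.path) as upstream:
                # This is where a real optimizer would rewrite the response.
                CACHE[self.path] = upstream.read()
        body = CACHE[self.path]
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ProxyHandler).serve_forever()
```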

In-depth analysis of the discrepancy between LABS and CRUX

In our ongoing effort to understand the dynamics of website performance, we've come across a particular dilemma regarding the use of services like Kleecks, iSmartFrame, and other similar optimization tools. While these services promise optimal performance, the reality may be slightly different.

It is rather ironic, to put it mildly, to observe how companies that present themselves on the market as enterprise leaders in optimizing Core Web Vitals and web services have corporate sites, their public face to the world, that are unable to pass the Core Web Vitals tests. This raises significant questions about the actual effectiveness of such services and their ability to deliver on the promises made to customers.

For example, during our analysis we found that some of these services, despite being sold as cutting-edge solutions for improving performance metrics, fail to adequately optimize their own websites. The images below show the results of tests conducted on these sites, highlighting how they fail to meet the Core Web Vitals standards, despite the fact that they should be examples of excellence in this field.

This phenomenon can be attributed to various factors. In some cases, the optimizations offered may be too general and not specifically tailored to the needs of the individual site. In others, there may be a mismatch between marketing promises and the actual technical capabilities of the services.

 

In both cases this results in very serious TTFB problems, with values exceeding one second, whereas Google recommends a TTFB of less than 200 ms.

In the course of our investigations, we decided to go beyond theory and assumptions by examining in detail some customers who make use of these emerging technologies. The goal was to better understand how these technology stacks handle requests and serve content, particularly in response to Google's specific PageSpeed Insights user agent.

After a series of painstaking and meticulous tests, we obtained surprisingly enlightening results. We found that several JavaScript files did not actually load when the Google PageSpeed Insights user agent was detected. This allows sites to achieve an impressively high LABS score, almost as if they were wearing an optimization mask. However, when it comes to the Core Web Vitals tests on real user experience (CRUX), the results are less flattering: not only do these sites fail these crucial tests, they also show significant shortcomings in terms of performance.
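A simplified version of this check can be reproduced in a few lines: fetch the same page twice, once with an ordinary browser User-Agent and once with a Lighthouse-style User-Agent, then compare the external scripts referenced in each response. This is a rough sketch that assumes the cloaking decision is made on the User-Agent header alone; the URL and User-Agent strings are illustrative.

```python
# Sketch: compare which external scripts a page references when fetched with
# a "normal" browser UA versus a PageSpeed/Lighthouse-style UA.
# URL and UA strings are placeholders/approximations.
import re
import urllib.request

URL = "https://www.example.com/"
BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0 Safari/537.36")
LIGHTHOUSE_UA = BROWSER_UA + " Chrome-Lighthouse"  # PSI's lab UA contains this token


def script_sources(user_agent: str) -> set[str]:
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")
    return set(re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.IGNORECASE))


normal = script_sources(BROWSER_UA)
lab = script_sources(LIGHTHOUSE_UA)
print("Scripts served only to real browsers:", sorted(normal - lab))
```

Note that this only catches differences in the HTML the server returns; scripts injected later by client-side code require inspecting the rendered page in a browser with a modified User-Agent, which is the approach described later in this article.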

This finding reveals a problematic practice in which websites are optimized in such a way that they achieve artificially high scores in PageSpeed Insights tests, masking their true performance. This approach, while it may temporarily improve the perception of site performance, does not address the fundamental issues impacting user experience. Indeed, the Core Web Vitals are designed to evaluate users' actual interaction with the site, so if a site fails these metrics, it means that end users may experience longer load times, less fluid interactions and insufficient visual stability.

For example, a particularly worrying figure was the very high Time to First Byte (TTFB), or latency, a key indicator of server responsiveness that Google recommends always keeping below 200 milliseconds.


In short, it seems absurd to see TTFBs of more than a second on the sites of companies whose business is optimizing web performance; it does not make them look very credible from the outset.
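TTFB is also easy to sanity-check yourself. The sketch below approximates it as the time between opening the connection and receiving the response headers, which is more than enough to spot values above one second; the URL is a placeholder.

```python
# Rough TTFB check: time from issuing the request to receiving the response
# headers. Includes DNS, TCP and TLS setup, as a first visit would.
import time
import urllib.parse
import http.client


def approx_ttfb(url: str) -> float:
    parts = urllib.parse.urlsplit(url)
    conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    start = time.perf_counter()
    conn.request("GET", parts.path or "/")
    conn.getresponse()          # returns once status line and headers arrive
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed


print(f"approx TTFB: {approx_ttfb('https://www.example.com/') * 1000:.0f} ms")
```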

To make our findings more accessible and understandable, we've condensed them into an analysis video. We invite all interested parties to view it to gain a detailed overview and an in-depth understanding of what we discovered.

Misalignment between synthetic metrics and real data

We have observed that many websites, after integrating these solutions, show exceptional scores when analyzed with the Google PageSpeed Insights LABS tests. However, these scores do not seem consistent with the results provided by the Core Web Vitals (CRUX), which represent site performance for real users.

In this regard, we took as examples some of the sites shown in the video above, which illustrate both the discrepancy and the methodology we used to verify the modus operandi of these "miraculous CDNs".

The apparent disconnect between these two metrics raises some concerns:

Synthetic LABS Tests: Reliability and Limits in the Real World

Synthetic tests, such as those offered by LABS, are a type of analysis that simulates user behavior on a website in a controlled environment. While they are extremely useful for identifying performance issues in development or optimization, they have some inherent limitations that could make their results less representative of actual user experiences.

How do synthetic tests work?

Such tests are performed in the laboratory, or in virtual environments, where variables such as bandwidth, latency and device resources are standardized or simulated. This allows developers to obtain performance metrics under "ideal" conditions, eliminating the fluctuations that would occur under real-world browsing conditions.

Limitations of Synthetic Tests
  1. Standardized environments: Because these tests are performed under controlled conditions, they may not account for different combinations of hardware, software, and connectivity that end users may have. A site might work well on a high-end device with a fast connection, but perform poorly on an older device or with a slow connection.
  2. External interference: Real users might have many tabs open, applications running in the background, or even security software that could affect the performance of a website. These factors are not typically simulated in synthetic tests.
  3. Caching and User Interactions: While synthetic tests may simulate some of them, they may not fully capture real user behavior, such as scrolling a page, clicking on various items, or how browsers handle caching of a site during subsequent visits.
  4. Deceptive Strategies: As mentioned earlier, techniques such as cloaking could allow a site to "cheat" synthetic tests by presenting an optimized version when it detects a test in progress. This could result in artificially high performance metrics.

Cloaking: A Deceptive Strategy For Manipulating Google Tests?

The term "cloaking" refers to a search engine optimization (SEO) practice that has raised a lot of controversy over the years. This tactic is based on presenting different versions of a web page to search engines and real users. The main goal behind this maneuver is to manipulate and improve a site's ranking in the search engine results pages (SERPs), by showing engines content that could be seen as more relevant or optimized.

How does cloaking work?

Cloaking is a sophisticated technique used to present different content depending on the user making the request to the server. The fundamental principle on which cloaking is based is the recognition of the User Agent or IP address of the requester.

User Agent recognition

The User Agent is a text string that the browser sends every time it requests a web page, providing information about the browser itself, the operating system and other details. Search engine crawlers, such as Googlebot, use specific User Agents that can be easily identified. When the server receives a request from a User Agent that is recognized as belonging to a crawler, it can respond with content that is optimized for crawling and indexing. For example, it can return a simplified HTML version of the page, which excludes dynamic elements such as JavaScript and complex CSS, making it easier for the crawler to parse the content.
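To make the mechanism concrete, here is a deliberately simplified sketch of what User-Agent-based cloaking can look like on the server side. It is illustrative only: we are not claiming this is any specific vendor's code, and the crawler tokens, pages and port are placeholders.

```python
# Illustrative toy WSGI app: serve a stripped-down page when the User-Agent
# looks like a crawler or a lab test, and the full page otherwise.
from wsgiref.simple_server import make_server

CRAWLER_TOKENS = ("Googlebot", "Chrome-Lighthouse")  # example tokens only

FULL_PAGE = (b"<html><body><h1>Site</h1>"
             b"<script src='/heavy-app.js'></script></body></html>")
LIGHT_PAGE = (b"<html><body><h1>Site</h1>"
              b"<!-- heavy scripts omitted for crawlers --></body></html>")


def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    body = LIGHT_PAGE if any(token in ua for token in CRAWLER_TOKENS) else FULL_PAGE
    start_response("200 OK", [("Content-Type", "text/html"),
                              ("Content-Length", str(len(body)))])
    return [body]


if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The point is simply that a single string comparison is enough to decide which version of a page a given visitor receives.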

IP address recognition

In addition to the User Agent, the server can also identify crawlers by IP address. Major search engines use known IP ranges, and the server can be configured to recognize these requests and respond accordingly. This method adds an additional layer of control, as User Agents can be spoofed, while IP addresses are more difficult to mask.
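Checking IP addresses is usually done with a reverse DNS lookup followed by a forward confirmation, which is also the method Google itself documents for verifying Googlebot. A minimal sketch, using a sample address from a range historically used by Googlebot:

```python
# Sketch: verify whether an IP really belongs to Googlebot using the
# reverse-DNS lookup plus forward confirmation that Google documents.
# A cloaking setup could abuse the same check to pick which page to serve.
import socket


def is_googlebot_ip(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward confirmation
    except socket.gaierror:
        return False
    return ip in forward_ips


print(is_googlebot_ip("66.249.66.1"))  # address in a range historically used by Googlebot
```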

Operation example

Let's imagine a web page that intensively uses JavaScript to generate dynamic content. A normal visitor accessing this page will see all the interactive and dynamic elements generated by JavaScript. However, when a search engine crawler like Googlebot visits the same page, the server recognizes the crawler via User Agent or IP address and serves a static HTML version of the page. This static version is more easily indexed by search engines, thus improving the site's SEO.

CDNs and Cloaking: A New Paradigm for Core Web Vitals Optimization?

In light of the growing emphasis placed on metrics such as the Core Web Vitals, one might suspect that some CDNs, in their mission to optimize performance, resort to tactics similar to cloaking. This would mean that when such CDNs detect a LABS test from Google PageSpeed Insights, they could serve up a "lightened" or "optimized" version of the site, dropping or tweaking some elements to get higher scores.

During our investigations, we simulated being a Google bot by modifying the User-Agent of our browser and noticed that, in some circumstances, notoriously heavy and potentially slow external JavaScript files were not loaded. While this omission may result in seemingly faster load times during testing, it does not reflect the actual user experience.

A dangerous precedent: the WP-Optimize plugin for WordPress accused of gaming PageSpeed

WP-Optimize, a popular WordPress plugin for improving site performance, has been accused of manipulating benchmarks. Gijo Varghese, a developer specializing in web performance and creator of the FlyingPress plugin, pointed out that WP-Optimize disables JavaScript when testing with benchmarking tools. His claim was corroborated by a screenshot showing how the plugin prevents JavaScript files from being loaded during testing.

This behavior has generated negative reactions from the WordPress community. Some have compared this tactic to similar scams, such as the Volkswagen emissions scandal. Users have expressed their disappointment and concern about these deceptive practices. The discussion highlighted the importance of focusing on real user experience rather than test scores. However, trust in WP-Optimize has been compromised due to these revelations.

Implications and Ethics

The potential discovery that CDNs are using cloaking techniques is not just a technical detail; it raises deeply rooted ethical and technical questions. When an organization engages in cloaking, it may effectively be "masking" a site's true performance, giving the illusion of an optimization that doesn't really exist. While the stated intention may be to improve performance, what actually happens is a skewing of test results and a misrepresentation of the site's real capabilities. This can lead developers and website owners to make decisions based on erroneous data, diverting them from the optimal course.

Beyond that, it is crucial to consider the significant financial burden such solutions entail. The fees for some of these CDNs can reach considerable sums, even several thousand euros per month, which add up to a substantial expense over time. If these large sums are spent without achieving a tangible improvement, such as passing the Core Web Vitals, one could legitimately ask whether these resources would have been better spent elsewhere.

Indeed, considering the current landscape of web technologies and the growing emphasis on performance, it would make absolute sense to reinvest these sums in more sustainable and enduring solutions. Committing resources to hiring or consulting experts, such as dedicated web developers and Linux systems engineers who specialize in web performance, could offer a much more significant return on investment. These professionals can address and resolve performance issues at their root, delivering tailored solutions that not only solve immediate challenges, but also prevent future problems. And all for a one-time investment, rather than onerous recurring fees.

The Impact of Javascript on Web Performance: A Double-Edged Sword

Javascript has become one of the fundamental tools in web development, allowing you to create rich, interactive and dynamic web applications. However, like any powerful tool, if not used wisely, it can have unintended consequences on a site's performance.

The Weight of Javascript on the Browser

When a browser loads a web page, it has to parse and execute the Javascript scripts included in the page. This process can be quite onerous, especially when it comes to:

  1. Large Scripts: Large scripts take longer to download, parse and execute. This can delay the processing of other crucial page elements.
  2. Intensive Execution: Some scripts, due to their nature or complexity, may be resource-intensive when running, causing a high load on the device's CPU or memory.
  3. External Dependencies: Scripts that rely on external libraries or call resources from third-party servers can introduce additional latencies, especially if those external resources are slow or unoptimized.

Direct Impacts on the User Experience

Inefficient Javascript execution can lead to various problems, including:

  • Render blocking: Scripts that run before the page is fully loaded can block the display of content, leaving users waiting.
  • Compromised interactivity: If a script takes too long to respond, user interactions, such as scrolling or clicking, may be delayed or interrupted.

Deceptive Tactics and LABS Tests

To score high on synthetic tests like LABS, some CDNs may employ deceptive strategies, such as "skipping" the loading of problematic JavaScript resources. If a website "hides" these scripts during a LABS test, the result is a page that appears to load much faster, giving the site an artificially high performance score. However, this does not reflect the real experience of the user, who could be exposed to all the problems caused by such scripts in a real browsing context.

Conclusion: The Fine Line Between Metrics and Reality

In the complicated landscape of the web, it's easy to be seduced by perfect numbers and maximum scores. But, as often happens, all that glitters is not gold. Google PageSpeed Insights, while a vital tool, can sometimes offer only a partial view of a website's actual performance.

The Deceptive Charm of Perfect Scores

A LABS score of 100 in Google PageSpeed Insights may seem like unequivocal proof of an optimized, high-performing website. However, it is vital to understand that such a metric, taken by itself, can be misleading. Some companies, well aware of this, may resort to deceptive tactics to "game" the LABS tests in order to exhibit these high scores, especially to end customers who lack the expertise to distinguish between a simulation and the real user experience.

It is important to remember that the Google PageSpeed Insights LABS score is based on simulations and does not necessarily reflect actual site usage conditions. Simulations can be affected by numerous controllable factors, such as server configuration, the content delivery network (CDN), image compression, and the use of advanced caching techniques. These optimizations, while useful, do not always accurately represent the experience of a real user, who may face significant variations in connection speed, device power and other environmental factors.

Additionally, some companies may exploit customers' desire for high scores by offering “optimizations” that improve the LABS score without providing any real benefit to the user experience. Such practices may include optimizing exclusively for tests, reducing image quality, or removing crucial features to achieve a better score. This approach is not only deceptive, but can also damage the integrity and usability of the website.

The Flip Side: The Deceived End Customer and the Unimproved Business

The temptation to impress the end customer with perfect scores is understandable. However, website owners or stakeholders are often not fully aware of the technical nature and nuances of web metrics. Presenting a high LABS score without consistently passing the Core Web Vitals over the last 28 days may satisfy the customer in the short term, but it will not bring long-term benefits, especially once visitors start to experience real problems navigating the site.

When visitors start experiencing real navigation issues, such as slow loading times, lagging interactions, or unstable layouts, the initial positive impression generated by a high LABS score quickly fades. This can lead to decreased user satisfaction, higher abandonment rates, and ultimately a decline in conversions and revenue.

Additionally, Google uses the Core Web Vitals as part of its ranking algorithm, meaning that the site's actual performance can directly influence its visibility in search results. A site that consistently performs well on these metrics is more likely to rank well and attract quality organic traffic. Therefore, focusing only on LABS scores without considering actual field data can lead to a short-sighted and ineffective optimization strategy.

The Heart of the Matter: Real User Experience

Besides the numbers, what really matters is the CRUX – the real user experience. If a site doesn't deliver consistent and reliable performance to its visitors, perfect LABS scores become irrelevant. And over time, the site's reputation may suffer irreparable damage.

 

Final Analysis and Conclusion

While analytics tools like Google PageSpeed Insights are invaluable, they should never replace a comprehensive and authentic assessment of user experience. It's imperative for anyone running a website to look beyond the shining numbers and focus on what really matters: providing a quality browsing experience for all visitors. Page loading speed is only one aspect of user experience, and a good evaluation must also take into account usability, accessibility, design consistency and ease of navigation.

Always remember to be wary of solutions that seem too good to be true; in fact, often they are not. Promises of drastic and immediate performance improvements without adequate optimization work often prove disappointing. Optimization techniques should be realistic and sustainable, integrating well with the site's structure and content without compromising its integrity or user experience.

Regardless of the performance optimization solution you choose to adopt for your website, it is essential not to stop at the first results, but rather to analyze and monitor the progress of Core Web Vitals in the medium term. This includes crucial metrics like LCP (Largest Contentful Paint), FID (First Input Delay) and CLS (Cumulative Layout Shift), which provide detailed insight into how users perceive the speed, interactivity and visual stability of pages.

Web technology is constantly evolving and what seems to work perfectly today may not be as effective tomorrow. For example, new browser updates, changes in user behavior or new Google guidelines can affect site performance. Therefore, a continuous and prolonged evaluation over time will allow you to have a clear and realistic vision of your site's performance. This practice not only helps to maintain high performance, but also to promptly identify any problems and quickly adapt to technological changes.

Aiming to pass the Core Web Vitals should not be a short-term goal but a constant commitment, ensuring a quality browsing experience for your users and a solid reputation for your site in the digital landscape. Investing in the continuous improvement of website performance helps create a solid foundation for long-term success, retaining users and improving search engine rankings.


