20 False Claims about NitroPack (and Why They're Wrong)

Last updated on Feb 10th, 2022 | 9 min

The following is a technical rebuttal from NitroPack's CTO addressing some false accusations about NitroPack being a 'Black Hat SEO plugin'.

I have reviewed the article and I noticed a lot of misrepresented information and false statements.

Some of the statements do not even require a technical understanding of the product to see how false they are.

For example, stating that NitroPack doesn't provide any of the features for "real" optimization like the rest of the products on the market clearly shows that the author didn't even bother to familiarize themselves with NitroPack's features. Accusing the product of being a scam is a blatant lie.

Furthermore, the author's impartiality is called into question by the numerous false and deceitful claims, and involving other plugin developers is, quite frankly, concerning.

A blatant example of a complete lack of factual accuracy is the claim that the "Delay script" feature was pioneered by another plugin.

This couldn't be further from the truth. NitroPack released this in 2018, while the plugin in question was launched in 2020 (as seen in the release history here). I would be very impressed if the developer went out publicly stating that they pioneered this.

The article also fails to acknowledge that the one thing causing the author so much headache is actually an optional feature. NitroPack users have complete freedom to disable the default script delays and use the rest of the service to their liking.

This is as simple as switching to Strong mode or going Manual and disabling it from the advanced settings. With that in mind, how is this different from having to manually disable a feature in any other plugin? How does this make NitroPack a "black hat" plugin?

NitroPack simply provides two ways of approaching JavaScript delays:

  1. Delay no scripts by default and use an inclusion list to delay only specific scripts.

  2. Delay most scripts by default while automatically excluding scripts known to be critical or to cause issues.

No other solution on the market provides both options simultaneously.
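For readers who are curious how the second approach generally works under the hood, here is a minimal, simplified sketch of the delay-until-interaction technique (illustrative only - not NitroPack's actual implementation): scripts are served with a non-executable type so the browser skips them on page load, then swapped back to real scripts on the first user interaction.

    // Illustrative sketch only - not NitroPack's actual code.
    // Delayed scripts are assumed to be served as
    //   <script type="text/delayed" data-src="..."></script>
    // so the browser does not execute them on page load.
    const INTERACTION_EVENTS = ["mousedown", "keydown", "touchstart", "scroll"];

    function restoreDelayedScripts(): void {
      document
        .querySelectorAll<HTMLScriptElement>('script[type="text/delayed"]')
        .forEach((placeholder) => {
          const script = document.createElement("script");
          if (placeholder.dataset.src) {
            script.src = placeholder.dataset.src;
            script.async = false; // keep document order so dependent scripts run after their dependencies
          } else {
            script.textContent = placeholder.textContent;
          }
          placeholder.replaceWith(script);
        });
    }

    let restored = false;
    function onFirstInteraction(): void {
      if (restored) return;
      restored = true;
      INTERACTION_EVENTS.forEach((type) => removeEventListener(type, onFirstInteraction));
      restoreDelayedScripts();
    }

    // Wait for the first sign of user activity, then run everything that was delayed.
    INTERACTION_EVENTS.forEach((type) =>
      addEventListener(type, onFirstInteraction, { passive: true })
    );

The first approach is simply the inverse: nothing gets rewritten unless it matches a user-defined inclusion list.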

As for deferring: the end result is almost identical when you disable the delay option. Using defer, however, is much less flexible because of inter-script dependencies and dependencies on page lifecycle events - both of which are solved problems in NitroPack's implementation, improving the stability of this feature.
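To make the lifecycle point concrete: a delayed script that waits for DOMContentLoaded or window load would otherwise never run, because those events fired long before it executes. A common workaround in delay-style implementations (again, a generic sketch rather than NitroPack's actual code) is to re-dispatch those events once all delayed scripts have finished executing:

    // Generic sketch - not NitroPack's actual implementation.
    // Delayed code frequently registers handlers such as
    //   document.addEventListener("DOMContentLoaded", init);
    // Re-dispatching the events after the delayed scripts have run keeps that code working.
    function replayLifecycleEvents(): void {
      document.dispatchEvent(new Event("DOMContentLoaded", { bubbles: true }));
      window.dispatchEvent(new Event("load"));
    }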

The author makes a case for advanced users' freedom to manually fine-tune the service's behavior, yet they seem unwilling to touch the settings enough to disable a single option. NitroPack actually provides plenty of options to fine-tune how it works - on par with other solutions.

However, it is vital that we keep a civilized and constructive dialogue whilst calling out all false statements, as we owe it to the community to call a spade a spade.

I have prepared a complete list of everything that is false or misrepresented in the article, which you can find below, but before that, I would like to address the one thing that is somewhat true:

Delaying scripts can have a negative impact on the user experience in some cases. While it can be a bad fit for some sites, it is a game-changer for many others, and the problematic cases are far fewer than the article suggests. Of course, not every site is a good fit for this feature, which is why we've made it very easy to disable the option or switch to another mode.

For simplicity's sake, I have prepared this as an ordered list, so feedback on each of the points below can easily be provided.

20 wrong or deceitful points made in the article:

1. Not acknowledging that delaying JS loading is an optional feature.

2. Not acknowledging all the valid use cases in which it works. For example, sites with CSS-based menus (https://purecss.io/menus/) and cases where the first interaction is not a click.

3. Presenting NitroPack as a cheat because of an optional feature - how is this different from optional features of any other caching plugin which can break your site? For the longest time, file concatenation of CSS and JS has been considered a "dangerous" function of caching plugins. Is every caching plugin that has this option bad because the option can cause issues?

4. Disregarding all other features of NitroPack, falsely suggesting that there are no other features and the entire service is based around script delays.

5. Even though NitroPack doesn't provide features like those of Asset Cleanup, we have never stated that these tools are not useful in what they do, nor do we say that they should not be used. On the contrary, they are great tools for experienced users. We have recommended them many times to our users as an addition to NitroPack.

6. Not educating users through videos. We've always strived to help the community with as much insight and educational material as we can. So far, we've been actively creating content for our blog and Help Center that is publicly available. With the company's growth, we've also expanded our content team and prioritized video materials to help customers. We upload new video content regularly on our YouTube channel.

7. Misrepresented results from GTmetrix. Serving a resource from the browser cache is presented as a bad thing, and the article falsely states that testing tools fail to identify these resources when, in reality, they do. Regardless of what the network report shows, any resource transferred over the network adds to the total page load time, and that is where it would be reflected. Since when is making use of the browser's cache a bad thing?
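Anyone can verify this behavior for themselves. As a rough illustration (a generic browser snippet, not a NitroPack feature), the Resource Timing API reports cache hits directly:

    // A resource served from the browser cache typically reports transferSize === 0
    // while still having a non-zero decodedBodySize.
    const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
    for (const entry of entries) {
      const fromCache = entry.transferSize === 0 && entry.decodedBodySize > 0;
      console.log(`${entry.name}: ${fromCache ? "served from cache" : "fetched over the network"}`);
    }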

8. Misrepresented Speed Index results and missing information on the testing methodology - no links to the test results are provided, only "summaries". The fact that WebPageTest did not provide a Speed Index likely means that the page was blank, which can happen if a page preloader is in use or the page renders blank for any reason. Speed Index is based on painted content and has nothing to do with how resources are loaded, as the author seems to suggest.

You can learn more about this here - https://docs.webpagetest.org/metrics/speedindex/. Also, take a look at this code comment.
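For reference, the WebPageTest documentation linked above defines Speed Index (roughly) as the area above the visual-progress curve, where VC(t) is the visual completeness of the viewport (from 0 to 1) at time t and t_end is the time at which the page becomes visually complete:

    \mathrm{SpeedIndex} = \int_{0}^{t_{\mathrm{end}}} \bigl(1 - VC(t)\bigr)\, dt

In other words, the metric is driven entirely by when pixels get painted; a page that paints nothing during the test produces no visual progress to integrate over, which is consistent with the blank-page explanation above.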

9. Misrepresented network results from WebPageTest. The claim that the testing tool detects 46 requests while the same page, loaded in a local web browser, serves 149 requests makes me very interested in how this test was performed. Can the author provide evidence of this happening? We'd be happy to work together and investigate it further.

10. The claim of an "unproven inclusion of WP Rocket" - we don't want to involve any third party in this, especially a competing solution, but doesn't WP Rocket's new version of Delay JS prove that the approach works?

11. HTML lazy loading - true, we haven't released this publicly, but we do have it in our source code. The code for this "breakthrough" feature is 3 lines in NitroPack's ecosystem. Just because we have not released something doesn't mean we have not experimented with it. Another example the author might be interested in is Reduce Unused CSS, which has been privately available for almost a year. Adaptive Image Sizing has also been part of the service since its very creation, as have other options.

12. Misrepresented our Critical CSS and Reduce Unused CSS (RUCSS) functionality. The author seems to suggest that NitroPack uses a single critical CSS file and that we do not offer removal of unused CSS. Both claims are wrong. Has the author tried our Critical CSS feature to verify how it handles different layouts? Feel free to browse around our website and actually compare the critical CSS on each page - I would be glad if the author could share their results. Also, feel free to test each page on mobile and desktop separately while you are at it. You might be surprised to find out that NitroPack prepares separate critical CSS for each form factor as well, and it has always been like that.
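For anyone who wants to run this comparison themselves, here is one quick way to do it (a sketch with placeholder URLs, not an official NitroPack tool): fetch two pages of the same site from that site's own origin (e.g. the DevTools console, to avoid CORS issues) and compare their inlined <style> contents.

    // Placeholder URLs - substitute any two pages you want to compare.
    async function inlineCss(url: string): Promise<string> {
      const html = await (await fetch(url)).text();
      const doc = new DOMParser().parseFromString(html, "text/html");
      return Array.from(doc.querySelectorAll("style"))
        .map((style) => style.textContent ?? "")
        .join("\n");
    }

    const [home, inner] = await Promise.all([
      inlineCss("https://example.com/"),
      inlineCss("https://example.com/pricing/"),
    ]);
    console.log("Identical inlined CSS on both pages?", home === inner);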

13. False claims that, with NitroPack, CSS files are never loaded in testing tools, thereby hiding FOUC issues. This couldn't be further from the truth. All CSS files are loaded, just as they are for real visitors (and in any other environment, for that matter). This can easily be double-checked with Chrome's DevTools, as described in this article on our blog.
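One quick way to perform that check (a generic DevTools console snippet, not something specific to NitroPack) is to list the stylesheets the page has actually loaded:

    // Lists every stylesheet the current page has loaded,
    // whether referenced via <link> or inlined in a <style> block.
    Array.from(document.styleSheets).forEach((sheet) => {
      console.log(sheet.href ?? "(inline <style> block)");
    });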

14. Claims that checking a site's score via Lighthouse in Chrome, compared to PSI, will have the score drop from 100 in PSI to 70 in Chrome. Very interesting - is it possible that the author tested with a different version of Lighthouse in Chrome than the one used in PSI? What were the network conditions? What hardware was used? In any case, please share a reproducible scenario so we can investigate this. I am genuinely interested.

15. The following statement - "No automatically generated JS can be functional until the parsing of the jquery.min.js file is accomplished" - is somewhat confusing to me. What does automatically generated JS mean? Also, what if that JS doesn't depend on jQuery?

16. False claims that NitroPack doesn't allow users to manually delay JS. This suggests that the author did not bother to inspect NitroPack's settings. In Strong or Medium mode, users can manually configure which scripts should be delayed, with all others running normally. This has been part of NitroPack since the very beginning.

17. Incorrectly comparing lazy loading of images vs. JS. Even though JS files are indeed prefetched, their size is much smaller than that of images. The biggest impact of lazy loading images comes from not downloading them, because they are large. With JavaScript, the biggest impact on speed comes from executing it. The metaphor of "lazy loading JS" is simply used to give a broad idea of the benefit - reducing the biggest impact from this resource. Transferring JS over the network has almost no speed impact.
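To make the distinction measurable: the cost that actually blocks the page is main-thread execution, which can be observed with the standard Long Tasks API (a generic browser API, unrelated to NitroPack):

    // Logs "long tasks" - main-thread blocks of 50 ms or more, which are typically
    // caused by parsing and executing JavaScript, never by merely transferring it.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.log(`Long task: ${Math.round(entry.duration)} ms`);
      }
    }).observe({ type: "longtask", buffered: true });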

18. Reduced traffic reported for some sites is falsely attributed to a "Black Hat" strategy when, in reality, it is a result of delaying the analytics scripts. There is already a default exclude list applied when scripts are delayed, and some of its entries are analytics scripts. Of course, not all cases are covered yet, but we are constantly expanding the list and improving the feature. As with any other feature, it sometimes has to be fine-tuned to work correctly. The author keeps talking about how they like tweaking settings, yet they do not seem to realize that the same applies to this option. Excluding the analytics scripts is all that is needed to recover the "lost" traffic in your analytics software.
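As a purely hypothetical illustration of how such an exclude list works in any delay-style implementation (the patterns below are examples, not NitroPack's actual default list), scripts matching a known-critical pattern are simply never delayed:

    // Example exclude patterns - illustrative only, not NitroPack's real default list.
    const EXCLUDE_PATTERNS: RegExp[] = [
      /googletagmanager\.com/,
      /google-analytics\.com/,
      /gtag\(/,
    ];

    // A script is only eligible for delaying if it matches none of the exclude patterns.
    function shouldDelay(script: HTMLScriptElement): boolean {
      const haystack = script.src || script.textContent || "";
      return !EXCLUDE_PATTERNS.some((pattern) => pattern.test(haystack));
    }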

19. False statement that Flying Scripts pioneered selective script delay. In reality, NitroPack had this available for customers in 2018. Flying Scripts came out in 2020.

20. Claiming that sites will be penalized for using NitroPack is definitely interesting, but is there any fact to back it up? A statement from Google which must be followed? A rule that is being ignored?

You can also find a straightforward answer for these accusations on the official GSC forum from a Platinum Product Expert - https://support.google.com/webmasters/thread/110575153/is-nitropack-plugin-black-hat-seo-for-speed-optimization?hl=en

We truly believe that the community benefits from constructive dialogue but suffers when this much misinformation is spread.

We are, have been, and always will be eager to address constructive criticism.

Yours truly,

Ivailo

Ivailo Hristov

Passionate about all things performance optimization, Ivailo is the driving force behind our product and core NitroPack features.