bentocorp 4 days ago

I use Jekyll for my company website [1] and managed to get a lot of speed optimisations simply by using a post-processing tool on the statically generated output.

The tool I use is Jampack and I'd highly recommend it: https://jampack.divriots.com

For my product website, it reduced the overall size by 590MB, or approximately 59%, along with changing the HTML/CSS to make the optimisations that this article notes.

[1] https://www.magiclasso.co/

  • maccard 4 days ago

    Do you mean KB or do you have a 1GB website??

    • vallode 4 days ago

      They are most likely referring to the overall size of the whole website including all generated HTML for posts, static content, styles etc.

      • rossant 4 days ago

        1GB is huge even for a whole website, unless there are videos or hundreds of high-definition photos. Curious to know what makes up this space.

        • dewey 4 days ago

          Maybe they included node_modules in that count.

    • bentocorp 4 days ago

      It is a 991MB website, before optimisations.

      The large majority of this comes from large header images for the Insights post content: https://www.magiclasso.co/insights/

      These are PNG images that are relatively large until optimisation creates multiple smaller image-set versions.

      I wouldn't say that this is an unusually large site. Any site with a medium amount of content would likely grow to such a size.

    • exiguus 4 days ago

      I guess mostly pictures. If you have 500 posts, one picture for each, and then optimize images for responsiveness and browser compatibility, there should be 3 to 12 images per image source set:

      Mobile, tablet, and desktop as breakpoints; AVIF with WebP or JPEG fallback; Retina/normal.

      And this will increase the overall size to 12 x 500 x 50kB = 300MB.
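
      As a concrete sketch, one generated source set could look like the following in the HTML (file names and widths are made up for illustration):

        <picture>
          <!-- AVIF for browsers that support it, one file per breakpoint -->
          <source type="image/avif"
                  srcset="hero-480.avif 480w, hero-960.avif 960w, hero-1920.avif 1920w">
          <!-- WebP fallback -->
          <source type="image/webp"
                  srcset="hero-480.webp 480w, hero-960.webp 960w, hero-1920.webp 1920w">
          <!-- JPEG fallback for everything else -->
          <img src="hero-960.jpg"
               srcset="hero-480.jpg 480w, hero-960.jpg 960w, hero-1920.jpg 1920w"
               sizes="(max-width: 600px) 100vw, 960px"
               alt="Post header image">
        </picture>

      That is already nine files for one source image; add separate Retina variants and you quickly reach the 12 per image mentioned above.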

      • broken-kebab 4 days ago

        I doubt there's a practical reason for an average website, especially something blog-like, to have pre-optimized pictures for that many screen types.

  • thomas_witt 4 days ago

    In my opinion the whole point of Jekyll is not having to use some npm/JS packages and dependencies.

  • arclight_ 4 days ago

    This looks fantastic! I really like that it's static site generator-agnostic, and I like the intelligent CSS and above-the-fold optimizations.

    I'll definitely give this a try - only wish I knew about it before I wrote my thing!

Sayrus 4 days ago

The page doesn't load any CSS: I get a 301 loop on main-e03066e7ad7653435204aa3170643558.css, leading to ERR_TOO_MANY_REDIRECTS, over 100 requests, and a nearly 1-second load time. Sections are displayed lazily, and on each scroll I get dozens of requests sent to google-assets-formatavif-width672-4399f142b495ab9796b3b91053c097b9.avif with the same 301. This leads to sections taking over 200ms for 4 lines of text.

While that may be great for Google PageSpeed, it leads to issues that wouldn't exist with a static page and a degraded experience for the end user. I'm not sure if the issue is related to the plugin discussed in the article.

With this being said, I can see many use-cases for such a plugin. Having compile-time image transformation/compression is really nice.

  • arclight_ 4 days ago

    No idea what would cause that as I'm just hosting static assets on Cloudflare Pages. But thanks for letting me know!

  • ameliaquining 4 days ago

    The page is loading fine for me.

    • Sayrus 4 days ago

      Indeed it does load now. At the time of writing it didn't.

firefoxd 4 days ago

When it comes to images, developers tend to look for automation to resolve the format. But the optimization should come at creation time. Yes, it's a good idea to have multiple resolutions so the browser can select the best size, but the image in the screenshot clearly should have been a JPEG or WebP.

I commented a couple days ago about how I taught my team about image formats [0], and just published a blog post this morning [1].

[0]: https://news.ycombinator.com/item?id=44683506

[1]: https://idiallo.com/blog/react-and-image-format

  • zerocrates 4 days ago

    The "mistake" of all PNG has its advantages: you can convert to JPEG (or whatever else) after the fact with equivalent results to if you had done it that way in the first place; not the case the other way around.

  • arclight_ 4 days ago

    Thanks for sharing, those are good points.

    Another aspect of image optimization that I'm aware of, but haven't even bothered with yet, is handling the art direction problem of responsive images.

    Like "pan and scan" conversions of widescreen movies to the old 4:3 TV size, if you're serving a narrower image to mobile devices than say a desktop browser, the ideal image to serve is probably not a generic resize, or center-cropped version of the image. Mozilla has a nice page on responsive images that explains it better than I could: https://developer.mozilla.org/en-US/docs/Web/HTML/Guides/Res...

  • exiguus 4 days ago

    If I see it correctly, it is AVIF with WebP and PNG fallback. My browser loads the screenshots as AVIF.

thomas_witt 4 days ago

Great effort - really happy to see people keeping Jekyll alive.

Q: Why did you decide to rewrite the whole image handling instead of just relying on the jekyll_picture_tag gem (https://github.com/rbuchberger/jekyll_picture_tag)? I have been using it for years and it just works fine.

  • arclight_ 4 days ago

    Simply because when I Googled my use case I didn't discover that plugin!

    Maybe it would make sense to decouple the image processing code from my library so that `jekyll_picture_tag` could be used, since it's a bit orthogonal to the Propshaft-esque asset loading.

Akronymus 4 days ago

> And at this point, the library seems to have been abandoned as it hasn't been updated in over 5 years.

Why is the automatic assumption for something not being updated for a few years that it's abandoned instead of done? Are libraries not allowed to be stable/done?

  • hombre_fatal 4 days ago

    Before you even consider GitHub issue/PR activity, it's a complex asset-pipeline project with over 1000 commits.

    The idea that this kind of project is "done" without even occasional chore updates just has no shot. It's obviously off its maintainers' "rotation".

  • the_sleaze_ 4 days ago

    Aside from the specific tools in use (JS has a higher maintenance burden than a Go library, for example), you really need to look at the issues/updates ratio. Are there 57 open issues that haven't been triaged or addressed? Are there multiple open PRs or requests that should be easy to add and are just sitting there rotting?

    • Akronymus 4 days ago

      Yes, those are reasonable metrics. But I've seen too many people outright dismiss a library that has been stable for a long time because it hasn't had an update in, like, half a year, and instead go for one that is only half-baked but had an update in the last few weeks.

      That's why I push back against the notion that no updates = abandoned. Personal, painful experience.

      • luckylion 4 days ago

        But "a few months" isn't 5 years, right? Last update in January 2025 seems fine for a lot of things. If that was January 2020 I'd probably bypass it as well.

        • Akronymus 4 days ago

          This second comment was meant more as an elaboration on why I personally dislike the not-updated-recently = abandoned assumption that a lot of people make by default, rather than actually checking whether it's abandoned.

          As in, "I've personally witnessed people passing over mature libraries that just don't need any more updates in favor of ones that aren't really production ready but get frequent updates, which causes quite a bad dev experience down the line".

          I am not really good at articulating my thoughts properly, so thanks for making me write this longer comment.

philipwhiuk 4 days ago

Google web fonts are very annoying for the reason specified. I ended up removing them because it was too annoying to import them properly.

azangru 4 days ago

I really, really, really doubt the "crawled but not indexed" status can have anything to do with page loading speed. Lower ranking, sure; although Google doesn't say how much weight they give to page speed when ranking. But not exclusion from the index.

corentin88 4 days ago

On the YouTube embed aspect, using a component can take lots of time and effort.

Just sharing another approach where you keep the YouTube embed iframe, but replace the domain "youtube.com" with this specific domain "embedlite.com". It loads only the thumbnail of the video, and when someone clicks on it, it loads the full YouTube player.
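
As I understand the approach, the only change is the domain in the embed URL (the video ID here is a placeholder):

  <!-- before: loads the full YouTube player up front -->
  <iframe src="https://www.youtube.com/embed/VIDEO_ID"></iframe>

  <!-- after: loads only the thumbnail; the real player loads on click -->
  <iframe src="https://www.embedlite.com/embed/VIDEO_ID"></iframe>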

More info: https://www.embedlite.com

  • Raicuparta 4 days ago

    Doesn't sound very smart to iframe to some unknown third party that could be compromised. But their implementation is pretty simple and can easily be copied and implemented on your own domain.

    Their example doesn't even seem to work on mobile, at least (it just iframes the homepage itself), which doesn't really inspire confidence.

est 4 days ago

I have a blog with a PageSpeed score of 98 but still got several pages de-indexed from Google. Guess my content isn't that important.

  • ZanderEarth 4 days ago

    Canonical tags set up correctly? Basics in place like metadata? Content rendering in the HTML? Google will generally index a page pretty quickly, even from a site with lower authority.

  • dewey 4 days ago

    In Google Search Console you can usually see the reason why it's not included. Page speed is rarely a reason for indexing/non-indexing unless it's really bad.

    • arclight_ 4 days ago

      I've clicked into my non-indexed pages, and everywhere that people say it should state the reason why the page is not indexed, there simply isn't a reason mentioned. It just shows the generic "crawled but not indexed", even when I click into specific pages.

      I've even watched YouTube videos of people going into their Search Console dashboards to make sure I'm not missing anything (and indeed those people do see a reason for some pages, but for mine I do not).

    • dado3212 4 days ago

      I don’t think it says anything useful for crawled-but-not-indexed, unfortunately. My suspicion is that it’s almost always backlinks, but I'm not totally sure.

pavel_lishin 4 days ago

Is doing things like lazy-loading images further down the page actually good for users and readers, or only for making sure Google indexes your site?

I'd be hella miffed if I loaded a page on my laptop, then opened it up somewhere without internet access, and realized that half the page fully didn't exist.

  • exiguus 4 days ago

    It saves bandwidth for the user, especially if you imagine that the user does not read the entire article. And it may load the images in the viewport faster, as bandwidth is not taken up by images that the user cannot yet see.
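
    For below-the-fold images, the native attribute is enough these days, no JavaScript needed (a minimal example):

      <!-- the browser defers the fetch until the image nears the viewport -->
      <img src="photo.jpg" alt="A photo" loading="lazy" width="800" height="600">

    Explicit width/height keep the layout from shifting when the deferred image arrives.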

dewey 4 days ago

And yet when I open the page it first loads without CSS and then after 0.5s applies the style.

  • chrismorgan 4 days ago

    Responsible:

      <link rel="stylesheet" href="/_digested/assets/css/main-e03066e7ad7653435204aa3170643558.css" media="print" onload="this.media='all'">
      <noscript><link rel="stylesheet" href="/_digested/assets/css/main-e03066e7ad7653435204aa3170643558.css"></noscript>
    
    This is deliberate FOUC. Why, I have no notion whatsoever. It should read:

      <link rel="stylesheet" href="/_digested/assets/css/main-e03066e7ad7653435204aa3170643558.css">
    • porridgeraisin 4 days ago

      Yeah. Maybe they separated critical and non-critical CSS and delayed the non-critical part? That way you'll see a degraded version of the page on slow connections.

      Can't think of any other reason why you would do this.
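
      If that was the goal, the usual pattern is to inline the critical rules and defer only the rest, something like this (a sketch, not what the site actually ships):

        <head>
          <!-- critical above-the-fold rules inlined so first paint is styled -->
          <style>
            body { margin: 0; font-family: sans-serif; }
          </style>
          <!-- remaining CSS loaded without blocking render -->
          <link rel="stylesheet" href="/assets/css/rest.css"
                media="print" onload="this.media='all'">
          <noscript><link rel="stylesheet" href="/assets/css/rest.css"></noscript>
        </head>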

    • arclight_ 4 days ago

      IIRC I followed the advice of PageSpeed or another Google page to do it that way. ¯\_(ツ)_/¯

      Appreciate the feedback!

  • microflash 4 days ago

    They probably forgot to inline the critical CSS.

Brajeshwar 4 days ago

Honestly, I believe you have made it a tad more complex than it needs to be. I migrated to Jekyll from WordPress about 4 years ago. It gets all 100s in Google PageSpeed. I used Jekyll as just another tool so I could stick to GitHub Pages. There is nothing tied to anything, and I can move all the Markdown contents to another system within hours, if not minutes. I can also upload the generated HTML via FTP, and it will work as well.

Almost all audio, images, and videos are rather ornamental, and the content will be OK without them. I try to keep each piece of content as standalone as possible. For instance, the posts follow the pattern "_posts/YYYY/YYYY-MM-DD-url-goes-here.md", so I know where the yearly content is, despite each post having its own designated published date. I also have a folder "_posts/todo" where published (but work-in-progress) and future-dated posts live.
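
A sketch of that layout (file names are examples):

  _posts/
    2021/
      2021-03-14-url-goes-here.md
      2021-11-02-another-post.md
    todo/
      2026-01-01-future-dated-draft.md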

For images, I stopped worrying about serving multiple sources. I optimize to somewhere around good enough for both mobile and larger screens (I now consider tablet and desktop the same).

https://brajeshwar.com/2021/brajeshwar.com-2021/

impostervt 4 days ago

Over the weekend I stood up a small site with a blog of only a few articles. I've done this in the past with WordPress and Jekyll, but I do it pretty rarely, so I forget exactly how to do it and how to make the sites fast.

So I let Claude write it. I told it I wanted a simple static website without any JS frameworks. It made the whole thing. Any time I add a blog post, it updates the blog index page.

The site is, of course, very fast. But the main gain, for me, was not having to figure out how to get the underlying tech working. Yes, I'm probably dumber for it, but the site was up in a few hours and I got to go on with my life.