
Comments (22)

garritfra commented on July 22, 2024

Pingdom unfortunately has different results for page sizes. Examples:

fanrongbin.com

  • GTMetrix: 184 kB (screenshot, 2023-12-14)
  • Pingdom: 97.5 kB (screenshot, 2023-12-14)

www.blitzw.in

  • GTMetrix: 21 kB (screenshot, 2023-12-14)
  • Pingdom: 23.5 kB (screenshot, 2023-12-14)

from 512kb.club.

kevquirk commented on July 22, 2024

I'm wondering if we take this as serendipity and close the project down? I don't have the time to go through all these sites and re-test them.

What do you think, @garritfra ?


evtn commented on July 22, 2024

FWIW, I checked if https://pagespeed.web.dev/ might be an alternative, but they don't seem to expose page weight at all. πŸ€”

Actually, they do: under Performance → Passed Audits → "Avoids enormous network payloads".

Although I'm not sure if it's the size after compression.


michaelnordmeyer commented on July 22, 2024

Although I'm not sure if it's the size after compression.

It's the size after compression. You can test with your browser's developer tools on the network tab. They display both sizes.


kevquirk commented on July 22, 2024

I just checked on my site and the results are very close. GTMetrix is reporting 89.4kb and Cloudflare is reporting 91.5kb.

While 2 kB is quite a lot when it comes to the smaller sites, it's the closest match out of everything we've tried. Being Cloudflare, it's unlikely to go paid any time soon either. Also, since it has an API, we could potentially go back and re-test all sites with CF (although I'd have no idea how to do that, haha).

TL;DR I think we have a winner. If you're happy @garritfra we can update the instructions on the site.


garritfra commented on July 22, 2024

@kevquirk I'll take care of it.

Thanks @JLO64 for the suggestion!


kevquirk commented on July 22, 2024

Oh damn. Might need to do some research as to what we can use.


garritfra commented on July 22, 2024

FWIW, I checked if https://pagespeed.web.dev/ might be an alternative, but they don't seem to expose page weight at all. πŸ€”


kevquirk commented on July 22, 2024

I don't think that will work as it doesn't mention page size.

1mb.club is using DebugBear. Running a test there and comparing it to GTMetrix, it comes out with different scores though. 🤷‍♂️

GTMetrix (90kb) - https://gtmetrix.com/reports/kevquirk.com/oMqg7SG2/
DebugBear (50.2kb) - https://www.debugbear.com/test/website-speed/5PV7nucN/overview?metric=pageWeight


garritfra commented on July 22, 2024

DebugBear seems to be quite close to the compressed size reported by GTMetrix. If all else fails, that might be our only option. πŸ€·β€β™‚οΈ


kevquirk commented on July 22, 2024

Yeah, but it means we will need to re-do over 1000 sites to make sure it's accurate. 😳


garritfra commented on July 22, 2024

I wouldn't want to close the project down per se. I'll contact you to discuss some details.


meduzen commented on July 22, 2024

I'm wondering if we take this as serendipity and close the project down?

I was coming to update a size and stumbled upon the same issue. Closing the project could leave a nice testament to the (recent) past, but at the same time it's also kinda useful for discovering new things.

Without automation, maintenance efficiency is kinda limited.


michaelnordmeyer commented on July 22, 2024

Some services include the response headers in the reported weight, some don't. You can use curl to find out, but you only get the data for the HTML page itself.

With compression:

curl --compressed -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/
Header: 259 bytes, Download: 3814 bytes

Without compression:

curl -so /dev/null -w "Header: %{size_header} bytes, Download: %{size_download} bytes\n" https://kevquirk.com/ 
Header: 236 bytes, Download: 31883 bytes

wget, as someone mentioned in #749, would be an easy solution, but it doesn't work for CSS files importing other files.
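That CSS limitation is the crux: a crawler has to parse each downloaded stylesheet for `@import` and `url()` references and fetch those too. A rough Python sketch of just that extraction step (the regex is deliberately simplified and won't handle every CSS edge case):

```python
import re

# Matches url(...) and @import "..." references inside a stylesheet.
CSS_REF = re.compile(
    r"""url\(\s*['"]?([^'")]+)['"]?\s*\)|@import\s+['"]([^'"]+)['"]"""
)

def css_references(css_text: str) -> list[str]:
    """Return the sub-resource paths a stylesheet pulls in."""
    return [m.group(1) or m.group(2) for m in CSS_REF.finditer(css_text)]

print(css_references('@import "fonts.css"; body { background: url(/bg.png); }'))
# ['fonts.css', '/bg.png']
```

A full solution would resolve these paths against the stylesheet's URL, download them, and recurse, since imported CSS can itself import more files.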


derspyy commented on July 22, 2024

wow this whole stuff sucks : /
i wholeheartedly think the project should continue at least with compressed sizes.
it was always awesome to see what could be done!


JLO64 commented on July 22, 2024

I know this isn't productive towards the current conversation, but it gets worse with GTmetrix. Even if you sign up for a free account, there is a limit on how many reports you can generate. Once you hit that limit (I'm not sure what it is), you are forced to pay for a plan in order to generate more.


Regarding shutting down the project, I'd argue against it. It's genuinely nice having a list of websites like these in a format like this. (I'm totally not in it for the badge on my site) Maybe it can be restarted instead with a new list starting from scratch based on whatever new metric is agreed upon? As nice as debugbear is I think it would be wise to avoid third party services to avoid another situation like what we're going through now.


garritfra commented on July 22, 2024

I think there's no way around a third-party service like DebugBear to get consistent results.

We won't shut down the project. Once we find a solution we're happy with, updating all sites shouldn't be an issue using scripts. Ideally we should try to collaborate with the third party service owner (whoever that might be) to lift any possible scan restrictions or speed up the process.


bradleytaunt commented on July 22, 2024

Thought I would throw my two cents in:

For 1mb.club I have been toying with the idea of using a custom script that will:

  1. Download all the files of a given website (CSS, images, JS, main HTML page)
  2. Place all these files into a temporary folder
  3. Get the size of all those files together as the total
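The three steps above can be sketched in a few lines of Python. To be clear, this is a toy illustration, not the actual sizegrab script (which is Ruby); the regex-based asset discovery is deliberately naive, and the bytes are summed in memory rather than saved to a temporary folder:

```python
import re
import urllib.parse
import urllib.request

def asset_urls(html: bytes, base_url: str) -> list[str]:
    """Step 1, discovery: pull src/href asset references out of the HTML."""
    pattern = rb'(?:src|href)="([^"]+\.(?:css|js|png|jpe?g|gif|svg|webp|woff2?))"'
    return [urllib.parse.urljoin(base_url, ref.decode())
            for ref in re.findall(pattern, html)]

def page_weight(url: str) -> int:
    """Steps 2 and 3: fetch the page and every discovered asset, sum the bytes."""
    html = urllib.request.urlopen(url).read()
    total = len(html)
    for asset in asset_urls(html, url):
        try:
            total += len(urllib.request.urlopen(asset).read())
        except OSError:
            pass  # skip assets that fail to load
    return total
```

Note this counts uncompressed sizes; to match what actually travels over the wire you would have to request and measure the compressed responses, which is exactly the GTMetrix-vs-Pingdom discrepancy discussed earlier in this thread.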

I finally pushed out what I was working on a while back and called it "sizegrab": https://git.sr.ht/~bt/sizegrab (it's written in Ruby).

It's far from perfect and most likely doesn't cover every use case. But the idea is to try and avoid depending on third party companies / services. If anyone wants to help improve that ugly script of mine, please do so! (or even re-write everything in a language others prefer!)

/ end rant


JLO64 commented on July 22, 2024

It's far from perfect and most likely doesn't cover every use case. But the idea is to try and avoid depending on third party companies / services. If anyone wants to help improve that ugly script of mine, please do so! (or even re-write everything in a language others prefer!)

I am probably the least qualified person in this thread to be working on something like this, but I have a bunch of free time on my hands I'm willing to put towards this. For right now, I'm thinking of rewriting this as a Python script (I'm sorry I'm not more comfortable with Ruby!) and deploying it as an AWS Lambda function. After I get that set up, I can make a widget and/or website to query and display results. Since it'll be a Lambda function, maybe it can be used to better automate the project in the future?


JLO64 commented on July 22, 2024

Lol. While working on a solution of my own, I stumbled across Cloudflare's URL Scanner. It's pretty robust, free to use with no restrictions, and has an API (you have to sign up and generate a token to use the API, though).

They even show both compressed and uncompressed network transfers!


garritfra commented on July 22, 2024

@JLO64 huh, nice! I'm not sure if our audience will consider Cloudflare kosher, but I'd personally be fine with this. @kevquirk what do you think?


garritfra commented on July 22, 2024

#1383 is merged, so I'll close this issue. Thanks for all the input!

Any contributions regarding the automatic size checker or other topics discussed here are welcome. I don't think I'll have the time to rewrite that script any time soon.

