You are wasting the x-axis, which is largely redundant with the colors. It would be better to do a scatter plot of time against size, rather than imposing the ratio on people like me who think it is a silly metric.
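A scatter plot along those lines is a few lines of code; here is a minimal sketch with made-up size and time numbers (matplotlib assumed, rendered off-screen):

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Hypothetical measurements: page size in KB, load time in seconds
sizes_kb = [120, 340, 560, 810, 1100]
times_s = [0.4, 0.9, 1.1, 2.3, 1.8]

# Plot time against size directly, instead of collapsing
# the two variables into a single time-per-byte ratio.
plt.scatter(sizes_kb, times_s)
plt.xlabel("Page size (KB)")
plt.ylabel("Load time (s)")
plt.title("Load time vs. page size")
plt.savefig("time_vs_size.png")
```

This keeps both axes informative: a reader who likes the ratio can still eyeball the slope, and a reader who doesn't can see the two quantities separately.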
This may be a realistic way to measure time for new users, but that is irrelevant to your complaint, since you are not a new user. The large size of the site includes files that don’t change much, like the javascript. These load quickly, at least per byte, contributing to a fast-looking measure; some users will be bandwidth-limited. But when you return to the site, these files are cached and the total download is much smaller, while you still have to wait for the database queries that build the main content. That wait has only a very loose relation to the number of bytes, even the number of bytes in the pages, so using byte counts as a proxy for speed seems silly to me. (There is also a lot of variation from time to time, and if you had caught it at a bad moment it would probably have looked bad by your metric, too.)
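The caching point can be made concrete with a toy model (all numbers below are made up, not measurements of the site): on a repeat visit the static files come from cache, so the bytes transferred drop sharply, while the database-query time is unchanged, so the total time barely moves.

```python
# Toy model of one page load (hypothetical numbers throughout).
STATIC_BYTES = 900_000   # javascript, css, images: cacheable
DYNAMIC_BYTES = 100_000  # generated page content: never cached
BANDWIDTH = 1_000_000    # bytes per second
DB_QUERY_TIME = 1.5      # seconds to build the main content

def load_time(cached):
    """Return (bytes transferred, seconds) for one page load."""
    transferred = DYNAMIC_BYTES + (0 if cached else STATIC_BYTES)
    return transferred, DB_QUERY_TIME + transferred / BANDWIDTH

first_bytes, first_time = load_time(cached=False)    # 1_000_000 bytes, 2.5 s
repeat_bytes, repeat_time = load_time(cached=True)   #   100_000 bytes, 1.6 s

# Bytes drop 10x between visits, but time drops far less,
# so a bytes-based "speed" metric misleads on repeat visits.
print(first_bytes, first_time)
print(repeat_bytes, repeat_time)
```

In this model the byte count falls by a factor of ten while the load time falls by barely a third, which is the sense in which bytes are a poor proxy for the speed a returning user experiences.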
Hm. I don’t know much about the way websites are parsed, but I figured it wasn’t a perfect metric. At any rate, that’s why I did both raw time and website size.
No, you did not do raw time. That’s the whole point of my comment.
Perhaps you can’t see the graph for some reason. (But why you could see one and not the other is beyond me.) There were other complaints as well, and I had a problem where the graphs showed up only for me. I tried uploading them as images, but I got a “Bad image” message. I’d suggest just opening the spreadsheet I linked to; it’s in there.
Sorry. My fault.