
This topic has been discussed before; however, I was unable to find anything applicable.

I have a large number of datasets (numbering in the thousands) that I'd like to display and then paginate with JavaScript.

That obviously requires me to fetch all the results, which just takes a few milliseconds and is barely noticeable.

However, when constructing the actual page, the images, which are largely recurring, push the page load time past 10 seconds, which is decidedly too long.

My question is fairly simple: Is there some way to tell the browser to download each picture just once and reuse it, instead of apparently downloading the same pictures over and over again?

Or do you know any other tricks to speed up page loading for many datasets with recurring pictures?

I'd like to avoid having to do the pagination via AJAX if possible.

My table structure looks like this:

<table>
  <tr>
    <td>Number</td>
    <td>Image</td>
    <td>Image</td>
    <td>Text</td>
    <td>Text</td>
    <td>Text</td>
    <td>Text</td>
    <td>Image</td>
    <td>Text</td>
    <td>Image</td>
    <td>Image</td>
  </tr>
</table>

The last image is the same for all rows; images one and two are each always one of two different images; image four is one of 13 different images.

I hope you can help me out a little.

Edit: Thanks to the previous replies, I managed to get rid of the image loading times, which is great in itself; however, it didn't really reduce my overall loading time all that much.

Firebug tells me that most of the time is wasted on Waiting, which gives me an entirely new angle on the problem to explore.

Thanks for helping out so far, if I get stuck again, I'll open a new question. :)

On a final note, it appears the problem wasn't the server/client transfer connection, but rather that my Firefox is having issues rendering the large block. So yeah...guess I'm out of luck.

  • The browser doesn't download the same image twice on a page unless you messed up the server settings (check in the headers that the image is cacheable; look at the network panel of your browser's dev tools) or give the images different URLs. Commented Jul 24, 2015 at 10:29
  • Are the images thumbnails? And are the images optimized for the web (i.e. file size minimized)? Commented Jul 24, 2015 at 10:33
  • How are you loading the page? Do you use flush()? Commented Jul 24, 2015 at 10:54
  • @DenysSéguret The server is a standard apache2 on a Linux machine. I haven't messed around with the configuration at all, aside from configuring the page itself, but I will look again. However, how would the images having different URLs help me load them faster? Commented Jul 24, 2015 at 11:43
  • @DavidFariña No, they are original-sized, specifically designed for that application, with a whopping 20x20 pixels per picture. Commented Jul 24, 2015 at 11:44

2 Answers


If you have proper caching set up on the server, the browser won't download the same image again, even if it's used multiple times on the page.

If your web server is Apache-based, head over here to configure your Apache config or .htaccess so that it sets an expiry header for the images: https://stackoverflow.com/a/447039/1716437
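As a rough sketch of what the linked answer sets up (assuming mod_expires is available and enabled; the exact types and lifetimes are illustrative, not prescribed), an .htaccess fragment along these lines tells browsers they may cache images instead of re-requesting them:

```apache
# Requires mod_expires (e.g. "a2enmod expires" on Debian/Ubuntu)
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache common image types for one month after first access
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
</IfModule>
```

You can verify it worked by checking the response headers in the browser's network panel: cached images should show an Expires/Cache-Control header, and repeat views should be served from cache rather than re-downloaded.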


1 Comment

I'll go check it out immediately. Thanks for the hint.

Couple of options you have:

  • Use a lazy loader so it only loads each image as you scroll to it (should drastically speed everything up)
  • Use PHP to paginate and use AJAX for the requests (I know you want to avoid it, but I'm putting it here in case anyone else has the issue further down the line)
  • Or just generally use PHP to paginate without AJAX
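For the client-side pagination route, a minimal sketch of the idea (the function and field names are illustrative, not from the original post): keep all fetched rows in an array and only render one page's slice at a time, so the browser never has to lay out thousands of rows at once.

```javascript
// Minimal client-side pagination sketch: given all rows up front,
// compute which slice belongs on a given 1-based page number.
function paginate(rows, pageSize, pageNumber) {
  // Clamp the requested page into the valid range
  var pageCount = Math.max(1, Math.ceil(rows.length / pageSize));
  var page = Math.min(Math.max(pageNumber, 1), pageCount);
  var start = (page - 1) * pageSize;
  return {
    page: page,
    pageCount: pageCount,
    rows: rows.slice(start, start + pageSize)
  };
}

// Example: 2500 datasets, 50 rows per page
var allRows = [];
for (var i = 0; i < 2500; i++) allRows.push({ number: i });
var result = paginate(allRows, 50, 3);
// result.rows now holds rows 100..149; only these get rendered into the table
```

Rendering only `result.rows` into the `<table>` (and re-rendering on page change) sidesteps the large-block rendering problem noted in the question's final edit, since the DOM stays small regardless of how many datasets were fetched.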

