
  • "Create a new page that renders all images. My server only sends the pre-signed URLs from the GCP bucket. Add some basic JS functionality to download all images on the page." You don't need to render them on the page in order to download them with JS; you just need the URLs. Commented Sep 24, 2024 at 16:09
  • 20 GB downloaded from a browser? Forget about it, that's an unrealistic requirement. SFTP won't help you here. What will you do when the connection drops for whatever reason? The torrent protocol is what you need. Commented Sep 24, 2024 at 17:25
  • But honestly, this sounds like an XY problem. Make thumbnails of those images and serve them to users, paginated. No one looks at 1,000 images at the same time. Serve full images only on demand. Commented Sep 24, 2024 at 17:27
  • Maybe: medium.com/google-cloud/… Commented Sep 24, 2024 at 18:30
  • As freakish said, that way the user can scroll through page after page of images without delay. I might add two levels of compressed images and use the 20 MB originals only for zooming in or for print. Commented Sep 24, 2024 at 21:05
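
The "you just need the URLs" suggestion from the first comment can be sketched as plain browser JS: fetch each pre-signed URL as a Blob and trigger a save via a temporary `<a download>` element. This is a minimal sketch with hypothetical function names; it assumes the bucket's CORS policy allows GET requests from your page's origin, and (per the later comments) it will still be fragile for anything approaching 20 GB, since a dropped connection means restarting that file.

```javascript
// Derive a local filename from a pre-signed URL's path, ignoring the
// signature query string (X-Goog-Signature etc.).
function filenameFromUrl(url) {
  const path = new URL(url).pathname;
  return decodeURIComponent(path.substring(path.lastIndexOf("/") + 1));
}

// Fetch one URL as a Blob and trigger a browser "save" via a temporary <a> tag.
async function downloadOne(url) {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP ${response.status} for ${url}`);
  const blob = await response.blob();
  const objectUrl = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = objectUrl;
  a.download = filenameFromUrl(url);
  a.click();
  URL.revokeObjectURL(objectUrl); // release the Blob's memory
}

// Download sequentially so we don't hold hundreds of Blobs in memory at once.
async function downloadAll(signedUrls) {
  for (const url of signedUrls) {
    await downloadOne(url);
  }
}
```

Each file is buffered fully in memory before saving, which is another reason this approach only suits modest batches; for the 20 GB case, a server-side zip stream or a resumable protocol (as the second comment argues) is more realistic.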