I have a React SPA hosted on S3. I'm fronting it with CloudFront solely to get an SSL certificate with a custom domain - I honestly don't much care about CloudFront's caching functionality here, although I suppose it's a nice perk.
Everything works fine, except when I do an update to the S3 bucket, the page remains cached in Chrome. If I clear the application memory in Chrome, it goes out and gets the most recent version.
I'm deploying the app by creating a production build, and then uploading everything to S3 and issuing a CF invalidation:
```
echo "Pushing to S3."
aws s3 sync --acl public-read ./ s3://${DEPLOY_BUCKET} \
  --metadata-directive REPLACE \
  --cache-control 'max-age=0,public,s-maxage=2'

echo "Invalidating CloudFront cache."
aws cloudfront create-invalidation \
  --distribution-id ${DEPLOY_DISTRIBUTION} \
  --paths '/' '/index.html' '/login' '/*'
```

The request to CloudFront seems to return reasonable headers to me:
```
accept-ranges: bytes
cache-control: max-age=0,public,s-maxage=2
content-length: 716
content-type: text/html
date: Wed, 18 May 2022 17:15:36 GMT
etag: "660fb0d86eb442f658af12ead96b2c83"
last-modified: Wed, 18 May 2022 16:55:25 GMT
server: AmazonS3
via: 1.1 ....cloudfront.net (CloudFront)
x-amz-cf-id: eLXohvep_...==
x-amz-cf-pop: BOS50-C1
x-amz-version-id: 8V0DR...
x-cache: Miss from cloudfront
```

Specifically, the cache-control header shows a max-age of 0, so I thought that should tell Chrome not to cache things and to always go out and check for a new version.
CloudFront is set up to use the origin headers, so it should be pulling those from S3.
What am I missing? How do I get Chrome/CloudFront/S3 to always check for the latest version of the application?
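One thing I've been considering (a sketch, untested against my actual setup) is splitting the cache policy instead of using one blanket header: serve index.html with no-cache so browsers always revalidate it, and serve the content-hashed build assets with a long max-age. A tiny helper to pick the header per file might look like this; the file names and patterns are my assumptions based on a typical Create React App build, not something in my deploy script above:

```javascript
// Sketch: choose a Cache-Control value per uploaded file.
// (File patterns assume a typical CRA build; adjust to your layout.)
function cacheControlFor(key) {
  // index.html (and the service worker script, if any) must always be
  // revalidated, or a new deploy won't be noticed until caches expire.
  if (key === 'index.html' || key === 'service-worker.js') {
    return 'no-cache';
  }
  // CRA emits content-hashed asset names (e.g. main.3b57f1c2.js), so
  // everything else changes name on every rebuild and can be cached
  // aggressively.
  return 'public,max-age=31536000,immutable';
}
```

That would turn the single `s3 sync` into two uploads: index.html (and the worker script) with no-cache, everything else with the long max-age.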
Update: the app includes a call to registerServiceWorker() that one of our devs ripped from somewhere. Disabling it seems to get around the problem, but is there a way to get it to respect the cache-control settings? I.e., even if you have this in local cache, if it's older than X you should still try and check for a new version...
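One idea I'm toying with, rather than disabling the worker outright: keep the registration, but ask the browser to re-check the service worker script on an interval. ServiceWorkerRegistration.update() re-fetches the worker script and installs a new version if its bytes changed. A sketch (the helper name and interval are my own, not from our registerServiceWorker()):

```javascript
// Hypothetical helper: given a ServiceWorkerRegistration, check for a new
// worker immediately and then on an interval, so a long-lived tab still
// notices new deploys. Returns a function that cancels the checks.
function scheduleUpdateChecks(registration, checkIntervalMs) {
  registration.update(); // check once right away
  const timer = setInterval(() => registration.update(), checkIntervalMs);
  return () => clearInterval(timer);
}
```

It could be wired in from the registration callback, e.g. `navigator.serviceWorker.register('/service-worker.js').then((reg) => scheduleUpdateChecks(reg, 60 * 60 * 1000));` - though the new worker still only takes over on the next page load unless you also handle the updatefound event.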