What I mean is: I'm using a "±3 standard deviations" stretch to visualize my data, but the underlying statistics change seasonally, so I need to update the actual min/max values for each image. Can I create a "±3 standard deviations" palette that automatically calculates these values for each image?
In pseudocode terms, I'm imagining something along these lines:
```
var Oa04_palette = {
  bands: ['Oa04_radiance'],
  min: sminus3,
  max: splus3,
  sminus3: standard deviation of 'Oa04_radiance' * -3,
  splus3: standard deviation of 'Oa04_radiance' * 3
};
```

So when I add a bunch of layers all using this palette, each one will always use the appropriate min-max range for that layer.
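For context, here's a minimal sketch of how I imagine this could be done with the standard Earth Engine API, assuming a mean ± 3σ stretch computed over each image's footprint. The function name `addStretchedLayer`, the 300 m scale, and the asset ID are placeholders, not anything from an existing script:

```
// Sketch: per-image stretch, assuming a mean ± 3σ range is what's wanted.
function addStretchedLayer(image, band) {
  // Compute mean and stdDev of the band in one reduceRegion call.
  var stats = image.select(band).reduceRegion({
    reducer: ee.Reducer.mean().combine({
      reducer2: ee.Reducer.stdDev(),
      sharedInputs: true
    }),
    geometry: image.geometry(),
    scale: 300,       // placeholder; match the band's native resolution
    maxPixels: 1e9
  });
  var mean = ee.Number(stats.get(band + '_mean'));
  var sd = ee.Number(stats.get(band + '_stdDev'));

  // Visualization parameters need client-side numbers, so evaluate()
  // the computed range before adding the layer.
  ee.Dictionary({
    min: mean.subtract(sd.multiply(3)),
    max: mean.add(sd.multiply(3))
  }).evaluate(function(range) {
    Map.addLayer(image.select(band), range, band);
  });
}

// Placeholder asset ID.
addStretchedLayer(ee.Image('COPERNICUS/S3/OLCI/...'), 'Oa04_radiance');
```

But calling a function per layer feels like a workaround; ideally the stretch would live in a reusable "palette" object as in the pseudocode above.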