I have a list of variable-size images and wish to standardise them to 256x256. I used the following code:
    import tensorflow as tf
    import matplotlib.pyplot as plt

    file_contents = tf.read_file('image.jpg')
    im = tf.image.decode_jpeg(file_contents)
    im = tf.image.resize_images(im, 256, 256)

    sess = tf.Session()
    sess.run(tf.initialize_all_variables())

    img = sess.run(im)
    plt.imshow(img)
    plt.show()

However, tf.image.resize_images() tends to mess up the image, whereas calling tf.reshape() first seems to allow resize_images() to work correctly.
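For what it's worth, one likely explanation (my observation, not part of the original post) is that tf.image.resize_images with the default bilinear method returns a float32 tensor, while plt.imshow treats float RGB data as values in [0, 1], so pixels in [0, 255] get clipped and the image looks washed out. A minimal sketch of the same snippet with an explicit cast back to uint8 before display:

    import tensorflow as tf
    import matplotlib.pyplot as plt

    file_contents = tf.read_file('image.jpg')
    im = tf.image.decode_jpeg(file_contents)       # uint8 tensor in [0, 255]
    im = tf.image.resize_images(im, 256, 256)      # comes back as float32
    im = tf.cast(im, tf.uint8)                     # restore a dtype imshow renders as 0-255

    sess = tf.Session()                            # no variables, so no initialisation needed
    img = sess.run(im)
    plt.imshow(img)
    plt.show()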
TensorFlow version: 0.8.0
I know the skimage package can handle what I need, but I wish to take advantage of tf.train.shuffle_batch(). I am trying to avoid maintaining two near-identical datasets (one with a fixed image size), since Caffe seems to have no problem handling variable-size images.
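As a sketch of the pipeline the post is aiming at, using the queue-based input API of the TF 0.x era (the file list and batch parameters are placeholders of mine): resizing inside the graph and pinning a static shape lets tf.train.shuffle_batch consume variable-size JPEGs directly.

    import tensorflow as tf

    # Hypothetical file list; in practice this would come from the dataset directory.
    filenames = ['image1.jpg', 'image2.jpg']

    filename_queue = tf.train.string_input_producer(filenames, shuffle=True)
    reader = tf.WholeFileReader()
    _, file_contents = reader.read(filename_queue)

    image = tf.image.decode_jpeg(file_contents, channels=3)
    image = tf.image.resize_images(image, 256, 256)   # standardise to 256x256
    image.set_shape([256, 256, 3])                    # shuffle_batch needs a static shape

    batch = tf.train.shuffle_batch([image], batch_size=32,
                                   capacity=1000, min_after_dequeue=100)

    sess = tf.Session()
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    images = sess.run(batch)   # a (32, 256, 256, 3) float32 batch
    coord.request_stop()
    coord.join(threads)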


Have you tried method=ResizeMethod.BILINEAR or method=ResizeMethod.BICUBIC? If it still fails, could you file an issue with an image that causes this, so it can be fixed on the TensorFlow side?

Only ResizeMethod.NEAREST_NEIGHBOR works; the rest of the methods produce a similar result as above. I will raise an issue on GitHub.
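For reference, a minimal sketch of how the method argument mentioned above is passed; the dtype remark is my understanding of the 0.x behaviour, not something stated in the thread.

    import tensorflow as tf

    file_contents = tf.read_file('image.jpg')
    im = tf.image.decode_jpeg(file_contents)

    # Nearest-neighbour appears to keep the original uint8 dtype, which would explain
    # why it displays correctly; bilinear/bicubic return float32 and need a cast.
    im_nn = tf.image.resize_images(im, 256, 256,
                                   method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
    im_bi = tf.cast(tf.image.resize_images(im, 256, 256,
                                           method=tf.image.ResizeMethod.BILINEAR),
                    tf.uint8)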