
I'm building a fully convolutional neural network that inputs and outputs an image. I want my images to be of different sizes, and resizing or adding padding doesn't suit me.

As stated in Can Keras deal with input images with different size?, I can build such a model by specifying input_shape = (1, None, None), but how should I prepare the dataset that I feed to my network?

I have this function for loading images for fixed image size:

def load_images(path):
    all_images = []
    for image_path in sorted(os.listdir(path)):
        img = imread(path + image_path, as_gray=True)
        all_images.append(img)
    return np.array(all_images).reshape(len(all_images), img_size, img_size, 1)

How should I change it so that 2 dimensions of the output numpy array are not fixed? np.reshape allows only one dimension to be unknown.
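Since np.array cannot build one dense array from images with different heights and widths, one option (a hypothetical sketch, not from the question; load_images_variable is an illustrative name) is to return a dtype=object array, where each element is one image with its own batch axis of 1:

```python
import numpy as np

def load_images_variable(images):
    # 'images' stands in for the arrays imread would return; each may
    # have its own (height, width). We store them in a dtype=object
    # array, one entry per image, each reshaped to (1, H, W, 1) so it
    # can be fed to the network one sample at a time.
    out = np.empty(len(images), dtype=object)
    for i, img in enumerate(images):
        h, w = img.shape[:2]
        out[i] = img.reshape(1, h, w, 1)
    return out

# Square grayscale images of different sizes, as in the question:
imgs = [np.zeros((50, 50)), np.zeros((64, 64)), np.zeros((100, 100))]
batches = load_images_variable(imgs)
```

Each element can then be passed to model.fit or model.predict individually, since the model's spatial dimensions are None.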

  • Depends on your model: if your model is fully convolutional, for example (as stated in the question), then sure, no problem. But if you have a Dense/LSTM layer you won't be able to do this. Commented Dec 4, 2019 at 22:38
  • Yes, my model contains only convolutional layers, so I know it should theoretically be possible. But I don't understand how to correctly pass data to the model. Commented Dec 4, 2019 at 22:47
  • What does the distribution of image sizes look like? Can they be generally grouped into large/small or portrait/landscape type categories that could be made to match with minor adjustment, or does it vary widely? Commented Dec 4, 2019 at 23:01
  • The variation is pretty small: it's always a square image with a size of about 50-100 pixels. But it is important for my task to maintain the original image size. Commented Dec 4, 2019 at 23:46

1 Answer 1


I'm not entirely sure if this will solve your problem.

Here's my approach. It unfortunately relies on the fact that you can create each batch from data sharing the same (height, width), although the (height, width) can differ between batches. This is what image_gen() does.
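The batching constraint above can be sketched without TensorFlow: group images by their (height, width) so each batch is internally homogeneous (a minimal illustration; bucket_by_size is a hypothetical helper, not from the answer):

```python
import numpy as np
from collections import defaultdict

def bucket_by_size(images):
    # Group images so every batch shares one (height, width);
    # different buckets may have different shapes, which is exactly
    # the property the generator-based approach relies on.
    buckets = defaultdict(list)
    for img in images:
        buckets[img.shape[:2]].append(img)
    # Stack each bucket into a dense (N, H, W, C) batch.
    return {size: np.stack(batch) for size, batch in buckets.items()}

imgs = [np.zeros((64, 64, 1)), np.zeros((50, 50, 1)), np.zeros((64, 64, 1))]
batches = bucket_by_size(imgs)
```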

Then you can directly create a dataset the following way and train your model.

import numpy as np
import tensorflow as tf

def image_gen():
    # Each yielded (input, target) batch has a single (height, width),
    # but the size varies randomly between batches.
    for _ in range(100):
        rand = np.random.choice([0, 1, 2])
        res = [(np.random.normal(size=(5, 256, 256, 3)), np.random.normal(size=(5, 256, 256, 3))),
               (np.random.normal(size=(5, 128, 128, 3)), np.random.normal(size=(5, 128, 128, 3))),
               (np.random.normal(size=(5, 64, 64, 3)), np.random.normal(size=(5, 64, 64, 3)))]
        yield res[rand]

# The None entries in output_shapes leave height and width unspecified.
dataset = tf.data.Dataset.from_generator(
    image_gen,
    output_types=(tf.float32, tf.float32),
    output_shapes=([5, None, None, 3], [5, None, None, 3]))

it = dataset.make_initializable_iterator()
with tf.Session() as sess:
    sess.run(it.initializer)
    model.fit(dataset)

And the model:

from tensorflow.keras import layers, models

inp = layers.Input(shape=(None, None, 3))
out = layers.Conv2D(32, (3, 3), strides=(2, 2), padding='same')(inp)
out = layers.Conv2D(64, (3, 3), strides=(2, 2), padding='same')(out)
out = layers.Conv2DTranspose(32, (3, 3), strides=(2, 2), padding='same')(out)
out = layers.Conv2DTranspose(3, (3, 3), strides=(2, 2), padding='same')(out)

model = models.Model(inputs=inp, outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()
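One caveat worth noting (my addition, not part of the original answer): with padding='same', a stride-2 Conv2D produces ceil(H/2) outputs and a stride-2 Conv2DTranspose produces 2*H, so a model like the one above only returns the exact input size when the side length is divisible by 4. A quick arithmetic check:

```python
import math

def fcn_output_size(h, downs=2, ups=2):
    # Mirror the model above: two stride-2 convolutions (padding='same'
    # gives ceil(h/2) each), followed by two stride-2 transposed
    # convolutions (each doubles the size).
    for _ in range(downs):
        h = math.ceil(h / 2)
    for _ in range(ups):
        h = h * 2
    return h
```

For the 50-100 px squares in the question, a size like 50 comes back as 52 (50 → 25 → 13 → 26 → 52), so the output would need cropping, or the inputs restricting to multiples of 4.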

3 Comments

It causes 'AttributeError: 'DatasetV1Adapter' object has no attribute 'ndim''. Or am I missing something?
Which TF version are you using? I tested with 1.15
Sorry, my mistake. I accidentally imported from keras, not tensorflow.keras. Yes, it works now, thank you
