How to use the Keras Reshape layer with two None dimensions?

I have a Keras 3D/2D model. In this model, a 3D layer has a shape of [None, None, 4, 32]. I want to reshape it into [None, None, 128]. However, if I simply do the following:

reshaped_layer = Reshape((-1, 128))(my_layer)

reshaped_layer has a shape of [None, None, 128], and therefore I cannot apply any 2D convolution afterwards, like:

conv_x = Conv2D(16, (1,1))(reshaped_layer)

I've tried using tf.shape(my_layer) and tf.reshape, but I have not been able to compile the model, since tf.reshape is not a Keras layer.

Just to clarify: I'm using channels last, and this is plain Keras, not tf.keras. Here is a debug screenshot of the reshape function: Reshape in keras

This is what I'm doing right now, following anna-krogager's advice:

from keras import backend as K
from keras.layers import Lambda, Conv2D

def reshape(x):
    # Build the target shape at runtime from the dynamic input shape
    x_shape = K.shape(x)
    new_x_shape = K.concatenate([x_shape[:-2], [x_shape[-2] * x_shape[-1]]])
    return K.reshape(x, new_x_shape)

reshaped = Lambda(reshape)(x)
reshaped.set_shape([None, None, None, 128])
conv_x = Conv2D(16, (1, 1))(reshaped)

I get the following error: ValueError: The channel dimension of the inputs should be defined. Found None

You can use K.shape to get the shape of your input (as a tensor) and wrap the reshaping in a Lambda layer as follows:

def reshape(x):
    x_shape = K.shape(x)
    new_x_shape = K.concatenate([x_shape[:-2], [x_shape[-2] * x_shape[-1]]])
    return K.reshape(x, new_x_shape)

reshaped = Lambda(reshape)(x)
reshaped.set_shape([None, None, None, a * b]) # when x is of shape (None, None, a, b)

This will reshape a tensor with shape (None, None, a, b) to (None, None, a * b).
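For intuition, the same shape arithmetic can be checked outside Keras with plain NumPy (a sketch only; the fixed leading dimensions stand in for the None axes):

```python
import numpy as np

def merge_last_two(x):
    # Same arithmetic as the Lambda body above: build the target shape
    # from the runtime shape, then collapse the last two axes into one.
    new_shape = x.shape[:-2] + (x.shape[-2] * x.shape[-1],)
    return x.reshape(new_shape)

x = np.arange(2 * 3 * 4 * 32).reshape(2, 3, 4, 32)
y = merge_last_two(x)

print(y.shape)  # (2, 3, 128)
# Row-major order is preserved: element (i, j, k, l) lands at (i, j, k*32 + l)
assert y[1, 2, 3 * 32 + 5] == x[1, 2, 3, 5]
```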


Digging into base_layer.py, I found that reshaped is:

<tf.Tensor 'lambda_1/Reshape:0' shape=(?, ?, ?, 128) dtype=float32>

However, its attribute "_keras_shape" is (None, None, None, None) even after the set_shape. Therefore, the solution is to set this attribute as well:

from keras import backend as K
from keras.layers import Lambda, Conv2D

def reshape(x):
    # Build the target shape at runtime from the dynamic input shape
    x_shape = K.shape(x)
    new_x_shape = K.concatenate([x_shape[:-2], [x_shape[-2] * x_shape[-1]]])
    return K.reshape(x, new_x_shape)

reshaped = Lambda(reshape)(x)
reshaped.set_shape([None, None, None, 128])      # fixes the TF-level static shape
reshaped._keras_shape = (None, None, None, 128)  # fixes the Keras-level shape cache
conv_x = Conv2D(16, (1, 1))(reshaped)


Since you are reshaping, the best you can obtain from (4, 32) without losing a dimension is either (128, 1) or (1, 128). Thus you can do the following:

# original has shape [None, None, None, 4, 32] (including batch)

reshaped_layer = Reshape((-1, 128))(original)  # shape is [None, None, 128]
expanded = Lambda(lambda t: K.expand_dims(t, axis=-2))(reshaped_layer)  # shape is [None, None, 1, 128]
conv_layer = Conv2D(16, (1, 1))(expanded)  # shape is [None, None, 1, 16]

Note that K.expand_dims is wrapped in a Lambda layer: in plain Keras, feeding the raw output of a backend op directly into Conv2D breaks the layer graph.
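The shape bookkeeping of this alternative can be traced with NumPy (a sketch; the fixed sizes stand in for the None axes):

```python
import numpy as np

batch, d1, d2 = 2, 3, 5               # stand-ins for the three None axes
x = np.zeros((batch, d1, d2, 4, 32))  # [None, None, None, 4, 32]

# Reshape((-1, 128)): every non-batch axis except the 128 channels collapses
reshaped = x.reshape(batch, -1, 128)  # [None, None, 128]

# expand_dims(axis=-2) re-creates a width-1 spatial axis for Conv2D
expanded = np.expand_dims(reshaped, axis=-2)  # [None, None, 1, 128]
print(expanded.shape)  # (2, 15, 1, 128)
```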





Comments
  • I've tried your solution but, then reshaped has a shape of [None, None, None] and when I do a Conv2D of reshaped I get the following error: ValueError: The channel dimension of the inputs should be defined. Found None.
  • Try adding reshaped.set_shape([None, None, 128]) before passing it on to the Conv2D layer
  • Even though it seems to work: reshaped is <tf.Tensor 'lambda_1/Reshape:0' shape=(?, ?, ?, 128) dtype=float32> When I try to apply a Conv2D I still get the following error: ValueError: The channel dimension of the inputs should be defined. Found None.
  • Are you using channels first or channels last in your Conv2D layer? For me it works (with channels last, which is the default). I suppose you get the error because of channels first, and your data indeed has None as its first dimension. Where does this None come from? You may want to consider whether your architecture even makes sense.
  • I've just posted a debug screenshot. I'm using channels last, my model was working correctly when the input_shape was defined (128, 128, 10, 1). However, when I try to develop a model with an input_shape of (None, None, 10, 1) it breaks.