
I want to use 3 CNN layers together with 3 GRU layers. Here is the architecture:

from keras.layers import Input
from keras.models import Model

layer_a = Input(shape=(120,), dtype='float32', name='main')
layer_b = Input(shape=(9,), dtype='float32', name='site')
layer_c = Input(shape=(4,), dtype='float32', name='access')
# ... the CNN and GRU stack producing layer_f is omitted here ...
model = Model(inputs=[layer_a, layer_b, layer_c], outputs=[layer_f])
model.compile(optimizer='adam', loss=smape_error)  # smape_error: custom SMAPE loss

But when I tried to fit into my data, it produces an error:

Input 0 is incompatible with layer gru_14: expected ndim=3, found ndim=2.

I'm not sure what went wrong. Any ideas?

Allan Tanaka

1 Answer


GRU layers need input of shape (batch_size, seq_len, features), but by default a GRU returns a 2-D tensor of shape (batch_size, units). So in order to stack two GRU layers, the first GRU layer needs the parameter return_sequences=True, as in the sketch below.
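Here is a minimal sketch of stacking GRUs (the input shape and unit counts are illustrative assumptions, not your actual data):

from keras.layers import Input, GRU
from keras.models import Model

inp = Input(shape=(10, 8))                 # (seq_len=10, features=8), assumed for illustration
x = GRU(32, return_sequences=True)(inp)    # returns (batch, 10, 32), still a sequence
x = GRU(32, return_sequences=True)(x)      # returns (batch, 10, 32)
out = GRU(32)(x)                           # last GRU returns (batch, 32)
model = Model(inp, out)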

Also, when building Keras models it's always a good idea to use model.summary() to debug (just build the part of the model before the error appears). Often the problem lies in an unexpected shape.
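For example (a sketch only; the Dense layer just stands in for whatever layers you have already built):

from keras.layers import Input, Dense
from keras.models import Model

inp = Input(shape=(120,))
hidden = Dense(16)(inp)        # stand-in for the layers built so far
partial = Model(inp, hidden)   # model truncated just before the failing GRU
partial.summary()              # prints every layer's output shape for inspection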

Your architecture is not suited for using GRU layers at all. First, you can't flatten the tensors, because flattening destroys their sequence-like structure and makes it impossible to concatenate the branches back into something a GRU can use. Instead, you could conv and pool your three branches layer_t, layer_tt and layer_ttt down to the same second (time) dimension, which should be greater than 1. That way you can concatenate along the last dimension and get a tensor with a sequence-like shape to feed into a GRU layer, as sketched below.
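A sketch of that idea, using the input shapes from your question (the filter counts, pooling sizes, the common time dimension of 4, and the final Dense layer are all assumptions for illustration):

from keras.layers import (Input, Reshape, Conv1D, MaxPooling1D,
                          Concatenate, GRU, Dense)
from keras.models import Model

layer_a = Input(shape=(120,), dtype='float32', name='main')
layer_b = Input(shape=(9,), dtype='float32', name='site')
layer_c = Input(shape=(4,), dtype='float32', name='access')

# Branch 1: (120,) -> (120, 1), then conv/pool down to 4 time steps
layer_t = Reshape((120, 1))(layer_a)
layer_t = Conv1D(32, 3, padding='same', activation='relu')(layer_t)
layer_t = MaxPooling1D(pool_size=30)(layer_t)        # 120 / 30 = 4 steps

# Branch 2: (9,) -> (9, 1), pooled down to 4 time steps
layer_tt = Reshape((9, 1))(layer_b)
layer_tt = Conv1D(32, 3, padding='same', activation='relu')(layer_tt)
layer_tt = MaxPooling1D(pool_size=2)(layer_tt)       # floor(9 / 2) = 4 steps

# Branch 3: (4,) -> (4, 1), already 4 time steps
layer_ttt = Reshape((4, 1))(layer_c)
layer_ttt = Conv1D(32, 3, padding='same', activation='relu')(layer_ttt)

# All branches now have shape (batch, 4, 32); concatenate on the LAST axis
# so the result (batch, 4, 96) keeps a sequence-like shape for the GRU.
merged = Concatenate(axis=-1)([layer_t, layer_tt, layer_ttt])

x = GRU(64, return_sequences=True)(merged)   # 3-D in, 3-D out
x = GRU(64)(x)                               # last GRU returns 2-D
layer_f = Dense(1)(x)                        # assumed single regression output

model = Model(inputs=[layer_a, layer_b, layer_c], outputs=[layer_f])
model.summary()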

dennis-w
  • Hi, I've set return_sequences=True in the first GRU layer but the error still persists... – Allan Tanaka Apr 04 '18 at 13:18
  • You have to do this in every GRU layer except the last one. Also check which shape the first GRU layer receives as input. – dennis-w Apr 04 '18 at 13:19
  • The input to the first GRU layer is the concatenated CNN output of shape (None, 15552). I have updated the input shape for the respective layer above. I use return_sequences for the first two GRU layers but the error still persists... Not sure how to reshape the input for the GRU. – Allan Tanaka Apr 04 '18 at 13:45
  • Note this is time-series data, not words, so I don't think an Embedding layer is necessary as GRU input, is it? – Allan Tanaka Apr 04 '18 at 13:56
  • I updated my answer. The problem is that you can't use a tensor of shape (None, 15552) as input to a GRU layer. Using an Embedding layer could be an alternative solution besides the one in my answer. – dennis-w Apr 04 '18 at 14:11
  • This answer exactly solved my issue. I used the GRU layers as: `model.add(GRU(vec_size, return_sequences=True)); model.add(GRU(vec_size))` – Prabath Jul 03 '21 at 14:28