Keras — Transfer learning — changing Input tensor shape
This post seems to indicate that what I want to accomplish is not possible. However, I'm not convinced -- given what I've already done, I don't see why it can't be achieved...
I have two image datasets where one has images of shape (480, 720, 3) while the other has images of shape (540, 960, 3).
I initialized a model using the following code:
from keras.applications.vgg16 import VGG16
from keras.layers import Input, Flatten, Dense, BatchNormalization, Dropout
from keras.models import Model

input = Input(shape=(480, 720, 3), name='image_input')
initial_model = VGG16(weights='imagenet', include_top=False)
for layer in initial_model.layers:
    layer.trainable = False
x = Flatten()(initial_model(input))
x = Dense(1000, activation='relu')(x)
x = BatchNormalization()(x)
x = Dropout(0.5)(x)
x = Dense(1000, activation='relu')(x)
x = BatchNormalization()(x)
x = Dropout(0.5)(x)
x = Dense(14, activation='linear')(x)
model = Model(inputs=input, outputs=x)
model.compile(loss='mse', optimizer='adam', metrics=['mae'])
Now that I've trained this model on the former dataset, I'd like to pop off the input layer and prepend a new input tensor whose shape matches the image dimensions of the latter dataset:
model = load_model('path/to/my/trained/model.h5')
old_input = model.pop(0)
new_input = Input(shape=(540, 960, 3), name='image_input')
x = model(new_input)
m = Model(inputs=new_input, outputs=x)
m.save('transfer_model.h5')
which yields this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/aicg2/.local/lib/python2.7/site-packages/keras/engine/topology.py", line 2506, in save
save_model(self, filepath, overwrite, include_optimizer)
File "/home/aicg2/.local/lib/python2.7/site-packages/keras/models.py", line 106, in save_model
'config': model.get_config()
File "/home/aicg2/.local/lib/python2.7/site-packages/keras/engine/topology.py", line 2322, in get_config
layer_config = layer.get_config()
File "/home/aicg2/.local/lib/python2.7/site-packages/keras/engine/topology.py", line 2370, in get_config
new_node_index = node_conversion_map[node_key]
KeyError: u'image_input_ib-0'
In the post that I linked, maz states that a dimension mismatch prevents changing the input layer of a model -- if that were the case, how is it that I put a (480, 720, 3) input layer in front of the VGG16 model, which expects (224, 224, 3) images?
I think a more likely issue is that my former model's output is expecting something different from what I'm giving it, based on what fchollet says in this post. I'm syntactically confused, but I believe the whole x = Layer()(x) segment constructs the model piece by piece from input to output, and simply throwing a different input in front breaks it. I really have no idea, though...
Can somebody please enlighten me as to how to accomplish what I'm trying to do or, if it's not possible, explain why not?
keras
asked Jul 27 '17 at 3:20
aweeeezy
– tktktk0711, May 22 '18 at 10:03: have you solved it?
2 Answers
You can do this by creating a new VGG16 model instance with the new input shape new_shape and copying over all the layer weights. The code is roughly:

from keras.applications.vgg16 import VGG16

# here `model` is the trained VGG16 base (initial_model in the question)
new_model = VGG16(weights=None, input_shape=new_shape, include_top=False)
for new_layer, layer in zip(new_model.layers[1:], model.layers[1:]):
    new_layer.set_weights(layer.get_weights())
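Why the convolutional weights transfer at all: a conv kernel's weight shape depends only on the kernel size and the channel counts, never on the input's height or width, so the same weights apply at any spatial resolution. A minimal single-channel sketch in plain NumPy (not Keras, just an illustration) makes this concrete:

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive single-channel 'valid' 2-D correlation, just to show that
    the kernel's weights are independent of the input's spatial size."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

k = np.random.rand(3, 3)                       # one 3x3 kernel: 9 weights
a = conv2d_valid(np.random.rand(48, 72), k)    # "small" image
b = conv2d_valid(np.random.rand(54, 96), k)    # "large" image
print(a.shape, b.shape)                        # (46, 70) (52, 94)
```

The same 9 weights produce a valid output for both input sizes; only the output's spatial extent changes. This is why new_layer.set_weights(layer.get_weights()) succeeds for every convolutional layer even though the input shape changed, while a layer whose weight shape encodes the spatial size (a Dense layer after Flatten) would not.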
– BachT, Dec 23 '18 at 1:26: Tried this with InceptionV3, and it gets slower and slower as the loop continues.
Here is another solution, not specific to the VGG model.
Note that the weights of the dense layer cannot be copied (and will thus be newly initialized); this makes sense, because the shape of those weights differs between the old and the new model.

import keras
import numpy as np

def get_model():
    old_input_shape = (20, 20, 3)
    model = keras.models.Sequential()
    model.add(keras.layers.Conv2D(9, (3, 3), padding="same", input_shape=old_input_shape))
    model.add(keras.layers.MaxPooling2D((2, 2)))
    model.add(keras.layers.Flatten())
    model.add(keras.layers.Dense(1, activation="sigmoid"))
    model.compile(loss='binary_crossentropy', optimizer=keras.optimizers.Adam(lr=0.0001), metrics=['acc'])
    model.summary()
    return model

def change_model(model, new_input_shape=(None, 40, 40, 3)):
    # replace input shape of first layer
    model._layers[1].batch_input_shape = new_input_shape

    # feel free to modify additional parameters of other layers, for example...
    model._layers[2].pool_size = (8, 8)
    model._layers[2].strides = (8, 8)

    # rebuild model architecture by exporting and importing via json
    new_model = keras.models.model_from_json(model.to_json())
    new_model.summary()

    # copy weights from old model to new one; layers whose weight shapes
    # no longer match (e.g. the dense layer) raise and keep fresh weights
    for layer in new_model.layers:
        try:
            layer.set_weights(model.get_layer(name=layer.name).get_weights())
        except ValueError:
            print("Could not transfer weights for layer {}".format(layer.name))

    # test new model on a random input image
    X = np.random.rand(10, 40, 40, 3)
    y_pred = new_model.predict(X)
    print(y_pred)

    return new_model

if __name__ == '__main__':
    model = get_model()
    new_model = change_model(model)
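To make the dense-layer caveat concrete for the shapes in the question: VGG16's five 2x2 max-pool stages floor-halve each spatial dimension, so the flattened feature length differs between the two datasets, and the first Dense layer's weight matrix cannot fit both. A quick back-of-the-envelope check (plain Python; the figures assume the standard VGG16 layout with five poolings and 512 output channels):

```python
def flat_features(h, w, channels=512, n_pools=5):
    # each 2x2 / stride-2 'valid' pooling floor-halves both spatial dims
    for _ in range(n_pools):
        h, w = h // 2, w // 2
    return h * w * channels

old = flat_features(480, 720)   # 15 * 22 * 512 = 168960
new = flat_features(540, 960)   # 16 * 30 * 512 = 245760
print(old, new)                 # 168960 245760
```

A Dense(1000) layer trained on the first dataset therefore holds a 168960 x 1000 kernel, which cannot be loaded into the 245760 x 1000 slot the new input shape requires; hence the "Could not transfer weights" branch above.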
answered Feb 2 '18 at 15:41, edited Feb 2 '18 at 19:31
nil
answered 2 days ago
gebbissimo
Thanks for contributing an answer to Data Science Stack Exchange!