3-dimensional array as input with an Embedding layer and LSTM in Keras
Hey guys, I have built an LSTM model that works, and now I am trying (unsuccessfully) to add an Embedding layer as the first layer.
This solution didn't work for me.
I also read these questions before asking:
Keras input explanation: input_shape, units, batch_size, dim, etc.,
Understanding Keras LSTMs, and the Keras examples.
My input is a one-hot encoding (ones and zeros) of characters from a language whose alphabet has 27 letters. I chose to represent each word as a sequence of 10 characters, so each word has input shape (10, 27). I have 465 words, so X_train.shape is (465, 10, 27), and the labels have shape y_train.shape (465, 1). My goal is to train the model and, in the process, learn character embeddings.
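For concreteness, here is a minimal sketch of how arrays with these shapes could be built (the alphabet string, padding character, and helper names are illustrative assumptions, not my actual preprocessing):

import numpy as np

alphabet = "abcdefghijklmnopqrstuvwxyz_"  # 27 symbols; '_' is an assumed padding character
char_to_idx = {c: i for i, c in enumerate(alphabet)}

def word_to_one_hot(word, length=10):
    word = word[:length].ljust(length, "_")  # truncate/pad each word to 10 characters
    one_hot = np.zeros((length, len(alphabet)), dtype=np.float32)
    for t, c in enumerate(word):
        one_hot[t, char_to_idx[c]] = 1.0
    return one_hot  # shape (10, 27)

words = ["hello", "world"]  # stand-in for the 465 real words
X_train = np.stack([word_to_one_hot(w) for w in words])  # shape (len(words), 10, 27)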
Now, this is the model that compiles and fits:
from keras.layers import Input, Dense, LSTM, Bidirectional
from keras.models import Model

main_input = Input(shape=(10, 27))  # 10 timesteps, 27 one-hot features each
rnn = Bidirectional(LSTM(5))
x = rnn(main_input)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_train, y_train, epochs=10, batch_size=1, verbose=1)
After adding the Embedding layer:
from keras.layers import Embedding

main_input = Input(shape=(10, 27))
emb = Embedding(input_dim=2, output_dim=10)(main_input)  # maps every 0/1 entry to a vector: output is (batch, 10, 27, 10)
rnn = Bidirectional(LSTM(5))
x = rnn(emb)  # LSTM expects ndim=3, but emb is ndim=4
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_train, y_train, epochs=10, batch_size=1, verbose=1)
Output:
ValueError: Input 0 is incompatible with layer bidirectional_31: expected ndim=3, found ndim=4
How do I fix the output shape?
Your ideas would be much appreciated.
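For reference, one common resolution (a sketch under the assumption that integer-encoded input is acceptable, not necessarily the only fix): Embedding expects integer token IDs rather than one-hot rows, so each scalar in the (10, 27) input currently gets its own embedding vector, which is where the fourth dimension comes from. Feeding 10 character indices per word keeps the embedding output 3-dimensional:

import numpy as np
from keras.layers import Input, Dense, LSTM, Bidirectional, Embedding
from keras.models import Model

X_train_ids = np.argmax(X_train, axis=-1)  # (465, 10, 27) one-hot -> (465, 10) integer character IDs

main_input = Input(shape=(10,))  # 10 integer character IDs per word
emb = Embedding(input_dim=27, output_dim=10)(main_input)  # output: (batch, 10, 10), ndim=3
x = Bidirectional(LSTM(5))(emb)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_train_ids, y_train, epochs=10, batch_size=1, verbose=1)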
keras tensorflow nlp word-embeddings
asked 2 days ago by Art (new contributor)
Did you try adding a Flatten layer after the embedding? As the error says, the problem is that your embedding layer added an extra dimension. – Jeremie, 2 days ago

I could flatten it, but an LSTM needs a 3-dimensional array as input. Do you mean that I could reshape it afterwards? – Art, 2 days ago
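If the reshape idea from the comments is what's wanted, one way to sketch it (keeping the one-hot input, which makes the shapes line up even if embedding the raw 0/1 values is of questionable use) is to merge the two trailing axes back into one feature axis per timestep:

from keras.layers import Input, Dense, LSTM, Bidirectional, Embedding, Reshape
from keras.models import Model

main_input = Input(shape=(10, 27))
emb = Embedding(input_dim=2, output_dim=10)(main_input)  # (batch, 10, 27, 10)
flat = Reshape((10, 27 * 10))(emb)  # back to ndim=3: (batch, 10, 270)
x = Bidirectional(LSTM(5))(flat)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')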
0 answers