How to classify Iris flowers
I'm working on a classification problem: I want to classify Iris flowers from the famous Iris data set using an MLP. I know that the number of neurons in the output layer should equal the number of classes, but can I instead use a single output neuron whose value (-1, 0, or 1) refers to the three species, or is that then considered regression rather than classification? Thanks.
trin= [4.7 3.2 1.6 0.2;   % features: sepal length, sepal width, petal length, petal width
4.8 3.1 1.6 0.2;
5.4 3.4 1.5 0.4;
5.2 4.1 1.5 0.1;
5.5 4.2 1.4 0.2;
5.7 2.6 3.5 1;
5.5 2.4 3.8 1.1;
5.5 2.4 3.7 1;
5.8 2.7 3.9 1.2;
6 2.7 5.1 1.6;
6.7 3.3 5.7 2.1;
7.2 3.2 6 1.8;
6.2 2.8 4.8 1.8;
6.1 3 4.9 1.8;
6.4 2.8 5.6 2.1
];
trout=[-1;-1;-1;-1;-1;    % class targets: five samples per class
       0;0;0;0;0;
       1;1;1;1;1];
inp=size(trin,2);   % number of input features (4)
out=size(trout,2);  % number of output neurons (1 in this single-output setup)
hidden=2;           % hidden layer size
% flattened parameter vector: 8 input weights, 2 hidden biases, 2 output weights, 1 output bias
x=[-0.8000,-1.520,-0.9400,-3.040,3.800,2,-2,3.790,-1,0,4.600,4.400,0];
iw = reshape(x(1:hidden*inp),hidden,inp);                                      % input-to-hidden weights
b1 = reshape(x(hidden*inp+1:hidden*inp+hidden),hidden,1);                      % hidden biases
lw = reshape(x(hidden*inp+hidden+1:hidden*inp+hidden+hidden*out),out,hidden);  % hidden-to-output weights
b2 = reshape(x(hidden*inp+hidden+hidden*out+1:hidden*inp+hidden+hidden*out+out),out,1);  % output bias
% forward pass: tanh hidden layer, tanh output layer
y = tanh(tanh(trin*iw'+repmat(b1',size(trin,1),1))*lw'+repmat(b2',size(trin,1),1));
e = gsubtract(trout,y);  % error; gsubtract is from the Deep Learning Toolbox
Is this classification, or is it considered regression? Should I make the output three bits (one per class) for it to count as classification, and if so, how?
machine-learning neural-network ai
If you did that, what would your loss be?
– Robin Nicole, Dec 23 '18 at 10:44

I don't know how to make the output three bits. I found this approach (a single output) easy, but ran into the problem that it may be considered regression.
– Fahd, Dec 23 '18 at 10:54

And what loss would you use?
– Robin Nicole, Dec 23 '18 at 11:03

I really don't know.
– Fahd, Dec 23 '18 at 11:30

Maybe you should start by looking at that: if the loss you want to use is cross-entropy, it is more like classification; if the loss is mean squared error, it is closer to regression.
– Robin Nicole, Dec 23 '18 at 11:33
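The distinction in the last comment can be made concrete. Here is a minimal NumPy sketch comparing the two losses on a one-hot target; the prediction values are assumed purely for illustration, not outputs of the question's network:

```python
import numpy as np

# Hypothetical predicted probabilities for one flower over the three classes
# (assumed numbers for illustration only).
p = np.array([0.7, 0.2, 0.1])
t = np.array([1.0, 0.0, 0.0])  # one-hot target: the true class is the first one

# Categorical cross-entropy: penalizes only the probability of the true class.
cross_entropy = -np.sum(t * np.log(p))   # -log(0.7) ≈ 0.357

# Mean squared error: treats all three outputs as plain regression targets.
mse = np.mean((t - p) ** 2)              # ≈ 0.0467
```

Training with the first loss treats the output as a probability distribution over classes; training with the second treats it as a numeric target, which is the regression framing the comments warn about.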
asked Dec 23 '18 at 10:21 by Fahd
1 Answer
The Iris dataset is meant to be used for classification: you have three separate classes of irises, and attempting to solve it as a regression problem would be a mistake.

Think about your proposed solution: you want the output to be -1, 0, or +1 (for classes a, b, and c). But this implies that class a is more similar to class b than to c, and by the same principle that class c resembles class b more than a. You would be adding a prior to the model that was not there before, and you should not do that (unless you are an iris specialist).

Instead, take your class labels and convert them to a one-hot encoding:

class a = [1,0,0]
class b = [0,1,0]
class c = [0,0,1]

Then use categorical cross-entropy as your loss function.
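A minimal sketch of this recipe in Python/NumPy (the encoding itself is language-independent; the -1/0/1 labels come from the question, and everything else here is illustrative):

```python
import numpy as np

# One-hot encode the question's single-column targets (-1, 0, 1).
labels = np.array([-1, -1, 0, 0, 1, 1])   # a shortened stand-in for trout
classes = np.array([-1, 0, 1])
one_hot = (labels[:, None] == classes[None, :]).astype(float)
# one_hot[0] is [1, 0, 0]; one_hot[2] is [0, 1, 0]; one_hot[4] is [0, 0, 1]

def softmax(z):
    """Turn raw scores from 3 output neurons into class probabilities."""
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def categorical_cross_entropy(targets, probs):
    """Average negative log-probability assigned to the true class."""
    return -np.mean(np.sum(targets * np.log(probs), axis=1))

# Stand-in for the MLP's final layer: 3 outputs per sample instead of 1.
logits = np.zeros((6, 3))  # zeros give uniform probabilities of 1/3 each
loss = categorical_cross_entropy(one_hot, softmax(logits))  # log(3) ≈ 1.099
```

The network in the question would need `out = 3` (and three target columns) for this to apply; with uniform predictions the loss starts at log 3 and decreases as the model learns.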
Do you have any useful links that would help me with "use categorical cross-entropy for your loss function"?
– Fahd, Dec 30 '18 at 11:42

machinelearningmastery.com/…
– Mark.F, Dec 30 '18 at 12:38
answered Dec 23 '18 at 13:09 by Mark.F