What are graph embeddings?
I recently came across graph embeddings such as DeepWalk and LINE. However, I still do not have a clear idea of what is meant by graph embeddings, or when to use them (applications). Any suggestions are welcome!
graphs
asked Oct 25 '17 at 22:54 – Volka
A graph embedding is an embedding for graphs! So it takes a graph and returns embeddings for the graph, edges, or vertices. Embeddings enable similarity search and generally facilitate machine learning by providing representations.
– Emre, Oct 25 '17 at 23:31
@Emre What is meant by an embedding? :)
– Volka, Oct 26 '17 at 0:29
As the meaning of "embed" suggests: fixing things onto something. A graph embedding is like fixing vertices onto a surface and drawing edges to represent, say, a network. For example, a planar graph can be embedded on a $2D$ surface without edge crossings. Weights can be assigned to edges, and appropriate edge lengths help us understand/estimate things such as the similarity search @Emre mentioned.
– Kiritee Gak, Oct 26 '17 at 0:58
@KiriteeGak Thanks :) What are their real-world applications? They say they can be used for recommendation and the like, but how?
– Volka, Oct 26 '17 at 1:25
YouTube video recommendation can be visualised as a model where the video you are currently watching is the node you are on, and the next videos in your recommendations are the ones most similar to it, based on what similar users watched next (and many more factors, of course); it is a huge network to traverse. This paper is a good, simple read for understanding the application.
– Kiritee Gak, Oct 26 '17 at 1:34
3 Answers
Graph embedding learns a mapping from a network to a vector space, while preserving relevant network properties.
Vector spaces are more amenable to data science than graphs. Graphs consist of nodes and edges, and those network relationships can use only a specific subset of mathematics, statistics, and machine learning; vector spaces offer a richer toolset from those domains. Additionally, vector operations are often simpler and faster than the equivalent graph operations.
One example is finding nearest neighbors. In a graph you can perform "hops" from one node to another, but in many real-world graphs there is little meaningful information after a couple of hops (e.g., recommendations from friends of friends of friends). In a vector space, however, you can use distance metrics to get quantitative results (e.g., Euclidean distance or cosine similarity), and with a quantitative distance metric in a meaningful vector space, finding nearest neighbors is straightforward.
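To make the nearest-neighbor point concrete, here is a minimal sketch of a cosine-similarity lookup over node embeddings in plain NumPy; the embedding matrix is random placeholder data, not the output of any particular method:

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 64))  # hypothetical 64-d embeddings for 1,000 nodes

def nearest_neighbors(emb, query_idx, k=5):
    """Return the indices of the k nodes most similar to the query node."""
    q = emb[query_idx]
    # Cosine similarity between the query embedding and every embedding
    sims = emb @ q / (np.linalg.norm(emb, axis=1) * np.linalg.norm(q))
    sims[query_idx] = -np.inf  # exclude the query node itself
    return np.argsort(sims)[::-1][:k]

print(nearest_neighbors(emb, query_idx=0))
```

At scale you would typically swap the exact scan above (linear in the number of nodes) for an approximate nearest-neighbor index.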
"Graph Embedding Techniques, Applications, and Performance: A Survey" is an overview article that goes into greater detail.
answered Oct 26 '17 at 2:32 – Brian Spiering
What are graph embeddings?
"Graph Embeddings" is a hot area today in machine learning. It basically means finding "latent vector representation" of graphs which captures the topology (in very basic sense) of the graph. We can make this "vector representation" rich by also considering the vertex-vertex relationships, edge-information etc. There are roughly two levels of embeddings in the graph (of-course we can anytime define more levels by logically dividing the whole graph into subgraphs of various sizes):
Vertex embeddings - Here you find a latent vector representation of every vertex in the given graph. You can then compare vertices by plotting these vectors in space, and, interestingly, "similar" vertices end up closer to each other than ones that are dissimilar or less related. This is the work done in "DeepWalk" by Perozzi et al. (see the sketch at the end of this answer).
Graph embeddings - Here you find a latent vector representation of the whole graph itself. For example, given a group of chemical compounds, you may want to check which compounds are similar to each other, how many types of compounds there are in the group (clusters), etc. You can use these vectors, plot them in space, and find all of the above information. This is the work done in "Deep Graph Kernels" by Yanardag et al.
Applications -
Looking carefully, embeddings are "latent" representations: if a graph has a |V| * |V| adjacency matrix with |V| = 1M, it is hard to use or process 1M * 1M numbers in an algorithm. A latent embedding of dimension d, where d << |V|, turns that into a |V| * d matrix, which is far easier to work with. Another application: consider a simple scenario where we want to recommend products to people who have similar interests in a social network. With vertex embeddings (here, a vector representation of each person), we can find similar people in the embedding space, which makes recommendation easy. These are some applications, and there are others. You can refer to a nice survey paper - Graph Embedding Techniques, a Survey.
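A quick back-of-the-envelope check on the memory savings, assuming 4-byte floats and a hypothetical d = 128 (the exact d is not specified above):

```python
V, d = 10**6, 128
dense = V * V * 4 / 1e12   # full adjacency matrix: ~4.0 TB
latent = V * d * 4 / 1e9   # embedding matrix:      ~0.512 GB
print(f"{dense:.1f} TB vs {latent:.3f} GB")
```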
Where did it all come from? There has been a lot of work in this area, and almost all of it stems from groundbreaking research in natural language processing - "Word2Vec" by Mikolov et al. If you want to get started with research on graph embeddings, I would recommend first understanding how Word2Vec works. You can find nice explanations in "Word2Vec Parameter Learning Explained" and the Stanford lecture. Then you can jump to the papers that you listed. Those works can be categorized as:
Works based on "Vertex Embeddings": - DeepWalk, Node2Vec, LINE.
Works based on "Graph Embeddings": - Deep Graph Kernels, Subgraph2Vec.
answered Oct 27 '17 at 1:57 – flyingDope
Wowww!! This is absolutely a perfect answer. Thanks a lot :) Very well done :)
– Volka, Oct 27 '17 at 7:34
Hi Mausam Jain. Can you please let me know if I can use graph embeddings to identify important nodes in the network?
– Volka, Dec 25 '17 at 17:42
Hi, Volka. To answer this question, I need to know what type of graph you are working on; is it Twitter, Facebook, Reddit, or something else?
– flyingDope, Dec 26 '17 at 1:12
Thank you for your reply. I am actually working on a social network where I want to identify the most social people :)
– Volka, Dec 26 '17 at 4:26
The paper "A central limit theorem for an omnibus embedding of random dot product graphs" by Levin et al., which studies a specific type of graph embedding (the omnibus embedding), defines graph embedding as a methodology "in which the vertices of a graph are mapped to vectors in a low-dimensional Euclidean space." Check the link for more information.
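For intuition, here is a minimal adjacency spectral embedding sketch in plain NumPy. It illustrates mapping vertices to low-dimensional Euclidean vectors in the random-dot-product-graph setting; it is a simplified single-graph illustration, not the omnibus procedure from the paper:

```python
import numpy as np

def adjacency_spectral_embedding(A, d=2):
    """Embed each vertex as a d-dimensional vector via the top eigenpairs of A."""
    vals, vecs = np.linalg.eigh(A)            # A is symmetric for an undirected graph
    top = np.argsort(np.abs(vals))[::-1][:d]  # keep the d largest-magnitude eigenvalues
    return vecs[:, top] * np.sqrt(np.abs(vals[top]))

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = adjacency_spectral_embedding(A, d=2)      # one 2-D vector per vertex
print(X)
```

The omnibus embedding itself stacks several graphs into one large block matrix before the spectral step, so all graphs are embedded into a shared space.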
answered Dec 28 '18 at 23:56 – goatboy3million
Welcome to the forum. If you want to mention a paper, please write its name as part of the text as well (because links can break).
– Mark.F, Dec 29 '18 at 8:05