GCN: an introduction, in Keras¶
In this demo we implement a basic GCN layer and then apply it to the popular Zachary's karate club problem.
A karate club composed of 34 members split into two separate clubs after a conflict arose between the administrator "John A" and the instructor "Mr. Hi".
Given knowledge of the relations between the members outside the club, the problem consists in guessing the correct partition of the members between the two groups.
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.models import Model
from tensorflow.keras import backend as K
from tensorflow.keras import metrics
import numpy as np
import networkx as nx
import matplotlib.pyplot as plt
The problem is predefined in the Python networkx library, and we can load the data with a single line.
G = nx.karate_club_graph()
for v in G:
    print('%s %s' % (v, G.nodes[v]['club']))
0 Mr. Hi
1 Mr. Hi
2 Mr. Hi
3 Mr. Hi
4 Mr. Hi
5 Mr. Hi
6 Mr. Hi
7 Mr. Hi
8 Mr. Hi
9 Officer
10 Mr. Hi
11 Mr. Hi
12 Mr. Hi
13 Mr. Hi
14 Officer
15 Officer
16 Mr. Hi
17 Mr. Hi
18 Officer
19 Mr. Hi
20 Officer
21 Mr. Hi
22 Officer
23 Officer
24 Officer
25 Officer
26 Officer
27 Officer
28 Officer
29 Officer
30 Officer
31 Officer
32 Officer
33 Officer
Let us define the ground truth.
n = len(G)
Labels = np.zeros(n)
for v in G:
    Labels[v] = G.nodes[v]['club'] == 'Officer'
print(Labels)
[0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 1. 1. 0. 0. 1. 0. 1. 0. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
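Before building any model, it can help to see the two factions on the graph. A quick sketch of ours (not in the original demo; the seed argument just makes the layout reproducible): red nodes are Officers, blue nodes side with Mr. Hi.
# Visualize the ground truth: color nodes by faction.
pos_gt = nx.spring_layout(G, seed=42)   # fixed seed => reproducible layout
nx.draw(G, pos_gt, node_size=100, node_color=Labels, cmap='bwr', with_labels=True, font_size=8)
plt.show()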
Let us inspect the graph structure.
Here we show the first eigenvectors of the normalized graph Laplacian.
NL = nx.normalized_laplacian_matrix(G).toarray()
lam, ev = np.linalg.eigh(NL)   # NL is symmetric: eigh returns eigenvalues in ascending order
#print(np.dot(NL,ev[:,20]))
#print(lam[20]*ev[:,20])
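The commented lines above hint at a check of the eigendecomposition; making it explicit (a small sketch of ours):
# Each pair (lam[i], ev[:,i]) should satisfy the eigenvalue equation NL v = lam v.
i = 0
print(np.allclose(np.dot(NL, ev[:, i]), lam[i] * ev[:, i]))   # True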
pos = nx.spring_layout(G)   # positions for all nodes
plt.subplot(131)
nx.draw(G, pos, node_size=100, node_color=ev[:,0], cmap='bwr')
plt.subplot(132)
nx.draw(G, pos, node_size=100, node_color=ev[:,1], cmap='bwr')
plt.subplot(133)
nx.draw(G, pos, node_size=100, node_color=ev[:,2], cmap='bwr')
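As a classical point of comparison (a sketch of ours, not part of the GCN pipeline): the sign pattern of the Fiedler vector, i.e. the eigenvector of the second-smallest eigenvalue of the normalized Laplacian, already suggests a two-way split of the club.
# Spectral-bisection baseline: split nodes by the sign of the Fiedler vector.
order = np.argsort(lam)                      # eigenvalues in ascending order
fiedler = ev[:, order[1]]                    # eigenvector of the 2nd-smallest eigenvalue
spectral_pred = (fiedler > 0).astype(float)
# The overall sign of an eigenvector is arbitrary, so score both orientations.
acc = max(np.mean(spectral_pred == Labels), np.mean(spectral_pred != Labels))
print('Fiedler-vector accuracy:', acc)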
GCN¶
Let us now turn to the code for the GCN layer.
We try to give a fairly intuitive introduction to the topic.
Suppose we have, for each node $n$, a vector of features $X$. We are interested in using these features to compute new features, e.g. by multiplying them by some learned parameters $\Theta$.
The idea is that, in addition to the features of node $n$, we also want to take into account the structure of the graph, combining $X$ with the features of its neighbours.
For instance, if we multiply $X$ by $(I + A)$, where $A$ is the adjacency matrix, we sum together the features of each node and those of its adjacent nodes. Let us call $\hat{A} = I + A$.
A problem with $\hat{A}$ is that it is not normalized, so multiplying by it may completely change the scale of the feature vectors. To address this issue, we can multiply $\hat{A}$ by $D^{-1}$, where $D$ is the diagonal node degree matrix: in the resulting matrix, all rows sum up to 1.
In practice, the dynamics get more interesting with a symmetric normalization, i.e. $D^{-\frac{1}{2}}\hat{A}D^{-\frac{1}{2}}$, which no longer amounts to mere averaging of neighbouring nodes.
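To make the difference between the two normalizations concrete, here is a tiny worked example of ours on a 3-node path graph:
# Path graph 0-1-2 with self-loops: Ahat = I + A.
A3hat = np.array([[1., 1., 0.],
                  [1., 1., 1.],
                  [0., 1., 1.]])
deg = A3hat.sum(1)                              # degrees of Ahat: [2, 3, 2]
row_norm = np.dot(np.diag(1. / deg), A3hat)     # D^-1 Ahat: plain averaging
sym_norm = np.dot(np.diag(deg ** -.5), np.dot(A3hat, np.diag(deg ** -.5)))
print(row_norm.sum(1))   # [1. 1. 1.]
print(sym_norm.sum(1))   # rows no longer sum to 1, but the matrix is symmetric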
With the combination of the previous two tricks, we arrive at the GCN rule introduced in Kipf & Welling (ICLR 2017):
$$Z = D^{-\frac{1}{2}}\hat{A}\,D^{-\frac{1}{2}}\,X\,\Theta$$
where:
- $\hat{A}$ has dimension $n\times n$ (likewise $D$)
- X has dimension $n \times p$
- $\Theta$ has dimension $p \times q$
- the output has dimension $n \times q$
A = nx.adjacency_matrix(G).toarray()
Ahat = A + np.eye(n)                        # add self-loops: Ahat = I + A
rowsum = Ahat.sum(1)                        # degrees of Ahat
r_inv = np.power(rowsum, -.5).flatten()     # D^(-1/2)
r_inv[np.isinf(r_inv)] = 0.
r_mat_inv = np.diag(r_inv)
Anorm = np.dot(r_mat_inv, np.dot(Ahat, r_mat_inv))   # symmetric normalization
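A couple of quick sanity checks of ours on Anorm: the symmetric normalization should yield a symmetric matrix, and multiplying it with random feature and parameter matrices should reproduce the dimensions listed above (here with arbitrary $p=5$, $q=10$):
print(np.allclose(Anorm, Anorm.T))    # True: symmetric normalization
p, q = 5, 10                          # arbitrary feature/output dimensions
X_demo = np.random.normal(size=(n, p))
Theta_demo = np.random.normal(size=(p, q))
print(np.dot(Anorm, np.dot(X_demo, Theta_demo)).shape)   # (34, 10), i.e. n x q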
We now define our custom GCN layer. We use the utility function add_weight to introduce the matrix $\Theta$ of learnable parameters.
The layer expects to receive as input the already normalized matrix $A$ (in addition to $X$), so we merely compute the product $AX\Theta$.
class GCNlayer(layers.Layer):
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(GCNlayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight variable for this layer.
        # input_shape is a pair: (shape of A, shape of X); Theta maps the
        # feature dimension of X to output_dim.
        self._Theta = self.add_weight(name='Theta',
                                      shape=(input_shape[1][2], self.output_dim),
                                      initializer='glorot_uniform',
                                      trainable=True)
        super(GCNlayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        A, X = x
        # batched product A.(X.Theta): contract the last axis of A
        # with the node axis of X.Theta
        return K.batch_dot(A, K.dot(X, self._Theta), axes=[2, 1])

    def compute_output_shape(self, input_shape):
        return (None, input_shape[0][1], self.output_dim)
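Before assembling the full model, a quick smoke test of ours on the layer, feeding it a batch of one graph with three random features per node (all shapes here are illustrative):
demo_layer = GCNlayer(7)                              # arbitrary output dimension
A_in = K.constant(Anorm[np.newaxis, :, :])            # shape (1, 34, 34)
X_in = K.constant(np.random.normal(size=(1, n, 3)))   # shape (1, 34, 3)
print(demo_layer([A_in, X_in]).shape)                 # expect (1, 34, 7)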
We define a simple model composed of three GCN layers. The final layer has output dimension 1, and we pass it through the logistic (sigmoid) function to produce the probability of belonging to the "Officer" club.
noFeat = 5
Adj = layers.Input(shape=Anorm.shape)
Feat = layers.Input(shape=(n, noFeat,))
Z = GCNlayer(10)([Adj, Feat])
#Z = layers.Activation('relu')(Z)
Z = GCNlayer(10)([Adj, Z])
#Z = layers.Activation('relu')(Z)
Z = GCNlayer(1)([Adj, Z])
Zres = layers.Activation('sigmoid')(Z)
gcnmodel = Model(inputs=[Adj, Feat], outputs=[Zres])
gcnmodel.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_3 (InputLayer)            [(None, 34, 34)]     0
__________________________________________________________________________________________________
input_4 (InputLayer)            [(None, 34, 5)]      0
__________________________________________________________________________________________________
gc_nlayer_3 (GCNlayer)          (None, 34, 10)       50          input_3[0][0]
                                                                 input_4[0][0]
__________________________________________________________________________________________________
gc_nlayer_4 (GCNlayer)          (None, 34, 10)       100         input_3[0][0]
                                                                 gc_nlayer_3[0][0]
__________________________________________________________________________________________________
gc_nlayer_5 (GCNlayer)          (None, 34, 1)        10          input_3[0][0]
                                                                 gc_nlayer_4[0][0]
__________________________________________________________________________________________________
activation (Activation)         (None, 34, 1)        0           gc_nlayer_5[0][0]
==================================================================================================
Total params: 160
Trainable params: 160
Non-trainable params: 0
__________________________________________________________________________________________________
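As a sanity check, the parameter count matches the three $\Theta$ matrices: $5\times 10 = 50$, $10\times 10 = 100$ and $10\times 1 = 10$, for a total of 160 trainable parameters.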
We shall train the model starting from random features, in a semi-supervised setting where we only know the final label of Mr. Hi (node 0, label 0) and of the Officer (node 33, label 1).
The loss is measured only on these two nodes, for which we know the true labels. The connectivity of the network allows labels to propagate to adjacent nodes.
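Before writing the two-node loss explicitly, note that the same idea generalizes to any labeled subset via a masked cross-entropy; a sketch of ours (mask and y are our own illustrative variables):
# Hypothetical masked binary cross-entropy over an arbitrary labeled subset.
mask = np.zeros((1, n, 1), dtype='float32')
mask[0, [0, 33], 0] = 1.                     # 1 on the labeled nodes
y = np.zeros((1, n, 1), dtype='float32')
y[0, 33, 0] = 1.                             # target labels for the labeled nodes
eps = 1e-7                                   # numerical safety inside the logs
masked_loss = -K.sum(mask * (y * K.log(Zres + eps)
                             + (1. - y) * K.log(1. - Zres + eps))) / K.sum(mask)
# With only nodes 0 and 33 labeled, this equals (up to a factor 1/2)
# the explicit loss used below.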
loss = - K.log(1 - Zres[:,0]) - K.log(Zres[:,33])   # cross-entropy on the two labeled nodes
#loss = K.square(Zres[:,0]) + K.square(1 - Zres[:,33])
gcnmodel.add_loss(loss)
gcnmodel.compile(optimizer='nadam')
X = np.random.normal(size=(n, noFeat))
gcnmodel.fit([Anorm[np.newaxis,:], X[np.newaxis,:]], epochs=300)
Epoch 1/300
1/1 [==============================] - 3s 3s/step - loss: 1.4707
Epoch 2/300
1/1 [==============================] - 0s 7ms/step - loss: 1.4618
Epoch 3/300
1/1 [==============================] - 0s 4ms/step - loss: 1.4554
...
Epoch 298/300
1/1 [==============================] - 0s 11ms/step - loss: 0.0729
Epoch 299/300
1/1 [==============================] - 0s 9ms/step - loss: 0.0721
Epoch 300/300
1/1 [==============================] - 0s 7ms/step - loss: 0.0713
(log abridged: the loss decreases steadily from 1.4707 to 0.0713 over the 300 epochs)
<tensorflow.python.keras.callbacks.History at 0x7f78a01bf750>
predictions = gcnmodel.predict([Anorm[np.newaxis,:],X[np.newaxis,:]])[0,:,0]
#print(predictions)
predlabels = predictions > .5
#print(predlabels)
accuracy = np.sum(predlabels==Labels)/n
print(accuracy)
0.8529411764705882
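To see where the residual errors are, we can color the graph by the predicted probabilities and ring the misclassified nodes (a sketch of ours, reusing the pos layout computed earlier):
wrong = np.where(predlabels != Labels)[0]    # indices of misclassified nodes
nx.draw(G, pos, node_size=100, node_color=predictions, cmap='bwr')
nx.draw_networkx_nodes(G, pos, nodelist=list(wrong), node_size=160,
                       node_color='none', edgecolors='black')
plt.show()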
Considering that the model received a purely random feature description of the nodes, this is a remarkable result. The accuracy can be further improved by providing meaningful initial node features, e.g. derived from some node embedding, as in the experiment described in the Kipf & Welling article.
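One cheap variant in this direction, sketched below under our own naming (we make no claim about the resulting accuracy), is the "featureless" setting also used by Kipf & Welling: give each node a one-hot identifier by using the identity matrix as feature matrix, and rebuild the model with noFeat equal to $n$.
# Hypothetical variant: one-hot (identity) node features instead of random ones.
Adj2 = layers.Input(shape=Anorm.shape)
Feat2 = layers.Input(shape=(n, n))            # noFeat = n one-hot features
Z2 = GCNlayer(10)([Adj2, Feat2])
Z2 = GCNlayer(10)([Adj2, Z2])
Z2 = GCNlayer(1)([Adj2, Z2])
Zres2 = layers.Activation('sigmoid')(Z2)
model2 = Model(inputs=[Adj2, Feat2], outputs=[Zres2])
model2.add_loss(- K.log(1 - Zres2[:, 0]) - K.log(Zres2[:, 33]))
model2.compile(optimizer='nadam')
X2 = np.eye(n)                                # identity matrix = one-hot identifiers
model2.fit([Anorm[np.newaxis, :], X2[np.newaxis, :]], epochs=300, verbose=0)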