Tsinghua Science and Technology  2020, Vol. 25 Issue (04): 447-457    doi: 10.26599/TST.2019.9010055
Complex Network Classification with Convolutional Neural Network
Ruyue Xin, Jiang Zhang*, Yitong Shao
Ruyue Xin and Jiang Zhang are with the School of Systems Science, Beijing Normal University, Beijing 100875, China.
Yitong Shao is with the School of Mathematical Sciences, Beijing Normal University, Beijing 100875, China.


Classifying large-scale networks into several categories and distinguishing them according to their fine structures is of great importance to several real-life applications. However, most studies of complex networks focus on the properties of a single network; few address the classification, clustering, and comparison of different networks, in which each network is treated as a whole. Conventional methods can hardly be applied to networks directly because of the non-Euclidean nature of the data. In this paper, we propose a novel framework, the Complex Network Classifier (CNC), which integrates network embedding and a convolutional neural network to tackle the network classification problem. By training the classifier on synthetic complex network data, we show that the CNC not only classifies networks with high accuracy and robustness but also extracts network features automatically. We also compare the CNC with baseline methods on benchmark datasets, showing that our method performs well on large-scale networks.
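The pipeline described above (embed the nodes of a network into a low-dimensional space, rasterize the embedding into an image, and feed the image to a CNN) can be sketched as follows. This is a minimal illustration, not the authors' implementation: a toy spectral embedding stands in for the DeepWalk stage, and all function names and parameters here are assumptions.

```python
import numpy as np

def embed_2d(adj):
    """Toy 2-D node embedding: the two leading eigenvectors of the
    symmetrically normalized adjacency matrix stand in for
    DeepWalk-learned vectors reduced to two dimensions."""
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0                       # guard isolated nodes
    d_inv = np.diag(1.0 / np.sqrt(deg))
    norm_adj = d_inv @ adj @ d_inv            # symmetric, so eigh applies
    _, vecs = np.linalg.eigh(norm_adj)        # eigenvalues ascending
    return vecs[:, -2:]                       # top-2 eigenvectors as coords

def rasterize(coords, grid=48):
    """Bin 2-D node coordinates into a grid of counts -- the 'image'
    that a downstream CNN classifier would consume."""
    lo, hi = coords.min(axis=0), coords.max(axis=0)
    span = np.where(hi - lo > 0, hi - lo, 1.0)
    idx = ((coords - lo) / span * (grid - 1)).astype(int)
    img = np.zeros((grid, grid))
    for x, y in idx:
        img[y, x] += 1.0                      # one count per node
    return img
```

In the paper's pipeline the 2-D coordinates come from a trained network embedding rather than this spectral shortcut, and the resulting image is classified by a small convolutional network.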

Keywords: complex network; network classification; DeepWalk; Convolutional Neural Network (CNN)
Received: 18 January 2019      Published: 13 January 2020
Corresponding Authors: Jiang Zhang   
Cite this article:

Ruyue Xin, Jiang Zhang, Yitong Shao. Complex Network Classification with Convolutional Neural Network. Tsinghua Science and Technology, 2020, 25(04): 447-457.


Fig. 1 Pipeline of the CNC algorithm.
Dataset | Size   | Enriched size | Avg. nodes | Avg. edges | Classes
NCI1    | 4110   | 12 330        | 29.80      | 32.30      | 2
COLLAB  | 5000   | 15 000        | 74.49      | 2457.78    | 3
RE_12K  | 11 929 | 35 787        | 391.40     | 456.89     | 12
Table 1  Properties of the empirical data.
Fig. 2 Loss and validation error rate of the classification task on (a) BA vs. WS models and (b) food vs. chemical products.
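The BA-vs-WS task in Fig. 2 is trained on synthetic graphs. A labeled dataset of this kind could be generated with networkx as sketched below; the graph sizes and model parameters (m=4, k=8, p=0.1) are illustrative assumptions, not the paper's exact settings.

```python
import networkx as nx
import numpy as np

def make_dataset(n_graphs=100, n_nodes=200, seed=0):
    """Synthetic training set: label 0 = Barabasi-Albert (scale-free),
    label 1 = Watts-Strogatz (small-world)."""
    rng = np.random.default_rng(seed)
    graphs, labels = [], []
    for i in range(n_graphs):
        s = int(rng.integers(1 << 30))        # independent seed per graph
        if i % 2 == 0:
            g = nx.barabasi_albert_graph(n_nodes, m=4, seed=s)
            labels.append(0)
        else:
            g = nx.watts_strogatz_graph(n_nodes, k=8, p=0.1, seed=s)
            labels.append(1)
        graphs.append(g)
    return graphs, labels
```

Each graph would then be embedded and rasterized before being passed to the classifier, with the model label as the training target.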
Fig. 3 Feature maps visualization.
Fig. 4 Active nodes for the three kernels in the first layer: (a)-(c) BA network and (d)-(f) WS network.
Fig. 5 Classification results for each pair of small-world networks with different p values.
Fig. 6 Dependence of the error rates on (a) the number of nodes and (b) the number of edges in the robustness experiments.
Fig. 7 Network representations of 10 selected products in food (top) and chemicals (bottom).
Method      | NCI1       | COLLAB     | RE_B       | RE_5K      | RE_12K
GK          | 62.28±0.29 | 72.84±0.28 | 77.34±0.18 | 41.01±0.17 | 31.82±0.08
Deep GK     | 62.48±0.25 | 73.09±0.25 | 78.04±0.39 | 41.27±0.18 | 32.22±0.10
PSCN (k=10) | 70.00±1.98 | 72.60±2.15 | 86.30±1.58 | 49.10±0.70 | 41.32±0.42
Table 2  Comparison of classification accuracy (± standard deviation). (%)