Abstract
The essence of knowledge representation learning is to embed a knowledge graph into a low-dimensional vector space so that knowledge becomes computable and inferable. Semantic discriminate models greatly improve the performance of knowledge embedding through increasingly complex feature engineering; for example, matrix-based projections can capture more detailed semantic interactions and achieve higher accuracy. However, such complex feature engineering results in high time complexity and heavy parameter pressure, making these models difficult to apply effectively to large-scale knowledge graphs. TransGate was proposed to relieve the parameter pressure of semantic discriminate models and obtains better performance with far fewer parameters. We find that the gate filtering vector produced by the traditional gate used in TransGate rapidly falls into a nearly binary-valued distribution (most values are close to 0 or 1) after only a few hundred training rounds. As a result, most gate values either let an information element pass completely or block it entirely, a pattern we call extreme filtering. We argue that this filtering pattern ignores the interactions between information elements. In this paper, we propose the TransMVG model, which turns the traditional boundary binary-valued gate into a multiple-valued gate while preserving randomness. Experimental results show that TransMVG outperforms state-of-the-art baselines, which indicates that multivaluing the gate filtering vectors is both feasible and necessary for gate-based knowledge representation learning.
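To make the distinction concrete, the sketch below contrasts a plain shared sigmoid gate, which saturates once its pre-activation grows large and so produces the nearly binary-valued distribution described above, with one hypothetical way to keep the gate multiple-valued: normalizing the pre-activation and injecting zero-mean noise. The gate form W_h·h + W_r·r + b, the normalization step, and the noise_scale parameter are illustrative assumptions for this toy example, not the exact design from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def shared_sigmoid_gate(h, r, W_h, W_r, b):
    # Plain sigmoid gate in the spirit of TransGate's shared gate: once the
    # pre-activation grows large in magnitude, most outputs saturate near
    # 0 or 1, i.e. the "extreme filtering" described in the abstract.
    return sigmoid(W_h @ h + W_r @ r + b)

def multiple_valued_gate(h, r, W_h, W_r, b, noise_scale=1.0):
    # Hypothetical multiple-valued gate: normalize the pre-activation and add
    # zero-mean noise so the sigmoid output stays spread across (0, 1) instead
    # of collapsing to {0, 1}. This is only one way to realize the idea and is
    # not necessarily the mechanism used by TransMVG itself.
    z = W_h @ h + W_r @ r + b
    z = (z - z.mean()) / (z.std() + 1e-8)
    return sigmoid(z + noise_scale * rng.standard_normal(z.shape))

# Toy demonstration with deliberately large weights (a saturated regime).
dim = 8
h, r = rng.standard_normal(dim), rng.standard_normal(dim)
W_h = 5.0 * rng.standard_normal((dim, dim))
W_r = 5.0 * rng.standard_normal((dim, dim))
b = np.zeros(dim)

print("plain gate:          ", np.round(shared_sigmoid_gate(h, r, W_h, W_r, b), 3))
print("multiple-valued gate:", np.round(multiple_valued_gate(h, r, W_h, W_r, b), 3))
```

In this toy setting the plain gate's outputs cluster at 0 or 1, while the normalized, perturbed pre-activation yields a spread of intermediate gate values; the actual multiple-valued gate construction is given in the body of the paper.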
References
Bollacker, K., Evans, C., Paritosh, P., Sturge, T., Taylor, J.: Freebase: a collaboratively created graph database for structuring human knowledge. In: Proceedings of the 2008 ACM SIGMOD International Conference on Management of data, pp. 1247–1250 (2008)
Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. In: Advances in Neural Information Processing Systems, pp. 2787–2795 (2013)
Bordes, A., Weston, J., Collobert, R., Bengio, Y.: Learning structured embeddings of knowledge bases. In: Twenty-Fifth AAAI Conference on Artificial Intelligence, pp. 301–306 (2011)
Cai, L., Wang, W.Y.: KBGAN: adversarial learning for knowledge graph embeddings. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 1470–1480 (2018)
Dettmers, T., Minervini, P., Stenetorp, P., Riedel, S.: Convolutional 2D knowledge graph embeddings. In: The Thirty-Second AAAI Conference on Artificial Intelligence, pp. 1811–1818 (2018)
Garcia-Duran, A., Niepert, M.: KBLRN: end-to-end learning of knowledge base representations with latent, relational, and numerical features. In: Proceedings of UAI (2017)
Guo, S., Wang, Q., Wang, L., Wang, B., Guo, L.: Knowledge graph embedding with iterative guidance from soft rules. In: Thirty-Second AAAI Conference on Artificial Intelligence, pp. 4816–4823 (2018)
He, S., Liu, K., Ji, G., Zhao, J.: Learning to represent knowledge graphs with gaussian embedding. In: Proceedings of the 24th ACM International on Conference on Information and Knowledge Management, pp. 623–632 (2015)
Jenatton, R., Roux, N.L., Bordes, A., Obozinski, G.: A latent factor model for highly multi-relational data. In: Advances in Neural Information Processing Systems, pp. 3167–3175 (2012)
Ji, G., He, S., Xu, L., Liu, K., Zhao, J.: Knowledge graph embedding via dynamic mapping matrix. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, vol. 1, pp. 687–696 (2015)
Ji, G., Liu, K., He, S., Zhao, J.: Knowledge graph completion with adaptive sparse transfer matrix. In: Thirtieth AAAI Conference on Artificial Intelligence, pp. 985–991 (2016)
Li, Z., et al.: Towards binary-valued gates for robust LSTM training. In: Proceedings of the 35th International Conference on Machine Learning (2018)
Lin, Y., Liu, Z., Sun, M., Liu, Y., Zhu, X.: Learning entity and relation embeddings for knowledge graph completion. In: Twenty-ninth AAAI Conference on Artificial Intelligence, pp. 2181–2187 (2015)
Miller, G.: WordNet: a lexical database for English. Commun. ACM 38, 39–41 (1995)
Nguyen, D.Q., Nguyen, T.D., Nguyen, D.Q., Phung, D.: A novel embedding model for knowledge base completion based on convolutional neural network. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 327–333 (2018)
Schlichtkrull, M., Kipf, T.N., Bloem, P., Van Den Berg, R., Titov, I., Welling, M.: Modeling relational data with graph convolutional networks. In: European Semantic Web Conference, pp. 593–607 (2018)
Shi, B., Weninger, T.: ProjE: embedding projection for knowledge graph completion. In: Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, pp. 1236–1242 (2017)
Socher, R., Chen, D., Manning, C.D., Ng, A.Y.: Reasoning with neural tensor networks for knowledge base completion. In: Advances in Neural Information Processing Systems, pp. 926–934 (2013)
Tan, Z., Zhao, X., Wang, W.: Representation learning of large-scale knowledge graphs via entity feature combinations. In: Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, pp. 1777–1786 (2017)
Toutanova, K., Chen, D.: Observed versus latent features for knowledge base and text inference. In: Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality, pp. 57–66 (2015)
Trouillon, T., Welbl, J., Riedel, S., Gaussier, E., Bouchard, G.: Complex embeddings for simple link prediction. In: Proceedings of the 33rd International Conference on Machine Learning, pp. 2071–2080 (2016)
Wang, Z., Zhang, J., Feng, J., Chen, Z.: Knowledge graph embedding by translating on hyperplanes. In: Twenty-Eighth AAAI Conference on Artificial Intelligence, pp. 1112–1119 (2014)
Xiao, H., Huang, M., Zhu, X.: TransG: a generative model for knowledge graph embedding. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, vol. 1, pp. 2316–2325 (2016)
Yang, B., Yih, W., He, X., Gao, J., Deng, L.: Embedding entities and relations for learning and inference in knowledge bases. In: Proceedings of the International Conference on Learning Representations (2015)
Yuan, J., Gao, N., Xiang, J.: TransGate: knowledge graph embedding with shared gate structure. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, pp. 3100–3107 (2019)
Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9, 1735–1780 (1997)
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Guo, X., Gao, N., Yuan, J., Wang, X., Wang, L., Kang, D. (2020). TransMVG: Knowledge Graph Embedding Based on Multiple-Valued Gates. In: Huang, Z., Beek, W., Wang, H., Zhou, R., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2020. WISE 2020. Lecture Notes in Computer Science(), vol 12342. Springer, Cham. https://doi.org/10.1007/978-3-030-62005-9_21
DOI: https://doi.org/10.1007/978-3-030-62005-9_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-62004-2
Online ISBN: 978-3-030-62005-9