%0 Conference Proceedings
%T Probabilistic Embedding of Knowledge Graphs with Box Lattice Measures
%A Vilnis, Luke
%A Li, Xiang
%A Murty, Shikhar
%A McCallum, Andrew
%Y Gurevych, Iryna
%Y Miyao, Yusuke
%S Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
%D 2018
%8 July
%I Association for Computational Linguistics
%C Melbourne, Australia
%F vilnis-etal-2018-probabilistic
%X Embedding methods which enforce a partial order or lattice structure over the concept space, such as Order Embeddings (OE), are a natural way to model transitive relational data (e.g. entailment graphs). However, OE learns a deterministic knowledge base, limiting expressiveness of queries and the ability to use uncertainty for both prediction and learning (e.g. learning from expectations). Probabilistic extensions of OE have provided the ability to somewhat calibrate these denotational probabilities while retaining the consistency and inductive bias of ordered models, but lack the ability to model the negative correlations found in real-world knowledge. In this work we show that a broad class of models that assign probability measures to OE can never capture negative correlation, which motivates our construction of a novel box lattice and accompanying probability measure to capture anti-correlation and even disjoint concepts, while still providing the benefits of probabilistic modeling, such as the ability to perform rich joint and conditional queries over arbitrary sets of concepts, and both learning from and predicting calibrated uncertainty. We show improvements over previous approaches in modeling the Flickr and WordNet entailment graphs, and investigate the power of the model.
%R 10.18653/v1/P18-1025
%U https://aclanthology.org/P18-1025
%U https://doi.org/10.18653/v1/P18-1025
%P 263-272
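
The abstract above summarizes the core idea: concepts are represented as boxes whose volumes act as probabilities, so intersections give joint probabilities and disjoint boxes give the anti-correlation that order-embedding measures cannot express. The following is a minimal illustrative sketch of that box-volume calculus, not the authors' implementation; all function and variable names (box_volume, joint, conditional, dog, cat, animal) are hypothetical, and learning of box parameters is omitted.

```python
import numpy as np

# Sketch: a concept is an axis-aligned box (min_corner, max_corner) in [0, 1]^d.
# Box volume plays the role of an (unnormalized) probability; intersection volume
# gives joint probability, and ratios give conditional probabilities.

def box_volume(lo, hi):
    """Volume of the box [lo, hi]; zero if the box is empty in any dimension."""
    side = np.clip(hi - lo, 0.0, None)
    return float(np.prod(side))

def joint(box_a, box_b):
    """P(a, b): volume of the intersection of the two boxes."""
    lo = np.maximum(box_a[0], box_b[0])
    hi = np.minimum(box_a[1], box_b[1])
    return box_volume(lo, hi)

def conditional(box_a, box_b):
    """P(a | b) = P(a, b) / P(b)."""
    pb = box_volume(*box_b)
    return joint(box_a, box_b) / pb if pb > 0 else 0.0

# Disjoint boxes yield P(a, b) = 0, i.e. negative correlation / disjoint concepts.
dog = (np.array([0.1, 0.1]), np.array([0.4, 0.4]))
cat = (np.array([0.6, 0.6]), np.array([0.9, 0.9]))
animal = (np.array([0.0, 0.0]), np.array([1.0, 1.0]))

print(conditional(animal, dog))  # ~1.0: "dog" is contained in "animal"
print(joint(dog, cat))           # 0.0: disjoint boxes, disjoint concepts
```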