We consider a two-way semantic communication model in which two sources share ideas drawn from different sets of facts. These facts can be expressed as RDF (Resource Description Framework) triples, and a set of conclusions can be derived from the logical relations between these facts. This set of conclusions depends on the current interest of the network; thus, not all combinations of facts lead to a useful conclusion. Users are interested in sharing only the facts that lead to these conclusions, and they do not want to expend extra resources on sharing facts that lead to the same conclusions. We study the worst-case semantic communication performance of this network. We provide upper and lower bounds on the communication required for each user to learn the useful facts of the other, and show that increasing the number of rounds of interaction can improve the worst-case performance over existing schemes by reducing the total number of bits transmitted.
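
As a minimal illustration of the setup (not the paper's scheme), the sketch below uses made-up triples and a toy inference rule to show how facts expressed as RDF triples can be combined through a logical relation into a conclusion that neither user can derive from its own facts alone; only the triples that participate in such a conclusion would be worth communicating.

```python
# Illustrative sketch with hypothetical facts and a toy rule; the paper's
# model, bounds, and interaction protocol are not implemented here.

# Each fact is an RDF-style (subject, predicate, object) triple.
facts_user1 = {
    ("alice", "worksAt", "hospital"),
    ("alice", "specializesIn", "cardiology"),
}
facts_user2 = {
    ("hospital", "locatedIn", "city_center"),
    ("bob", "worksAt", "library"),
}

def derive_conclusions(facts):
    """Toy rule: if X worksAt Y and Y locatedIn Z, conclude X worksIn Z."""
    conclusions = set()
    for (s1, p1, o1) in facts:
        for (s2, p2, o2) in facts:
            if p1 == "worksAt" and p2 == "locatedIn" and o1 == s2:
                conclusions.add((s1, "worksIn", o2))
    return conclusions

# Neither user's facts alone yield a conclusion, but pooling them does;
# only the two triples that participate in it need to be exchanged.
print(derive_conclusions(facts_user1))                # set()
print(derive_conclusions(facts_user2))                # set()
print(derive_conclusions(facts_user1 | facts_user2))  # {('alice', 'worksIn', 'city_center')}
```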