Model Agnostic Solution of CSPs via Deep Learning: A Preliminary Study

Publicly Accessible Resources

Abstract

Deep Neural Networks (DNNs) have been shaking the AI scene for their ability to excel at Machine Learning tasks without relying on complex, hand-crafted features. Here, we probe whether a DNN can learn how to construct solutions of a CSP without any explicit symbolic information about the problem constraints. We train a DNN to extend a partial, feasible solution by making a single, globally consistent variable assignment. The training is done over intermediate steps of the construction of feasible solutions. From a scientific standpoint, we are interested in whether a DNN can learn the structure of a combinatorial problem, even when trained on (arbitrarily chosen) construction sequences of feasible solutions. In practice, the network could also be used to guide a search process, e.g. to take into account (soft) constraints that are implicit in past solutions or hard to capture in a traditional declarative model. This research line is still at an early stage, and a number of complex issues remain open. Nevertheless, we already have intriguing results on the classical Partial Latin Square and N-Queens completion problems.
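To make the training scheme concrete, below is a minimal sketch of one possible setup; it is not the architecture or encoding used in the paper. It assumes a Partial Latin Square of order N, a one-hot encoding of the partial assignment, a small feed-forward PyTorch network, and supervised targets given by the next assignment made along an arbitrary construction sequence of a known feasible solution. The names (AssignmentNet, encode, train_step) and all hyperparameters are illustrative only.

# Illustrative sketch (not the paper's implementation): a feed-forward network
# that, given a one-hot encoding of a partial Latin square of order N, scores
# every possible (row, col, value) assignment. It is trained on snapshots taken
# along an arbitrary construction sequence of a feasible solution.
import torch
import torch.nn as nn

N = 10  # order of the Latin square (assumed)

def encode(square):
    """One-hot encode a partial square: square[i][j] in {0..N}, 0 = unassigned."""
    x = torch.zeros(N, N, N + 1)
    for i in range(N):
        for j in range(N):
            x[i, j, square[i][j]] = 1.0
    return x.flatten()

class AssignmentNet(nn.Module):
    def __init__(self, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N * N * (N + 1), hidden),
            nn.ReLU(),
            nn.Linear(hidden, N * N * N),  # one score per (row, col, value) triple
        )

    def forward(self, x):
        return self.net(x)

model = AssignmentNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(partial_square, next_assignment):
    """One supervised step: predict the (row, col, value) assignment chosen next
    in the construction of a known feasible solution."""
    row, col, value = next_assignment  # row, col in 0..N-1, value in 1..N
    target = torch.tensor([(row * N + col) * N + (value - 1)])
    logits = model(encode(partial_square).unsqueeze(0))
    loss = loss_fn(logits, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

At deployment time, a sketch like this would be queried on a partial assignment and its highest-scoring consistent (row, col, value) triple used to extend the solution, possibly inside a search procedure as discussed in the abstract.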

 

Publication available at

https://doi.org/10.1007/978-3-319-93031-2_18

 

Cite as

Galassi A., Lombardi M., Mello P., Milano M. (2018) Model Agnostic Solution of CSPs via Deep Learning: A Preliminary Study. In: van Hoeve W.-J. (ed.) Integration of Constraint Programming, Artificial Intelligence, and Operations Research. CPAIOR 2018. Lecture Notes in Computer Science, vol 10848. Springer, Cham. https://doi.org/10.1007/978-3-319-93031-2_18

 

Additional material

Presentation at CPAIOR 2018

Presentation at the OLA Workshop 2018: Deep Neural Networks for CSP, an initial investigation