Neuron Constraint Code

Neuron Constraints

Neuron Constraints make it possible to embed an Artificial Neural Network (ANN) in a Constraint Programming (CP) model. A neuron constraint models a single neuron in an ANN and has the signature:

actfunction([X_i], Y, b, [w_i])

where [X_i] is a vector of variables representing the neuron inputs, Y is a variable representing the neuron output, b is the bias value, and [w_i] is the vector of neuron weights. The term "actfunction" stands for a specific activation function (e.g. a step or a sigmoid). The constraint enforces bound consistency on the formula:

Y = actfunction(b + sum_i w_i * X_i)

We have implemented, on top of Google or-tools, a prototype version of three neuron constraints, corresponding to the activation functions "hardlim" (step), "tansig" (sigmoid) and "purelin" (linear). The naming convention comes from the MATLAB Neural Network Toolbox. In detail, we have:

hardlim(x) = 0 if x < 0, 1 otherwise

tansig(x) = 2 / (1 + exp(-2*x)) - 1

purelin(x) = x
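For reference, the three definitions above can be written down and checked with a small Python sketch (illustrative only, not part of the archive):

```python
import math

def hardlim(x):
    # Step function: 0 if x < 0, 1 otherwise.
    return 0 if x < 0 else 1

def tansig(x):
    # Scaled sigmoid: 2 / (1 + exp(-2*x)) - 1,
    # mathematically equivalent to tanh(x).
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

def purelin(x):
    # Linear (identity) activation.
    return x
```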

Since or-tools does not provide support for real-valued variables, our prototypes use a finite-precision approximation. The modified signature is:

actfunction([X_i], Y, b, [w_i], p)

where the variables X_i and Y are integer-valued and p is an integer precision factor. Weights and bias are still real-valued parameters. Our constraints enforce bound consistency on the formula:

Y = round(actfunction(b * p + sum_i w_i * p * X_i))
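The finite-precision encoding can be illustrated with a short Python sketch. The helper name neuron_output is hypothetical (it is not part of the archive's API); the sketch simply evaluates the formula above for a given activation function:

```python
def hardlim(x):
    # Step function: 0 if x < 0, 1 otherwise.
    return 0 if x < 0 else 1

def purelin(x):
    # Linear (identity) activation.
    return x

def neuron_output(act, xs, b, ws, p):
    """Integer neuron output under the finite-precision encoding:
    Y = round(act(b * p + sum_i w_i * p * X_i))."""
    return round(act(b * p + sum(w * p * x for w, x in zip(ws, xs))))
```

For example, with purelin, inputs [1, 2], bias 0.5, weights [0.25, 0.5] and precision 100, the output is round(50 + 25 + 100) = 175.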

Installation Instructions

We provide an archive with the prototype Neuron Constraint code. The archive can be built via a makefile and is configured to work on an OS X 10.7 machine. The code itself should work under Windows and Linux (with modifications to the makefile), but this has not been tested.

The archive contains the C++ code for the constraints and the Python wrapper. The wrapper is designed to extend the "pywrapcp" module. To install it:

  • extract the archive in the folder where you intend to install the Neuron Constraint code;
  • build the C++ code with "make neuron_cst_test" (if something goes wrong, you can reset everything with "make clean", as usual);
  • test that the code works by running "neuron_cst_test" (if no assertion fails, everything should be fine);
  • build the Python wrapper with "make pywrapcp_ncst";
  • test that the wrapper works by running "python test_tansig.py";
  • alternatively, "make all" builds everything at once (C++ code + wrapper).

Note that these are just rough directions: if you run into any difficulty installing the code, you can contact Michele Lombardi.

As an important caveat, the code contains a couple of machine-dependent constants (e.g. the invertibility domain of tansig for a given accuracy). These values appear at the beginning of the "neuron_cst.cc" file and were computed on a 64-bit Intel machine: they may need to be recomputed when the code is ported to a different architecture.