
z3 can solve optimisation problems using z3.Optimize. My goal is to apply it in the following way: Given a Boolean formula of propositional logic and a set of constraints, find all models that optimise the constraints, i.e. fulfil as many as possible. My ultimate goal is to select one of these models randomly.

Minimal example: Given the formula Or(x,y) and constraints Not(x) and Not(y), find the solutions [x: True, y: False] and [x: False, y: True].

While z3.Optimize.model() prints one optimal solution, it seems to be deterministic and to always print the solution [x: False, y: True]:

from z3 import *

x, y = Bools('x y')
s = Optimize()
s.add(Or(x, y))
s.add_soft(Not(x), weight="1")
s.add_soft(Not(y), weight="1")
s.check()
print(s.model())

When all_smt() provided by @alias is applied to this problem, list(all_smt(s, [x,y])) outputs:

[[x = False, y = True], [y = False, x = True], [y = True, x = True]] 

So all_smt() finds all solutions, including the non-optimal [x: True, y: True].

What is the best way to find all optimal solutions? Alternatively, is it possible to find an optimal solution in a random way without generating all optimal solutions first?

1 Answer


It's not quite clear what you mean by "optimal" in this context. It appears you allow the soft constraints to be violated, but require that at least one of them holds? (After all, [x=True, y=True] is a valid solution that violates both of your soft constraints; and you did allow it by explicitly calling all_smt.)

Assuming you are indeed looking for solutions that satisfy at least one of the soft constraints, you should code that requirement up explicitly and assert it as a separate predicate. The easiest way to do this is to use tracker variables:

from z3 import *

def all_smt(s, initial_terms):
    def block_term(s, m, t):
        s.add(t != m.eval(t, model_completion=True))
    def fix_term(s, m, t):
        s.add(t == m.eval(t, model_completion=True))
    def all_smt_rec(terms):
        if sat == s.check():
            m = s.model()
            yield m
            for i in range(len(terms)):
                s.push()
                block_term(s, m, terms[i])
                for j in range(i):
                    fix_term(s, m, terms[j])
                yield from all_smt_rec(terms[i:])
                s.pop()
    yield from all_smt_rec(list(initial_terms))

x, y = Bools('x y')
s = Optimize()
s.add(Or(x, y))

p1, p2 = Bools('p1 p2')
s.add(Implies(p1, Not(x)))
s.add(Implies(p2, Not(y)))

asserted = If(p1, 1, 0) + If(p2, 1, 0)
s.add(asserted >= 1)
s.minimize(asserted)

print([[(x, m[x]), (y, m[y])] for m in all_smt(s, [x, y])])

Here, we use p1 and p2 to track whether the soft constraints are asserted, minimize the number of asserted soft constraints via the variable asserted, and require that at least one of them holds. When run, the above prints:

[[(x, True), (y, False)], [(x, False), (y, True)]] 

which seems to be what you wanted to achieve in the first place.


4 Comments

Thanks! Sorry about the imprecision. By “optimal” I indeed meant “minimise violations of soft constraints”.
No worries. Then the tracker-variable variant is exactly what you want.
I seem to be getting the correct results by just asserting asserted = If(Not(x), 1, 0) + If(Not(y), 1, 0) and not adding the tracker variables at all. Now I'm wondering whether the method can work without the tracker variables, or whether this may have unintended consequences?
For this particular case, yes; it appears you don't actually need trackers. Tracker variables are a general technique you can use when your solver doesn't offer optimization, though they can also make the encoding easier in certain cases. See stackoverflow.com/questions/70771595/… for a similar problem and further pointers.
