Hi Marvin,
To build deterministic monitors we use the construction from Marcelo
d'Amorim and Grigore Roşu: "Efficient monitoring of ω-languages"
(CAV'05, LNCS 3576), as it was described by Deian Tabakov and Moshe Y.
Vardi: "Optimized Temporal Monitors for SystemC" (RV'10, LNCS 6418).
Note that (1) the latter paper uses Spot, but when it was written,
Spot could not produce monitors, and (2) these papers describe a
construction that starts from a Büchi automaton, but we start from a
TGBA instead as that is faster. So basically: translate to TGBA,
remove useless SCCs, strip acceptance condition, determinize,
minimize.
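For reference, here is roughly what requesting such a deterministic
monitor looks like with the C++ API (a minimal sketch, not taken from
your code; the formula is just a placeholder):

  // Sketch: request a deterministic monitor for an LTL formula.
  // Compile with something like: g++ det_monitor.cc -lspot
  #include <iostream>
  #include <spot/tl/parse.hh>
  #include <spot/twaalgos/translate.hh>
  #include <spot/twaalgos/hoa.hh>

  int main()
  {
    // Placeholder formula.
    spot::formula f = spot::parse_formula("G(req -> F ack)");
    spot::translator trans;
    trans.set_type(spot::postprocessor::Monitor);        // monitor output (acceptance stripped)
    trans.set_pref(spot::postprocessor::Deterministic);  // determinize + minimize
    spot::twa_graph_ptr mon = trans.run(f);
    spot::print_hoa(std::cout, mon) << '\n';
  }

On the command line this should be equivalent to "ltl2tgba -M -D".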
To build non-deterministic monitors (that's what your code is asking
for) we do the same, except that "determinize, minimize" is replaced
by a pass of forward and backward simulation-based reductions.  (These
simulation-based reductions are presented in a Büchi context in
Section 4.2 of
https://www.lrde.epita.fr/~adl/dl/adl/babiak.13.spin.pdf
)  By default we compare the sizes of the non-deterministic and the
deterministic monitors and output the smaller of the two.
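In code, that default behavior corresponds to leaving the preference
at Small, which is where the size comparison happens (again a minimal
sketch with a placeholder formula):

  // Sketch: monitor output with the default "Small" preference, which
  // keeps whichever of the deterministic or simulation-reduced
  // non-deterministic monitor turns out smaller.
  #include <iostream>
  #include <spot/tl/parse.hh>
  #include <spot/twaalgos/translate.hh>
  #include <spot/twaalgos/hoa.hh>

  int main()
  {
    spot::formula f = spot::parse_formula("a U b");  // placeholder formula
    spot::translator trans;
    trans.set_type(spot::postprocessor::Monitor);
    trans.set_pref(spot::postprocessor::Small);      // the default preference
    spot::print_hoa(std::cout, trans.run(f)) << '\n';
  }

On the command line this is plain "ltl2tgba -M".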
I checked, and currently the only places where we give these references
are the man page of ltl2tgba
https://spot.lrde.epita.fr/man/ltl2tgba.1.html#NOTE%20ON%20GENERATING%20MON…
and the documentation of the minimize_monitor() function, which is
called indirectly by spot::translator.  But I could not find a
description of the non-deterministic case.
I will make a note to improve the documentation of
spot::translator/spot::postprocessor with more clues about what is
going on.
--
Alexandre Duret-Lutz