Machine Learning Pseudo-Natural Language for Temporal Logic Requirements of Embedded Systems
Abstract
Requirements formalization is a critical part of any
verification methodology for embedded systems like those in the
automotive industry. There is a strong tension between techniques
that capture requirements as logic- or code-like formal expressions
and others that use natural language. The former are much
safer but require user training and suffer from low productivity. As
a compromise we proposed a context-free grammar for entering
real-time system requirements and translating them to temporal
logic (TL) unambiguously and reversibly. It has been demonstrated
on hundreds of examples and validated by a recent
patent. But building or extending the grammar itself requires
a precise understanding of the translation rules. To alleviate this
new hurdle, we have found that neural networks inspired by natural language processing (NLP)
can learn and then replace the pseudo-English-to-TL translation,
and allow it to be extended without the explicit use of a grammar. The
paper explains how we mixed real-life and synthetic datasets and
overcame the initial limitations of the neural nets.
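
For intuition only, and not an example taken from the paper, the following sketch shows the kind of pseudo-English requirement and one plausible metric temporal logic (MTL) rendering of it; the atomic propositions, the timing bound, and the operator notation are hypothetical and may differ from the paper's actual grammar.

% Illustrative sketch only; the paper's grammar, operators, and naming
% conventions may differ. Hypothetical pseudo-English requirement:
%   "Whenever the brake pedal is pressed, the brake light shall turn on
%    within 50 ms."
% One possible MTL rendering, with hypothetical atomic propositions:
\[
  \mathbf{G}\bigl(\mathit{pedal\_pressed} \;\rightarrow\; \mathbf{F}_{[0,\,50\,\mathrm{ms}]}\ \mathit{brake\_light\_on}\bigr)
\]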