
Neural Networks as Artificial Specifications, Revisited

EasyChair Preprint no. 811

9 pages
Date: March 5, 2019


A computer can be trained to construct a specification. Although such an artificial specification cannot be expected to perfectly capture what the programmer has in mind, it requires little manual effort to construct. Use cases that can benefit from artificial specifications include automated testing and runtime verification. However, training artificial specifications is also very hard, among other reasons because programs, and therefore also their specifications, often operate on discrete domains. Earlier experiments reported too many false positives, and the resulting debugging overhead is unlikely to be acceptable. In this paper we revisit an experiment by Vanmali et al., investigating several aspects that were left unexplored in the original work, namely the impact of using different learning modes, different aggressiveness modes, and different abstraction functions. The results are quite promising.
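To make the idea concrete, the following minimal sketch (not the paper's actual setup; all names are hypothetical, and a single-neuron perceptron stands in for a real neural network) trains a model on input/output pairs of a reference function over a small discrete domain, then uses the trained model as an automated oracle to flag a seeded defect in a buggy variant:

```python
def reference(x):
    """The intended behavior, standing in for what the programmer has in mind."""
    return 1 if x >= 5 else 0

def train_oracle(samples, epochs=50, lr=1):
    """Train a single-neuron perceptron on (input, expected output) pairs.

    The learned step function acts as an artificial specification: it only
    approximates the reference, but costs almost no manual effort to build.
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if w * x + b > 0 else 0
            err = y - pred          # classic perceptron update rule
            w += lr * err * x
            b += lr * err
    return lambda x: 1 if w * x + b > 0 else 0

def buggy(x):
    # implementation under test, with a seeded defect at x == 7
    return 1 if x >= 5 and x != 7 else 0

# the discrete domain the program operates on
domain = range(10)
samples = [(x, reference(x)) for x in domain]
oracle = train_oracle(samples)

# use the trained network as an automated test oracle
failures = [x for x in domain if buggy(x) != oracle(x)]
```

Here the training data is linearly separable, so the perceptron converges exactly and `failures` reports only the seeded defect. On realistic discrete domains the learned oracle is imperfect, and disagreements may be false positives rather than genuine bugs, which is precisely the debugging-overhead problem the paper investigates.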

Keyphrases: artificial specification, automated oracles, neural network for software testing, software testing, specification mining

BibTeX entry
BibTeX does not have the right entry type for preprints. This is a hack for producing the correct reference:

@booklet{EasyChair:811,
  author = {Wishnu Prasetya and Minh An Tran},
  title = {Neural Networks as Artificial Specifications, Revisited},
  howpublished = {EasyChair Preprint no. 811},
  doi = {10.29007/vnw8},
  year = {EasyChair, 2019}}