2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489466

Learning Device Models with Recurrent Neural Networks

Abstract: Recurrent neural networks (RNNs) are powerful constructs capable of modeling complex systems, up to and including Turing Machines. However, learning such complex models from finite training sets can be difficult. In this paper we empirically show that RNNs can learn models of computer peripheral devices through input and output state observation. This enables automated development of functional software-only models of hardware devices. Such models are applicable to any number of tasks, including device validation…
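As a rough illustration of the setup the abstract describes, the sketch below trains an RNN to predict a peripheral's output signals from its observed input signals. Everything concrete here (the GRU cell, 8 signal lines, hidden size, loss function) is an assumption chosen for illustration, not the authors' published architecture.

import torch
import torch.nn as nn

# Hypothetical setup: the device exposes 8 binary input lines and 8 binary
# output lines, observed as aligned sequences (one timestep per transaction).
NUM_SIGNALS = 8

class DeviceModel(nn.Module):
    """Sketch of an RNN predicting device outputs from input history."""
    def __init__(self, hidden_size=64):
        super().__init__()
        self.rnn = nn.GRU(NUM_SIGNALS, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, NUM_SIGNALS)

    def forward(self, inputs):
        # inputs: (batch, time, NUM_SIGNALS) tensor of 0/1 signal levels
        states, _ = self.rnn(inputs)
        return self.head(states)  # per-timestep output logits

model = DeviceModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # each output line is an independent bit

def train_step(inputs, targets):
    """One supervised step on an observed (input, output) trace pair."""
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    return loss.item()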

Cited by 2 publications (3 citation statements) | References 24 publications
“…However, beside DIFT-based monitoring security policies it is necessary to devise efficient and scalable verification techniques and coverage metrics that are tailored for security policies and complement existing functional verification. Another interesting direction to support DIFT would be to automatically learn the data flow relations, e.g., based on neural networks [96], within a SystemC-based peripheral to avoid a manual DIFT integration into every peripheral.…”
Section: Challenges and Future Work
confidence: 99%
“…• MLDS-DS1 (41,000): RNNs Mimicking Simple Machines. The MLDS datasets are available to download and contain the trained networks in PyTorch format, a JSON file per example containing just the weights of the network, and a README describing the directory structure.…”
Section: Components
confidence: 99%
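For readers who want to inspect those per-example JSON weight files, a hypothetical loader might look like the sketch below. The quote only says the files contain "just the weights of the network"; the exact key layout (parameter name mapped to nested lists) is an assumption, not documented behavior of the MLDS release.

import json
import torch

def load_weights(path, model):
    """Load one MLDS-style JSON weight file into a PyTorch module.
    Assumes keys match model.state_dict() names and values are nested
    lists of floats -- an illustrative guess at the file layout."""
    with open(path) as f:
        weights = json.load(f)
    state_dict = {name: torch.tensor(values) for name, values in weights.items()}
    model.load_state_dict(state_dict)
    return model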
“…MLDS-DS1 contains 41,000 neural networks trained to mimic one of 5 simple machines of increasing complexity shown in Figure 3.2.1. These machines are variants of those originally introduced in [5], and consist of 8 input signals that can be driven high or low, and produce 8 output signals, based on the previous sequence of input signals. We use these simple machines as they are both easy to train and easy to modify with a backdoor as we'll see in MLDS-DS2.…”
Section: MLDS-DS1: RNNs Mimicking Simple Machines
confidence: 99%
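The quote does not reproduce the five machines from [5], but a stand-in with the same interface (8 binary inputs per step, 8 outputs that depend on the full input history) shows what "mimicking" means as a sequence-learning task. The parity rule below is purely illustrative and is not one of the actual machines.

import torch

NUM_SIGNALS = 8

def toy_simple_machine(input_trace):
    """Stand-in for a simple machine (not a definition from [5]): each
    output line is the running parity (XOR) of that line's input history,
    so every output depends on the entire previous input sequence."""
    parity = torch.zeros(NUM_SIGNALS)
    outputs = []
    for step in input_trace:          # step: (NUM_SIGNALS,) of 0/1
        parity = (parity + step) % 2  # accumulate per-line parity
        outputs.append(parity.clone())
    return torch.stack(outputs)

# Generate one random training trace: 32 timesteps of high/low inputs.
inputs = torch.randint(0, 2, (32, NUM_SIGNALS)).float()
targets = toy_simple_machine(inputs)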