2017 IEEE International Conference on Computational Science and Engineering (CSE) and IEEE International Conference on Embedded and Ubiquitous Computing (EUC), 2017
DOI: 10.1109/cse-euc.2017.155

Password Guessing Based on LSTM Recurrent Neural Networks

Cited by 13 publications (6 citation statements) · References 7 publications · Citing statements published between 2018 and 2022
“…Compared to traditional password guessing methods, it has a larger password space, and the generated samples are not limited to the training set. Based on the structure of the models, deep learning-based methods can be separated into two types: RNN-based password guessing [25,26] and GAN-based password guessing [27][28][29]. Melicher et al. [25] first applied deep learning to password guessing, introducing a password guessing model based on a recurrent neural network (RNN).…”
Section: Related Work - Traditional Password Guessing Methods Can Be Br...
confidence: 99%
“…The RNN outputs a character at each time step and receives it as the input for the next time step. Xu improved the network architecture by replacing the RNN with an LSTM to capture long-range dependencies [53]. Teng proposed PG-RNN, which increases the number of neurons and achieves competitive results on different datasets [54].…”
Section: Other Password-Generation Models
confidence: 99%
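Neither excerpt includes code, but the model they describe is a standard character-level LSTM language model trained on password strings. The PyTorch sketch below illustrates that architecture under stated assumptions: the class name CharLSTM and the embedding and hidden sizes are illustrative choices, not taken from the cited papers.

```python
import torch.nn as nn

class CharLSTM(nn.Module):
    """Character-level LSTM language model for password strings
    (a sketch of the architecture described above, not the paper's code)."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # char indices -> vectors
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)       # next-character logits

    def forward(self, x, state=None):
        # x: (batch, seq_len) tensor of character indices
        h, state = self.lstm(self.embed(x), state)
        return self.out(h), state
```

Training such a model amounts to next-character prediction with cross-entropy loss over password strings, which is what lets the LSTM carry context across the whole password rather than a fixed n-gram window.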
“…The RNN generated one character at each time step and used the generated character as the input for the next time step, until the terminator appeared or the maximum length was reached. In other studies, the RNN [47] was replaced with an LSTM [48], improving the capture of long-range dependencies among characters. Teng devised PG-RNN, which increased the number of neurons and achieved competitive results on multiple datasets [34].…”
Section: Other Password Generative Models
confidence: 99%
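The generation loop described in this excerpt (emit one character per step, feed it back as the next input, stop at the terminator or at the maximum length) can likewise be sketched. The helper below assumes the CharLSTM sketch above plus hypothetical <bos> and <eos> special tokens in the vocabulary; it illustrates the described procedure rather than reproducing any cited implementation.

```python
import torch

def sample_password(model, char2idx, idx2char, max_len=32):
    """Autoregressive sampling: emit one character per time step and feed
    it back as the next input, stopping at the end token or at max_len."""
    model.eval()
    bos, eos = char2idx["<bos>"], char2idx["<eos>"]  # hypothetical special tokens
    x = torch.tensor([[bos]])
    state, chars = None, []
    with torch.no_grad():
        for _ in range(max_len):
            logits, state = model(x, state)           # logits: (1, 1, vocab_size)
            probs = torch.softmax(logits[0, -1], dim=-1)
            idx = torch.multinomial(probs, 1).item()  # sample the next character
            if idx == eos:                            # terminator reached
                break
            chars.append(idx2char[idx])
            x = torch.tensor([[idx]])                 # feed the sample back in
    return "".join(chars)
```

Repeating this call yields candidate passwords in rough order of model probability mass, which is how such models are used to build guess lists.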