N-bit parity neural networks: new solutions based on linear programming
2002
DOI: 10.1016/s0925-2312(01)00612-9

Cited by 22 publications (6 citation statements)
References 9 publications

Citation statements:
“…Recurrent networks may solve some problems where there is a simple rule that generates the sequence of observations, but general problems based on complex logic are not of this kind. Knowing beforehand that the data represent a parity problem allows an appropriate MLP or LSTM architecture to be set up to solve it [36,5,33,35,30,3,31,37,28,38], but for complex logic in real cases (for example, the dynamics of brain processes) it will be very difficult to guess an appropriate model. Learning Boolean functions similar to parity may indeed be a great test for methods that try to evolve a neural architecture to solve a given problem, but so far no such systems are in sight.…”
Section: Introduction (mentioning)
confidence: 99%
“…Otherwise it is the even parity problem. Since changing any single bit of the input vector flips the result, the n-bit parity problem is very hard to solve, and it is therefore widely used for testing training algorithms of neural networks [3][4][5][6][7][8][9][10].…”
Section: Introduction (mentioning)
confidence: 99%
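
As a point of reference only (not taken from the cited paper), a minimal Python sketch of the target function behind these benchmarks: it enumerates the n-bit parity dataset and checks that flipping a single input bit always flips the label, which is exactly the property the statement above describes. The helper names parity and parity_dataset are illustrative.

import itertools

def parity(bits):
    """Return 1 for odd parity, 0 for even parity."""
    return sum(bits) % 2

def parity_dataset(n):
    """Enumerate all 2**n binary input vectors with their parity labels."""
    return [(bits, parity(bits)) for bits in itertools.product((0, 1), repeat=n)]

if __name__ == "__main__":
    for x, y in parity_dataset(4):
        # Flipping the first bit always flips the label -- the property that
        # makes n-bit parity a demanding benchmark for training algorithms.
        flipped = (1 - x[0],) + x[1:]
        assert parity(flipped) != y
        print(x, "->", y)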
“…Special feedforward neural network architectures have been proposed to handle parity problems [16,30,31,28,21], but they are designed only for this special case and cannot be used for other Boolean functions, even ones very similar to parity. Learning systems are frequently tested on benchmark datasets that are almost linearly separable and relatively simple to handle, but without strong prior knowledge it is very hard to find a satisfactory solution for really complex problems.…”
Section: Introduction (mentioning)
confidence: 99%
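
To make the kind of special-purpose construction mentioned above concrete, here is a minimal Python sketch of the classic hand-crafted threshold network for parity (illustrative only; it is not the linear-programming solution proposed in the cited paper): n hidden step units, where unit i fires when at least i input bits are active, combined with alternating +1/-1 output weights. All function names are hypothetical.

import itertools

def step(z):
    """Hard-threshold activation."""
    return 1 if z >= 0 else 0

def parity_net(x):
    """Hand-crafted feedforward threshold network computing n-bit parity."""
    n = len(x)
    s = sum(x)
    # Hidden unit i fires iff at least i input bits are set.
    hidden = [step(s - i + 0.5) for i in range(1, n + 1)]
    # With k bits set, exactly the first k hidden units fire, and the
    # alternating sum 1 - 1 + 1 - ... over k terms equals k mod 2.
    return sum(((-1) ** i) * h for i, h in enumerate(hidden))

if __name__ == "__main__":
    n = 5
    for x in itertools.product((0, 1), repeat=n):
        assert parity_net(x) == sum(x) % 2
    print("hand-crafted network reproduces", n, "-bit parity on all inputs")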