2020
DOI: 10.1007/978-3-030-55789-8_60
Increasing the Inference and Learning Speed of Tsetlin Machines with Clause Indexing

Cited by 6 publications (2 citation statements)
References 6 publications
“…By introducing real-valued clause weights, the number of clauses can be reduced by up to 50× without accuracy loss [14]. Also, the logical inference structure of TMs makes it possible to index the clauses on the features that falsify them, increasing inference and learning speed by up to an order of magnitude [15]. In [16], stochastic searching on the line automata [17] learns integer clause weights, performing on-par or better than Random Forest, Gradient Boosting, Neural Additive Models, StructureBoost and Explainable Boosting Machines.…”
Section: Introduction
Confidence: 99%
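The clause-indexing idea quoted above can be illustrated with a minimal sketch. A conjunctive clause is falsified as soon as a single one of its literals disagrees with the input, so clauses can be indexed by the literals they contain and discarded in bulk rather than evaluated one literal at a time. The names below (`build_index`, `evaluate_indexed`) and the set-based data structures are illustrative assumptions, not the paper's actual implementation.

```python
from collections import defaultdict

def build_index(clauses):
    """Map each literal (feature, polarity) to the ids of clauses that include it.
    A clause is a set of literals: (k, True) requires x_k = 1,
    (k, False) requires x_k = 0."""
    index = defaultdict(set)
    for c_id, clause in enumerate(clauses):
        for literal in clause:
            index[literal].add(c_id)
    return index

def evaluate_indexed(clauses, index, x):
    """Evaluate all clauses on binary input x via the index.
    For each input bit, every clause containing the opposite literal is
    falsified; surviving non-empty clauses output 1."""
    falsified = set()
    for k, bit in enumerate(x):
        # The literal demanding the opposite bit disagrees with x.
        falsified |= index[(k, not bool(bit))]
    return [1 if (c_id not in falsified and clauses[c_id]) else 0
            for c_id in range(len(clauses))]
```

For example, with `clauses = [{(0, True), (1, False)}, {(1, True)}]` and input `[1, 0]`, only the second clause is falsified, so the outputs are `[1, 0]`. The actual speed-up reported in the paper comes from maintaining such an index during both inference and learning, so only affected clauses are touched.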
“…Further, hyper-parameter search can be simplified with multi-granular clauses, eliminating the pattern specificity parameter (Gorji et al, 2019). By indexing the clauses on the features that falsify them, up to an order of magnitude faster inference and learning has been reported (Gorji et al, 2020). Additionally, regression TMs compare favourably with Regression Trees, Random Forest Regression and Support Vector Regression (Abeyrathna, Granmo, Zhang, & Goodwin, 2019).…”
Section: Introduction
Confidence: 99%
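The real-valued clause weighting mentioned in the first citation statement (reducing the number of clauses by up to 50× without accuracy loss) boils down to a weighted vote: one clause with a large weight stands in for many unit-weight copies. A minimal sketch, with an illustrative function name not taken from the cited work:

```python
def class_score(clause_outputs, weights):
    """Weighted clause vote for one class: each clause's 0/1 output is
    scaled by a signed real-valued weight, so positive-polarity clauses
    add evidence and negative-polarity clauses subtract it."""
    return sum(w * c for w, c in zip(weights, clause_outputs))
```

For instance, `class_score([1, 0, 1], [2.0, 5.0, -1.5])` yields `0.5`: the falsified middle clause contributes nothing, while the two active clauses vote with their weights.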