
Tsetlin machine


A Tsetlin machine is an artificial intelligence algorithm based on propositional logic.

[Figure: A simple block diagram of the Tsetlin machine]

Background

A Tsetlin machine is a form of learning automaton collective for learning patterns using propositional logic. Ole-Christoffer Granmo created the method[1] and named it after Michael Lvovitch Tsetlin, who invented the Tsetlin automaton[2] and worked on Tsetlin automata collectives and games.[3] Collectives of Tsetlin automata were originally constructed, implemented, and studied theoretically by Vadim Stefanuk in 1962.

The Tsetlin machine uses computationally simpler and more efficient primitives than conventional artificial neural networks.[4]

As of April 2018, it had shown promising results on a number of test sets.[5][6]


Types

  • Original Tsetlin machine[4]
  • Convolutional Tsetlin machine[7]
  • Regression Tsetlin machine[8]
  • Relational Tsetlin machine[9]
  • Weighted Tsetlin machine[10][11]
  • Arbitrarily deterministic Tsetlin machine[12]
  • Parallel asynchronous Tsetlin machine[13]
  • Coalesced multi-output Tsetlin machine[14]
  • Tsetlin machine for contextual bandit problems[15]
  • Tsetlin machine autoencoder[16]
  • Tsetlin machine composites: plug-and-play collaboration between specialized Tsetlin machines[17][18]
  • Contracting Tsetlin machine with absorbing automata[19]
  • Graph Tsetlin machine[20]

Applications

Original Tsetlin machine

[Figure: A detailed block diagram of the original Tsetlin machine]

Tsetlin automaton


The Tsetlin automaton is the fundamental learning unit of the Tsetlin machine. It tackles the multi-armed bandit problem, learning the optimal action in an environment from penalties and rewards. Computationally, it can be seen as a finite-state machine (FSM) that changes its state based on the input events it receives and generates its output based on the current state.

  • A quintuple {Φ, α, β, F(·,·), G(·)} describes a two-action Tsetlin automaton.

  • A Tsetlin automaton has 2N states, here 6: Φ = {φ₁, φ₂, φ₃, φ₄, φ₅, φ₆}.

  • The FSM can be triggered by two input events: β = {β_Penalty, β_Reward}.

  • The rules of state migration of the FSM are stated as
      φ_{u+1} ← φ_u, if 1 ≤ u ≤ 3 and β = β_Penalty
      φ_{u−1} ← φ_u, if 4 ≤ u ≤ 6 and β = β_Penalty
      φ_{u−1} ← φ_u, if 1 < u ≤ 3 and β = β_Reward
      φ_{u+1} ← φ_u, if 4 ≤ u < 6 and β = β_Reward.

  • It includes two output actions: α = {α₁, α₂}.

  • Which action to take is given by the output function G(φ_u) = α₁ if 1 ≤ u ≤ 3, and α₂ if 4 ≤ u ≤ 6.
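The behaviour described by these rules can be made concrete with a small sketch. The Python class below is a minimal, illustrative implementation of a single two-action Tsetlin automaton with 2N states (N = 3, matching the six-state example above); the class and method names are chosen for this sketch and do not come from any particular library.

```python
class TsetlinAutomaton:
    """A two-action Tsetlin automaton with 2N states.

    States 1..N select action 1, states N+1..2N select action 2.
    Rewards push the state deeper into the current action's half,
    penalties push it toward (and eventually across) the boundary.
    """

    def __init__(self, n=3, initial_state=3):
        self.n = n                  # states per action (2N states in total)
        self.state = initial_state  # current state, in 1..2N

    def action(self):
        """G(φ_u): action 1 for states 1..N, action 2 for states N+1..2N."""
        return 1 if self.state <= self.n else 2

    def reward(self):
        """Move deeper into the current action's half (more confident)."""
        if self.state <= self.n:
            self.state = max(1, self.state - 1)
        else:
            self.state = min(2 * self.n, self.state + 1)

    def penalize(self):
        """Move toward the other action (less confident, may switch)."""
        if self.state <= self.n:
            self.state += 1
        else:
            self.state -= 1


# Example: a penalty at the boundary makes the automaton switch action.
ta = TsetlinAutomaton(n=3, initial_state=3)  # state 3 -> action 1
ta.penalize()
print(ta.action())  # 2: the state has crossed the boundary into 4..6
```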

Boolean input

A basic Tsetlin machine takes a vector X = [x₁, …, x_o] of o Boolean features as input, to be classified into one of two classes, y = 0 or y = 1. Together with their negated counterparts, ¬x_k = 1 − x_k, the features form a literal set L = {x₁, …, x_o, ¬x₁, …, ¬x_o}.
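As a small illustration of this input encoding, the literal set for a given feature vector can be computed as follows (the helper name `literals` is only for this sketch):

```python
def literals(x):
    """Return the 2*o literals [x1, ..., xo, ¬x1, ..., ¬xo] for a Boolean input x."""
    return list(x) + [1 - xk for xk in x]

# Example with o = 2 features:
print(literals([1, 0]))  # [1, 0, 0, 1]
```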

Clause computing module

A Tsetlin machine pattern is formulated as a conjunctive clause C_j, formed by ANDing a subset L_j ⊆ L of the literal set:

     C_j(X) = ⋀_{l ∈ L_j} l = ∏_{l ∈ L_j} l.

For example, the clause C_j(X) = x₁ ∧ ¬x₂ = x₁ ¬x₂ consists of the literals x₁ and ¬x₂ and outputs 1 iff x₁ = 1 and x₂ = 0.
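A minimal sketch of clause evaluation follows, under the simplifying assumption that a clause is represented by the set of indices of its included literals (in a full Tsetlin machine, a team of Tsetlin automata decides which literals to include). It reuses the `literals` helper from the previous sketch, restated here so the block runs on its own.

```python
def literals(x):
    """Literal vector [x1, ..., xo, ¬x1, ..., ¬xo] for a Boolean input x."""
    return list(x) + [1 - xk for xk in x]

def clause_output(included, lits):
    """Conjunction (AND / product) of the included literals."""
    return int(all(lits[k] for k in included))

# The clause x1 AND NOT x2, over the literal vector [x1, x2, ¬x1, ¬x2]:
included = {0, 3}
print(clause_output(included, literals([1, 0])))  # 1
print(clause_output(included, literals([1, 1])))  # 0
```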

Summation and thresholding module

The number of clauses employed is a user-configurable parameter n. Half of the clauses are assigned positive polarity; the other half are assigned negative polarity. The clause outputs, in turn, are combined into a classification decision through summation and thresholding using the unit step function u(v) = 1 if v ≥ 0, otherwise 0:

     ŷ = u( Σ_{j=1}^{n/2} C_j⁺(X) − Σ_{j=1}^{n/2} C_j⁻(X) ).

In other words, classification is based on a majority vote, with the positive clauses voting for y = 1 and the negative ones for y = 0. The classifier

     ŷ = u( x₁¬x₂ + ¬x₁x₂ − x₁x₂ − ¬x₁¬x₂ ),

for instance, captures the XOR-relation.
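The XOR classifier above can be written out directly. The sketch below restates the small `literals` and `clause_output` helpers from the previous subsections so it runs on its own; the four clause index sets correspond to the four clauses in the formula.

```python
def literals(x):
    """Literal vector [x1, ..., xo, ¬x1, ..., ¬xo] for a Boolean input x."""
    return list(x) + [1 - xk for xk in x]

def clause_output(included, lits):
    """Conjunction (AND / product) of the included literals."""
    return int(all(lits[k] for k in included))

def unit_step(v):
    """u(v) = 1 if v >= 0, otherwise 0."""
    return 1 if v >= 0 else 0

def classify(positive_clauses, negative_clauses, lits):
    """Majority vote: positive clauses vote for y = 1, negative for y = 0."""
    v = sum(clause_output(c, lits) for c in positive_clauses) \
        - sum(clause_output(c, lits) for c in negative_clauses)
    return unit_step(v)

# XOR over the literal vector [x1, x2, ¬x1, ¬x2]:
positive = [{0, 3}, {1, 2}]   # x1¬x2, ¬x1x2
negative = [{0, 1}, {2, 3}]   # x1x2,  ¬x1¬x2
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, classify(positive, negative, literals(x)))  # 0, 1, 1, 0
```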

Feedback module

Type I feedback

Type I feedback is given to clauses whose polarity matches the training target. It reinforces true positive clause outputs and combats false negatives by gradually including the literals that match the current input, while a specificity parameter s controls how fine-grained the learned patterns become.

Type II feedback

Type II feedback is given to clauses whose polarity opposes the training target. When such a clause incorrectly outputs 1, literals that evaluate to 0 in the current input are pushed toward inclusion, so the clause learns to reject similar inputs and false positives are suppressed.
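The original formulation specifies both feedback types as reward/penalty probability tables over the clause's Tsetlin automata. The sketch below is a simplified, approximate rendering of their effect; the state encoding (integer states 1..2·n_states, with values above n_states meaning "include the literal"), the function names, and the specificity value s = 3.9 are assumptions made for this example, not part of any particular library.

```python
import random

def type_i_feedback(clause_out, lits, states, s=3.9, n_states=100):
    """Simplified sketch of Type I feedback (combats false negatives).

    states[k] is the Tsetlin-automaton state for literal k: values above
    n_states mean "include the literal", values at or below mean "exclude".
    """
    for k, lit in enumerate(lits):
        if clause_out == 1 and lit == 1:
            # Reinforce literals that match the input (push toward include).
            if random.random() < (s - 1) / s:
                states[k] = min(states[k] + 1, 2 * n_states)
        else:
            # Gentle forgetting: drift toward exclude with low probability.
            if random.random() < 1 / s:
                states[k] = max(states[k] - 1, 1)

def type_ii_feedback(clause_out, lits, states, n_states=100):
    """Simplified sketch of Type II feedback (combats false positives).

    When a clause incorrectly outputs 1, zero-valued literals that are
    currently excluded are pushed toward inclusion, so the clause will
    reject similar inputs in the future.
    """
    if clause_out == 1:
        for k, lit in enumerate(lits):
            if lit == 0 and states[k] <= n_states:
                states[k] += 1
```

During training, Type I feedback is handed to clauses whose polarity matches the target, and Type II to the remaining clauses, both subject to the resource-allocation probability described in the next subsection.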

Resource allocation

Resource allocation dynamics ensure that clauses distribute themselves across the frequent patterns, rather than missing some and overconcentrating on others. That is, for any input X, the probability of reinforcing a clause gradually drops to zero as the clause output sum

     v = Σ_{j=1}^{n/2} C_j⁺(X) − Σ_{j=1}^{n/2} C_j⁻(X)

approaches a user-set target T for y = 1 (−T for y = 0).

If a clause is not reinforced, it does not give feedback to its Tsetlin automata, and these are thus left unchanged. In the extreme, when the voting sum v equals or exceeds the target T (the Tsetlin Machine has successfully recognized the input X), no clauses are reinforced. Accordingly, they are free to learn new patterns, naturally balancing the pattern representation resources.
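As a sketch of this mechanism, assume the common linear formulation in which a clause is updated with probability (T − clamp(v, −T, T)) / (2T) for target y = 1, and (T + clamp(v, −T, T)) / (2T) for y = 0; the function names below are illustrative.

```python
import random

def update_probability(v, T, y):
    """Probability of reinforcing a clause, given the current vote sum v.

    The probability decays linearly to zero as v approaches T (for y = 1)
    or -T (for y = 0), so clauses stop being updated once the target
    margin has been reached.
    """
    v = max(-T, min(T, v))  # clamp the vote sum to [-T, T]
    return (T - v) / (2 * T) if y == 1 else (T + v) / (2 * T)

def maybe_reinforce(v, T, y):
    """Stochastically decide whether a given clause receives feedback."""
    return random.random() < update_probability(v, T, y)

# Example with target T = 4 and y = 1:
for v in (-4, 0, 2, 4):
    print(v, update_probability(v, T=4, y=1))  # 1.0, 0.5, 0.25, 0.0
```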


Implementations

Software

Hardware

  • One of the first FPGA-based hardware implementations[46][47] of the Tsetlin machine, trained on the Iris flower data set, was developed by the μSystems (microSystems) Research Group at Newcastle University.
  • The same group also presented the first ASIC implementation[48][49] of the Tsetlin machine, focusing on energy frugality and claiming it could deliver 10 trillion operations per joule.[50] The ASIC design was demonstrated at DATA2020.[51]

Further reading

Books

  • An Introduction to Tsetlin Machines [52]

Conferences

  • International Symposium on the Tsetlin Machine (ISTM) [53][54][55]

Videos

Papers

  • On the Convergence of Tsetlin Machines for the XOR Operator [63]
  • Learning Automata based Energy-efficient AI Hardware Design for IoT Applications [36]
  • On the Convergence of Tsetlin Machines for the IDENTITY- and NOT Operators [64]
  • The Tsetlin Machine - A Game Theoretic Bandit Driven Approach to Optimal Pattern Recognition with Propositional Logic [4]

Publications/news/articles

  • A low-power AI alternative to neural networks [50]
  • Can a Norwegian invention revolutionise artificial intelligence?[65]

References
