FST Specializer

A tool for employing FST special symbols (2017)

Finite State Transducers

FSTs, or Finite State Transducers (and their close relatives, finite state automata), are tools for building language models. They were especially popular in the era that predated deep learning (before not only transformers but even RNNs were introduced). FSTs are considered generative language models (LMs): the model is a graphical representation whose paths enumerate the strings of the language. They can also handle situations where the training data contains little evidence for particular patterns (via backoff, for instance). In general, these LMs tend to handle low-resource data better than some deep learning approaches.
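To make the backoff idea concrete, here is a minimal sketch in OpenFST's C++ API (which this repo wraps) of a bigram-style model with a failure arc. It is illustrative only: the symbol ids (kPhi, kA, kB), the weights, and the output filename are all invented for the example.

```cpp
#include <fst/fstlib.h>

int main() {
  using fst::StdArc;
  using fst::StdVectorFst;
  using Weight = StdArc::Weight;  // tropical semiring: weights are -log probs

  // Hypothetical symbol ids; label 0 is reserved for epsilon in OpenFST.
  const int kPhi = 1, kA = 2, kB = 3;

  StdVectorFst lm;
  const auto bigram = lm.AddState();   // context state: "a" was just seen
  const auto unigram = lm.AddState();  // backoff (no-context) state
  lm.SetStart(bigram);

  // Seen in training: P(a | a) gets an explicit arc.
  lm.AddArc(bigram, StdArc(kA, kA, Weight(0.7), bigram));
  // Phi-labeled backoff arc: under a PhiMatcher it is taken only when no
  // other arc matches the current symbol, without consuming input.
  lm.AddArc(bigram, StdArc(kPhi, kPhi, Weight(1.5), unigram));

  // Unigram estimates cover every symbol, so unseen bigrams still get mass.
  lm.AddArc(unigram, StdArc(kA, kA, Weight(1.2), bigram));
  lm.AddArc(unigram, StdArc(kB, kB, Weight(2.3), bigram));
  lm.SetFinal(bigram, Weight::One());

  lm.Write("backoff_lm.fst");
  return 0;
}
```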

A little more

This repo builds on the OpenFST library. The Specializer is a tool built to efficiently carry out operations between FSTs. To do that from Python, we found we needed to work directly with OpenFST's C++ code via Cython. The tool enables several operations, listed in the table below. Each takes effect when two graphical representations (lattices) are composed (or combined by a similar operation), and only when one of the special symbols appears on one of the lattices (a necessary but not sufficient condition).
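As an example, here is the phi (failure) case, following the composition pattern shown in the OpenFST documentation; this matcher configuration is only reachable from the C++ layer, which is why the Specializer goes through Cython. This is a sketch, not the repo's actual code: kPhiLabel stands for whatever label id the phi symbol was given when the FST was built, and fst2 is assumed to be input-label sorted.

```cpp
#include <fst/fstlib.h>

// Compose fst1 with an fst2 whose input side contains phi (failure) arcs.
fst::StdVectorFst ComposeWithPhi(const fst::StdFst &fst1,
                                 const fst::StdFst &fst2,
                                 int kPhiLabel) {
  using PM = fst::PhiMatcher<fst::SortedMatcher<fst::StdFst>>;

  fst::ComposeFstOptions<fst::StdArc, PM> copts;
  copts.gc_limit = 0;  // cache only the most recent state
  // No special matching on the left; treat kPhiLabel as failure on the right.
  copts.matcher1 = new PM(fst1, fst::MATCH_NONE);
  copts.matcher2 = new PM(fst2, fst::MATCH_INPUT, kPhiLabel);

  // ComposeFst is lazy; copying into a VectorFst expands the result.
  return fst::StdVectorFst(fst::ComposeFst<fst::StdArc>(fst1, fst2, copts));
}
```

Note that the phi arc is only interpreted as a failure transition because the matcher says so; in the FST itself it is an ordinary arc carrying a reserved label.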

How did I use it? In Dudy et al. (2018), we used the Specializer to create the second baseline (PreLM).

The special symbols' behavior, summarized from the OpenFST documentation:

  Symbol    Behavior during matching                                 Consumes input?
  epsilon   matches without reading a symbol                         no
  sigma     matches any symbol                                       yes
  rho       matches the "rest": any symbol with no explicit arc     yes
  phi       "failure": followed only when no explicit arc matches   no
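The consuming symbols follow the same composition pattern with their own matcher classes. Here is a sketch with RhoMatcher (SigmaMatcher is used the same way), again with an assumed label id kRhoLabel:

```cpp
#include <fst/fstlib.h>

// Compose fst1 with an fst2 whose input side contains rho ("rest") arcs:
// unlike phi, a rho arc consumes the otherwise-unmatched symbol.
fst::StdVectorFst ComposeWithRho(const fst::StdFst &fst1,
                                 const fst::StdFst &fst2,
                                 int kRhoLabel) {
  using RM = fst::RhoMatcher<fst::SortedMatcher<fst::StdFst>>;

  fst::ComposeFstOptions<fst::StdArc, RM> copts;
  copts.gc_limit = 0;
  copts.matcher1 = new RM(fst1, fst::MATCH_NONE);
  copts.matcher2 = new RM(fst2, fst::MATCH_INPUT, kRhoLabel);

  return fst::StdVectorFst(fst::ComposeFst<fst::StdArc>(fst1, fst2, copts));
}
```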

References

2018

  1. A multi-context character prediction model for a brain-computer interface
    Shiran Dudy, Steven Bedrick, Shaobin Xu, and David Smith
    In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics, 2018