*An Introduction to Information Theory*, 2nd ed., 1980

A non-deterministic finite state machine which generates sentences of English. (Directions: *begin at box 1; the word in that box is produced; flip a fair coin; if heads, follow the upper path, if tails, the lower path, to the next box. Repeat.*) Is this how the human brain generates language? Despite Chomsky's famous arguments to the contrary, we should remember that so long as there is an upper bound on the depth of embedding, any natural language may be characterized by a sufficiently elaborate finite state device. If the probabilities on the paths to and from each state could themselves be affected by perceptual input, we could envision a situation in which language, considered as an abstract object, was indeed *finite state*, but language production *in situ* was *context sensitive*.
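For concreteness, the device is easy to simulate. Since the actual diagram isn't reproduced here, the boxes and words below are invented stand-ins; the mechanics (start at box 1, emit the word, flip a fair coin, follow the upper or lower path) are as described:

```python
import random

# Invented stand-ins for the diagram's word boxes: each box holds a word
# and two outgoing paths, (word, heads-successor, tails-successor).
BOXES = {
    1: ("the", 2, 3),
    2: ("man", 4, 4),
    3: ("dog", 4, 4),
    4: ("runs", 1, 1),
}

def generate(n_words, seed=None):
    """Emit n_words by walking the boxes, flipping a fair coin at each step."""
    rng = random.Random(seed)
    box, out = 1, []
    for _ in range(n_words):
        word, heads, tails = BOXES[box]
        out.append(word)
        box = heads if rng.random() < 0.5 else tails  # fair coin flip
    return " ".join(out)
```

Note that the machine itself never halts; `generate` simply cuts the walk off after a fixed number of words, a point that comes up in the comments below.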

## 4 comments:

How is this nondeterministic? In an NFA there's more than one possible transition per state/input pair; here, there are only two transitions per box, one each for heads and tails. You get heads, you take the upper path.

Well, coin tosses, certainly of the idealized "fair" variety considered here, are generally taken to be nondeterministic. More properly, I suppose, we would have labelled both transition arrows exiting each state with the word currently residing in the respective state-box. I guess, then, one source of confusion is the nonstandard diagram (intended, presumably, for ease of understanding to the uninitiated).

A further, more interesting point, though, is one about the identity of machines. The way the machine's been described, with the coin toss performed by the "operator," indeed divorces the nondeterministic process from the machine itself, much as I propose for natural language as a whole in the sequel. Unfortunately, I don't know if there is any highly developed complexity theory for such machine "combinations" - certainly, a combination between a weaker and a stronger machine results in a machine with the same computational power as the stronger one. But the pertinent issue here is a bit more subtle. In fact, if natural language is Turing computable in practice, but the language-storing part of the brain could be emulated with only Finite State power, that would be a fascinating result.

Well, you could do the coin-tosses in advance. The machine is (partially) defined in terms of its (current state, input, next state) transitions; in an NFA the first two elements of the tuple don't uniquely determine the third and in a DFA they do—which is the relevant determinism: given the current state and the input, you know what the next state will be. I'll grant that the combination of the machine and the method of generating the input stream is nondeterministic, but in this case that's solely because the method is nondeterministic. But you won't call the machine (operating on characters) that goes:

(1, 'a', 2)

(2, 'b', 1)

with 1 as the start state and 2 as the only end state, and input symbols 'a' and 'b', which will accept strings of the form 'a', 'aba', 'ababa', etc., nondeterministic just because you generate the characters by flipping a coin. This doesn't really matter, though, since any NFA can be converted to an equivalent DFA. I take this on faith but you can read the proof.
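To spell the point out, here is that two-state machine as a transition table. Given the current state and the input symbol, the next state is fully determined, regardless of how the input string was produced:

```python
import random

# The machine described above: (current state, input) -> next state.
# Start state 1; accepting (end) state 2.
DELTA = {(1, 'a'): 2, (2, 'b'): 1}

def accepts(s):
    state = 1
    for ch in s:
        if (state, ch) not in DELTA:  # no transition defined: reject
            return False
        state = DELTA[(state, ch)]
    return state == 2

# The machine behaves identically whether its input was typed in or
# produced by coin flips; determinism is a property of the table above.
coin_input = "".join(random.choice("ab") for _ in range(5))
result = accepts(coin_input)
```

Running `accepts` on 'a', 'aba', 'ababa' returns True; on 'ab', 'ba', or the empty string it returns False.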

Well, the more I think about this particular device, the less it looks like a finite state machine. For one thing, there is no halt state, so technically you couldn't do the coin tosses in advance, because the process would never finish. Furthermore, and perhaps more importantly, the proof of the equivalence between NFAs and DFAs makes clear that there are no probabilities assigned to transitions in a non-deterministic finite state automaton; I suppose this is implicitly the same as assigning equal probabilities to each transition, as done here.
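The equivalence proof in question is the subset construction, and a minimal sketch of it makes the no-probabilities point visible: an NFA's transition function returns a *set* of possible next states, nothing more. (The example NFA in the usage note is invented for illustration.)

```python
# Minimal subset construction: each DFA state is a frozenset of NFA states.
# nfa_delta maps (state, symbol) -> set of possible next states; note that
# there are no probabilities anywhere, only possibilities.
def nfa_to_dfa(nfa_delta, start, accepting, alphabet):
    start_set = frozenset([start])
    dfa_delta, todo, seen = {}, [start_set], {start_set}
    while todo:
        S = todo.pop()
        for sym in alphabet:
            # The DFA state tracks every NFA state the machine could be in.
            T = frozenset(t for s in S for t in nfa_delta.get((s, sym), ()))
            dfa_delta[(S, sym)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    dfa_accepting = {S for S in seen if S & accepting}
    return dfa_delta, start_set, dfa_accepting

def dfa_accepts(dfa_delta, state, accepting, s):
    for ch in s:
        state = dfa_delta[(state, ch)]
    return state in accepting
```

For instance, the NFA with transitions (1,'a')→{1,2} and (2,'b')→{3}, start state 1 and accepting state 3, recognizes one or more 'a's followed by a 'b'; converting it and running `dfa_accepts` on 'ab' or 'aab' returns True, while 'a' or 'b' alone returns False.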

The diagram is from a book on Information Theory; I reexamined what the author said about it. He describes the machine as "finite state" and says it generates a "stochastic process," but makes no attempt to deal with automata theory in any rigorous way.
