A non-deterministic finite state machine that generates sentences of English. (Directions: begin at box 1; the word in that box is produced. Flip a fair coin: if heads, follow the upper path; if tails, follow the lower path to the next box. Repeat.) Is this how the human brain generates language? Despite Chomsky's famous arguments to the contrary, we should remember that as long as there is an upper bound on the depth of embedding, any natural language can be characterized by a sufficiently elaborate finite state device. If the probabilities on the paths to and from each state can themselves be affected by perceptual input, we could envision a situation in which language, considered as an abstract object, is indeed finite state, while language production in situ is context sensitive.
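The coin-flipping procedure can be sketched in code. This is a minimal illustration only: the actual boxes of the figure are not reproduced here, so the words and wiring in the transition table below are hypothetical. The `p_upper` parameter models the final point, that perceptual input could bias the path probabilities away from a fair coin while the machine itself remains finite state.

```python
import random

# Hypothetical machine (the figure's real boxes are not available here).
# Each state emits a word; a coin flip then selects the upper or lower
# outgoing path. None marks a terminal path, ending the sentence.
MACHINE = {
    1: ("the",   (2, 3)),      # upper -> box 2, lower -> box 3
    2: ("old",   (4, 4)),
    3: ("young", (4, 4)),
    4: ("dog",   (5, None)),   # lower path ends the sentence
    5: ("barks", (None, None)),
}

def generate(machine, start=1, p_upper=0.5, rng=random):
    """Walk the machine from `start`, emitting one word per state.

    p_upper is the probability of taking the upper path; 0.5 is the
    fair coin of the directions, while other values model path
    probabilities shifted by perceptual input.
    """
    words, state = [], start
    while state is not None:
        word, (upper, lower) = machine[state]
        words.append(word)
        state = upper if rng.random() < p_upper else lower
    return " ".join(words)

print(generate(MACHINE))
```

Repeated calls produce the machine's small sentence language ("the old dog barks", "the young dog", and so on); making `p_upper` depend on the current state and on external input would give the context-sensitive production envisioned above without enlarging the abstract machine.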