4.7 Summary
In this section, we covered regular languages and saw two equivalent definitions:
(Machine model) Regular languages are those that are recognized by finite automata.
(Generative model) Regular languages are those that are generated by regular expressions.
Some important points:
Nondeterministic FA vs Deterministic FA We saw non-deterministic finite automata and showed that non-determinism does not give increased power: every language recognized by an NFA is also recognized by a DFA. However, non-determinism is still useful, since it is often easier to design NFAs, and the resulting NFAs can be exponentially smaller than the smallest equivalent DFAs. (A sketch of the subset construction behind this equivalence appears after this list.)
Closure Properties Using non-determinism, we also showed that the class of regular languages is closed under several operations on languages. (Standard examples of such operations include union, intersection, concatenation, and Kleene star; a sketch of the product construction for intersection appears after this list.)
DFA Minimization We also saw how, given a DFA, to produce an equivalent minimal DFA. (One standard minimization procedure, partition refinement, is sketched after this list.)
Proving Nonregularity via Fooling Sets Finally, we showed that there are languages that are not regular, using the fooling set technique. The fooling set technique allows us to show that for every DFA $M$, there is a string that $M$ incorrectly accepts or rejects. This is done by exhibiting a fooling set whose size is one larger than the number of states of $M$. By the pigeonhole principle, two of these pairwise distinguishable strings must reach the same state of $M$, so when each is extended by a distinguishing suffix, $M$ either accepts both resulting strings or rejects both, even though exactly one of them is in the language. We also saw how to use closure properties to prove nonregularity. (A worked example of the fooling set argument appears after this list.)
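The equivalence of NFAs and DFAs mentioned above is witnessed by the subset construction. Below is a minimal sketch in Python, assuming an NFA without ε-transitions represented with plain dictionaries and sets; the names (nfa_to_dfa, delta, and so on) are illustrative and not taken from this chapter.

```python
from itertools import chain

def nfa_to_dfa(alphabet, delta, start, accept):
    """Subset construction: convert an NFA (no epsilon-moves) to a DFA."""
    start_set = frozenset({start})
    dfa_states = {start_set}
    dfa_delta = {}
    worklist = [start_set]
    while worklist:
        S = worklist.pop()
        for a in alphabet:
            # On symbol a, the DFA state S moves to the union of all NFA
            # states reachable from some state in S on a.
            T = frozenset(chain.from_iterable(delta.get((q, a), ()) for q in S))
            dfa_delta[(S, a)] = T
            if T not in dfa_states:
                dfa_states.add(T)
                worklist.append(T)
    # A subset state is accepting iff it contains an accepting NFA state.
    dfa_accept = {S for S in dfa_states if S & accept}
    return dfa_states, dfa_delta, start_set, dfa_accept

# Example: NFA for binary strings whose second-to-last symbol is 1.
delta = {('q0', '0'): {'q0'}, ('q0', '1'): {'q0', 'q1'},
         ('q1', '0'): {'q2'}, ('q1', '1'): {'q2'}}
dfa = nfa_to_dfa({'0', '1'}, delta, 'q0', {'q2'})
```

Each reachable DFA state is a set of NFA states, which is why the resulting DFA can have up to $2^n$ states for an $n$-state NFA.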
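The chapter's closure constructions use non-determinism; as a complementary illustration, the product construction sketched below shows closure under intersection directly on DFAs. The tuple representation and names here are assumptions, not taken from the chapter.

```python
def product_dfa(dfa1, dfa2):
    """Product DFA recognizing the intersection of two DFA languages."""
    states1, alphabet, delta1, start1, accept1 = dfa1
    states2, _, delta2, start2, accept2 = dfa2   # assumed same alphabet
    states = {(p, q) for p in states1 for q in states2}
    # Run both DFAs in parallel: each product state tracks a state of each.
    delta = {((p, q), a): (delta1[(p, a)], delta2[(q, a)])
             for (p, q) in states for a in alphabet}
    # Accept when both components accept; using "or" here gives union instead.
    accept = {(p, q) for (p, q) in states if p in accept1 and q in accept2}
    return states, alphabet, delta, (start1, start2), accept
```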
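One standard way to minimize a DFA is partition refinement (Moore's algorithm): start by separating accepting from non-accepting states, then repeatedly split any block whose states behave differently on some symbol. The sketch below assumes a complete DFA whose unreachable states have already been removed; names are illustrative.

```python
def minimize_dfa(states, alphabet, delta, accept):
    """Partition the states of a complete DFA into equivalence classes."""
    # Initial partition: accepting vs. non-accepting states.
    partition = [set(accept), set(states) - set(accept)]
    partition = [block for block in partition if block]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Two states stay together only if, on every symbol, they move
            # into the same block of the current partition.
            groups = {}
            for q in block:
                key = tuple(next(i for i, b in enumerate(partition)
                                 if delta[(q, a)] in b) for a in alphabet)
                groups.setdefault(key, set()).add(q)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition
```

Each block of the final partition becomes one state of the minimal DFA; the transition out of a block on a symbol goes to the block containing the image of any representative state.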
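Finally, here is a worked instance of the fooling set argument. The specific non-regular language used in the diagram below is not recoverable from this text, so this sketch uses the classic example $L = \{0^n 1^n \mid n \ge 0\}$.

```latex
% Fooling-set argument for L = { 0^n 1^n : n >= 0 } (assumed example,
% not necessarily the one used elsewhere in this chapter).
Let $L = \{\, 0^n 1^n \mid n \ge 0 \,\}$ and $F = \{\, 0^i \mid i \ge 0 \,\}$.

$F$ is a fooling set for $L$: for any $i \ne j$, the suffix $1^i$
distinguishes $0^i$ from $0^j$, since $0^i 1^i \in L$ but $0^j 1^i \notin L$.

Now let $M$ be any DFA with $k$ states. The $k+1$ strings
$0^1, 0^2, \ldots, 0^{k+1}$ all lie in $F$, so by the pigeonhole principle two
of them, say $0^i$ and $0^j$ with $i \ne j$, reach the same state of $M$.
Then $0^i 1^i$ and $0^j 1^i$ also reach the same state, so $M$ either accepts
both or rejects both, even though exactly one of them is in $L$.
Hence no DFA recognizes $L$, and $L$ is not regular.
```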
4.7.1 Story so far
We summarize our current knowledge of the landscape of languages in the following diagram.

Current knowledge of the landscape of languages. The set of regular languages is exactly the set of languages recognized by finite automata, and also exactly the set of languages generated by regular expressions; the two models are equivalent. The diagram also marks an example regular language and an example non-regular language.