
Homework 01

Due Friday, February 17, 9:00am. Please hand it in before the beginning of the recitation session.

Problem 1

  1. Convert the following NFA to an equivalent DFA with 7 states (trap state is also counted).
  2. Show that $2$ states in your DFA can be merged without affecting the accepted language.

Problem 2

Let $\textit{rtail}$ be a function that returns all the symbols of a string except the last one. For example, $\textit{rtail}(\text{Lexer}) = \text{Lexe}$ and $\textit{rtail}(120) = 12$. $\textit{rtail}$ is undefined for the empty string. If $R$ is a regular language, then $\textit{rtail}(R)$ is the language obtained by applying the function to every word of $R$.

Prove that $\textit{rtail}(R)$ is regular if $R$ is not nullable.

Hint: A language is regular if and only if there exists a regular expression for it.
Bonus: Let $|v|$ denote the length of the string $v$. Let $R$ be a regular language. Is the language $\{ u \mid uv \in R, |v|=10,\ \ u,v \in \Sigma^* \}$ always regular?
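The definition of $\textit{rtail}$ can be made concrete with a short sketch; the function names and the sample language below are my own, chosen only to illustrate the statement of the problem:

```python
def rtail_word(w: str) -> str:
    """All symbols of w except the last; rtail is undefined on the empty string."""
    if not w:
        raise ValueError("rtail is undefined for the empty string")
    return w[:-1]

def rtail_lang(words):
    """Apply rtail to every word of a (finite sample of a) language."""
    return {rtail_word(w) for w in words}

print(rtail_word("Lexer"))            # Lexe
print(rtail_lang({"ab", "abb", "b"}))
```

Note that the proof itself must work on a regular expression or automaton for $R$, not on a finite word list; the sketch only fixes what the function means.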

Problem 3

Let $D$ be any deterministic finite automaton with exactly $n$ states. Show that if $D$ accepts at least one string of length $n$ or greater, then the language it accepts is infinite.
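The pigeonhole step behind the proof can be observed directly: a run of an $n$-state DFA on a string of length at least $n$ visits $n+1$ or more states, so some state must repeat, and the substring read between the two visits can be pumped. The 3-state DFA below (counting a's modulo 3) is a made-up example, not part of the problem:

```python
def run_states(delta, start, w):
    """Sequence of states visited while a DFA reads w."""
    states = [start]
    for c in w:
        states.append(delta[(states[-1], c)])
    return states

# Hypothetical 3-state DFA over {a, b} counting occurrences of 'a' modulo 3.
delta = {}
for q in range(3):
    delta[(q, 'a')] = (q + 1) % 3
    delta[(q, 'b')] = q

visited = run_states(delta, 0, 'aaaa')   # |w| = 4 >= n = 3
# |visited| = |w| + 1 = 5 > 3 states, so by pigeonhole some state repeats.
assert len(set(visited)) < len(visited)
```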

Problem 4

Find a regular expression that generates all alternating sequences of 0s and 1s of arbitrary length. For example, the alternating sequences of length one are 0 and 1, of length two are 01 and 10, and of length three are 010 and 101. Note that no two adjacent characters can be the same in an alternating sequence.
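A brute-force check is a handy way to test whatever expression you come up with: decide membership directly from the "no two adjacent characters are equal" condition and compare against the candidate on all short strings. The regular expression shown below is one possible answer used only to demonstrate the check, not necessarily the intended one:

```python
import itertools
import re

def is_alternating(s: str) -> bool:
    """True iff no two adjacent characters of s are equal."""
    return all(a != b for a, b in zip(s, s[1:]))

# One candidate expression (an assumption, shown only to exercise the check).
candidate = re.compile(r'0?(10)*1?')

# Compare the candidate against the direct definition on all strings up to
# length 7 over {0, 1}.
for n in range(8):
    for bits in itertools.product('01', repeat=n):
        s = ''.join(bits)
        assert bool(candidate.fullmatch(s)) == is_alternating(s), s
```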

Problem 5

Construct a DFA for the language of nested parentheses with a maximal nesting depth of 3. For example, ε, ()(), (()(())) and (()())()() are in the language, but (((()))) and (()(()(()))) are not.
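A direct checker is useful for testing a DFA construction: track the current nesting depth with a counter, rejecting as soon as it exceeds 3 or goes negative. This counter is exactly what the DFA's states encode (depths 0 through 3 plus a trap state); the function below is a sketch for validation, not the required DFA itself:

```python
def max_depth_at_most_3(s: str) -> bool:
    """Balanced parentheses with nesting depth at most 3."""
    depth = 0
    for c in s:
        if c == '(':
            depth += 1
            if depth > 3:
                return False      # depth 4 reached: the DFA's trap state
        elif c == ')':
            depth -= 1
            if depth < 0:
                return False      # unmatched ')': also the trap state
        else:
            return False          # symbol outside the alphabet
    return depth == 0             # accept only balanced strings

# The examples from the problem statement (empty string written as '').
for w, expected in [('', True), ('()()', True), ('(()(()))', True),
                    ('(()())()()', True), ('(((())))', False),
                    ('(()(()(())))', False)]:
    assert max_depth_at_most_3(w) == expected
```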

Problem 6

  • A lexical analyzer uses the following regular grammar for tokenizing input strings over the alphabet $\Sigma = \{\#, !\}$.
S -> #A | !# | !
A -> !  | #B | ε
B -> #B | !

Determine how the input string '###!' is tokenized.

  • Give a suitable priority level to the rules of the grammar so that it returns the longest matched tokens.
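One way to explore the first part is by brute force: enumerate the token strings derivable from S (the grammar is regular, so bounding the terminal length makes this finite) and then list every way to split the input into such tokens. All names below are my own, and the sketch only enumerates possibilities; assigning priorities is still up to you:

```python
def gen_tokens(max_len=4):
    """All strings derivable from S with at most max_len terminal symbols."""
    rules = {'S': ['#A', '!#', '!'], 'A': ['!', '#B', ''], 'B': ['#B', '!']}
    tokens, frontier = set(), ['S']
    while frontier:
        form = frontier.pop()
        nt = next((i for i, c in enumerate(form) if c in rules), None)
        if nt is None:                      # no nonterminals left: a token
            if form:
                tokens.add(form)
            continue
        for rhs in rules[form[nt]]:         # expand the leftmost nonterminal
            new = form[:nt] + rhs + form[nt + 1:]
            if sum(c not in rules for c in new) <= max_len:
                frontier.append(new)
    return tokens

def tokenizations(s, tokens):
    """All ways to split s into a sequence of tokens."""
    if s == '':
        return [[]]
    out = []
    for t in tokens:
        if s.startswith(t):
            out += [[t] + rest for rest in tokenizations(s[len(t):], tokens)]
    return out

for split in tokenizations('###!', gen_tokens()):
    print(split)
```

Running this lists several valid tokenizations of '###!', which is precisely why the second part asks for priorities: without a longest-match (or other) rule the analyzer's output is ambiguous.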
cc17/homework_1.txt · Last modified: 2017/02/09 12:42 by hossein