Tree automata are straightforward generalizations of the familiar finite-state automata, and they offer a computationally attractive way of representing properties of trees. Interestingly, as a means of expressing constraints on trees, they are equivalent to a highly expressive predicate logic: the monadic second-order (MSO) logic of the dominance and precedence relations over trees. One can view the logic as an extremely compact and expressive specification language for tree automata, or one can view tree automata as computationally efficient implementations of grammatical principles formalized in the logic. Either way, the equivalence affords tremendous benefits for the problem of developing grammars that machines can use: one can think of principle-based grammars, formalized in MSO logic, as having been written in a higher-level language which is compiled into the machine language of tree automata. Of course, anyone who has tried to formalize syntactic theories in such a logic will be tempted to dispute the claim of ``high levelhood'', and an interesting topic of research is how to construct even higher-level grammar-writing languages with similar properties. Similar work has already been done in computer science, where related techniques are used for various kinds of system verification tasks.
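To make the notion concrete, here is a minimal sketch of a bottom-up tree automaton in Python. The example, the state names, and the Boolean-evaluation language it recognizes are all illustrative inventions, not drawn from the text; the point is only the mechanism: states are assigned at the leaves and propagated upward by a finite transition table, just as a finite-state automaton propagates a state along a string.

```python
# A minimal bottom-up tree automaton sketch (illustrative only).
# It recognizes Boolean expression trees that evaluate to 1.

from dataclasses import dataclass

@dataclass
class Tree:
    label: str
    children: tuple = ()

# Transitions: (node label, tuple of child states) -> state.
DELTA = {
    ("0", ()): "q0",
    ("1", ()): "q1",
    ("and", ("q1", "q1")): "q1",
    ("and", ("q0", "q0")): "q0",
    ("and", ("q0", "q1")): "q0",
    ("and", ("q1", "q0")): "q0",
    ("or", ("q0", "q0")): "q0",
    ("or", ("q0", "q1")): "q1",
    ("or", ("q1", "q0")): "q1",
    ("or", ("q1", "q1")): "q1",
}
FINAL = {"q1"}  # accept iff the root evaluates to 1

def run(t: Tree) -> str:
    """Compute the state reached at the root, bottom-up."""
    child_states = tuple(run(c) for c in t.children)
    return DELTA[(t.label, child_states)]

def accepts(t: Tree) -> bool:
    return run(t) in FINAL

t = Tree("or", (Tree("0"), Tree("and", (Tree("1"), Tree("1")))))
print(accepts(t))  # prints True: 0 or (1 and 1) evaluates to 1
```

A set of trees definable in the MSO logic of dominance and precedence is exactly a set recognizable by such an automaton; the transition table is the ``machine language'' into which a logical specification can be compiled.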
In future work I would like to exploit the connections between logic, automata, and phrase structure grammars to develop techniques for grammar specification that allow grammatical principles, as theorists actually propose them, to serve as devices for managing large augmented phrase structure grammars. The essential idea is to take a collection of high-level grammatical principles and compile them into, for instance, a definite clause grammar. Something like this is already commonly done in computational implementations of Head-Driven Phrase Structure Grammar, for example. In this way the succinctness of syntactic theory may provide leverage over a system that is flexible enough to retain descriptive adequacy. In many ways this division of labor between a high-level grammar specification and the low-level grammar it is compiled into, with subsequent tinkering at the low level, mirrors the theoretical division of human grammar into core and peripheral properties, though for my part I consider this particular line of my own research to have more practical than theoretical goals.
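The compilation idea can be caricatured in a few lines of Python. A single hypothetical high-level schema (an X-bar-style head-complement principle, invented here purely for illustration) is expanded into a family of concrete phrase structure rules; a lower-level formalism such as a definite clause grammar would then consume rules of this shape.

```python
# Hypothetical sketch: "compiling" one high-level principle into many
# low-level phrase structure rules. Categories and the schema itself
# are illustrative, not a claim about any particular syntactic theory.

CATEGORIES = ["V", "N", "P"]

def compile_x_bar(categories):
    """Expand a head-complement schema into concrete rules:
       XP -> X'  and  X' -> X Comp,  for each category X."""
    rules = []
    for x in categories:
        rules.append((f"{x}P", [f"{x}'"]))    # XP -> X'
        rules.append((f"{x}'", [x, "Comp"]))  # X' -> X Comp
    return rules

for lhs, rhs in compile_x_bar(CATEGORIES):
    print(lhs, "->", " ".join(rhs))
```

One short principle thus stands in for a whole block of rules, which is the sense in which the high-level specification is succinct while the compiled grammar remains flexible enough to be tinkered with directly.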
On a more theoretical level, the logic-automaton connection brings with it descriptive complexity results for pure, unfettered principle-based approaches to grammar. Unfortunately, the monadic second-order tree logic, because it is equivalent to tree automata in its capacity to describe sets of tree structures, is a principle language whose weak generative capacity is only context free. So an interesting line of research is the pursuit of slightly more powerful logics which retain the tight connection to language-theoretic complexity results that we find in monadic second-order logic. This is an interesting line of work for theoretical syntax because the devices needed to describe structures beyond the capacity of context-free systems are quite restricted--basically, they reduce to the devices needed to implement and control ``head movement''--and are at the same time the topic of rather intensive current theoretical and empirical research. So this is a topic in which formal and empirical concerns interact, and the key ideas needed to make progress on both fronts are likely to emerge from that interaction.