Exploiting a Probabilistic Hierarchical Model for Generation

Previous stochastic approaches to generation do not include a tree-based representation of syntax. While this may be adequate or even advantageous for some applications, others profit from using as much syntactic knowledge as is available, leaving to a stochastic model only those issues that are not determined by the grammar. We present initial results showing that a tree-based model derived from a tree-annotated corpus improves on a tree model derived from an unannotated corpus, and that a tree-based stochastic model with a hand-crafted grammar outperforms both. Our system, FERGUS, takes dependency structures as input and produces XTAG derivations using a stochastic tree model automatically acquired from an annotated corpus. FERGUS employs a statistical tree model to select probable trees and a word n-gram model to rank the string candidates generated from the best trees.
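The two-stage architecture described above can be illustrated with a minimal sketch: a tree model assigns probabilities to candidate derivation trees, each tree yields one or more surface strings, and a word bigram model ranks those strings. All names, probabilities, and data below are hypothetical toy values, not the actual FERGUS models or grammar.

```python
import math

# Stage 1 (hypothetical): a "tree model" assigns probabilities to
# candidate derivation trees for the same input dependency structure.
tree_candidates = {
    "tree_a": 0.6,  # illustrative tree probability
    "tree_b": 0.4,
}

# Each tree linearizes into one or more candidate word strings
# (toy linearizations, not real XTAG derivations).
linearizations = {
    "tree_a": [["the", "cat", "sat"], ["cat", "the", "sat"]],
    "tree_b": [["sat", "the", "cat"]],
}

# Stage 2 (hypothetical): a word bigram model scores strings in log-space.
bigram_logprob = {
    ("<s>", "the"): math.log(0.5),
    ("the", "cat"): math.log(0.4),
    ("cat", "sat"): math.log(0.3),
    ("cat", "the"): math.log(0.05),
    ("the", "sat"): math.log(0.05),
    ("<s>", "sat"): math.log(0.1),
    ("sat", "the"): math.log(0.2),
    ("<s>", "cat"): math.log(0.1),
}

def string_score(words):
    """Sum of bigram log-probabilities, with a small floor for unseen pairs."""
    score, prev = 0.0, "<s>"
    for w in words:
        score += bigram_logprob.get((prev, w), math.log(1e-6))
        prev = w
    return score

def best_string(trees, strings):
    """Combine tree log-probability and string log-probability; return the best surface string."""
    best, best_score = None, float("-inf")
    for tree, p_tree in trees.items():
        for words in strings[tree]:
            total = math.log(p_tree) + string_score(words)
            if total > best_score:
                best, best_score = " ".join(words), total
    return best

print(best_string(tree_candidates, linearizations))
```

Under these toy probabilities, the fluent ordering wins because both its tree probability and its bigram score dominate; the real system makes the analogous choice over trees acquired from an annotated corpus.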