The distributed representation of a sentence can be derived not only from the word sequence but also from the dependency structure. In this paper, we propose an answer assessment model that takes the dependency relation of each word into account. The dependency relations are obtained by Universal Dependencies parsing of data in CoNLL format. These relations are incorporated into a Long Short-Term Memory (LSTM) architecture by modifying the hidden state, yielding what we call a dependency tree LSTM. Compared with the state-of-the-art LSTM on English short essays, the proposed method improves quadratic weighted kappa (QWK) and accuracy by 2.38% and 2.05%, respectively. Furthermore, on Indonesian short essays, the proposed method achieves a QWK of 68.07% and an accuracy of 82.51%.
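As a rough illustration of the dependency tree LSTM idea, the sketch below shows a child-sum Tree-LSTM cell (in the style of Tai et al., 2015) whose hidden state is composed from a head word and its dependents in the parse tree. The class name ChildSumTreeLSTMCell, the dimensions, and the gating details are illustrative assumptions; the abstract does not specify the exact hidden-state modification used in the paper.

# A minimal sketch of a child-sum dependency tree LSTM cell; names and
# dimensions are illustrative assumptions, not the paper's exact formulation.
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """Composes a node's state from its word embedding and its dependents' states."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Input, output, and candidate gates share the summed child hidden state.
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # One forget gate per child, conditioned on that child's hidden state.
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,) embedding of the head word
        # child_h, child_c: (num_children, hidden_dim) states of its dependents
        h_sum = child_h.sum(dim=0)                           # sum of child hidden states
        i, o, u = (self.W_iou(x) + self.U_iou(h_sum)).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))   # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)                 # cell state mixes the children's memories
        h = o * torch.tanh(c)                                # hidden state now reflects the whole subtree
        return h, c


# Toy usage: a head word with two dependents taken from a CoNLL-format parse.
if __name__ == "__main__":
    cell = ChildSumTreeLSTMCell(input_dim=50, hidden_dim=64)
    x = torch.randn(50)              # embedding of the head word
    child_h = torch.randn(2, 64)     # hidden states of two dependent words
    child_c = torch.randn(2, 64)
    h, c = cell(x, child_h, child_c)
    print(h.shape, c.shape)          # torch.Size([64]) torch.Size([64])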