Deep neural network models have recently achieved strong results in NLP. Structured neural networks have been used to encode syntactic and semantic role structures, adding to the representational power of linear structures for encoding sentences. On the other hand, recurrent structures have been less explored for encoding graph structures, which arise in semantics tasks such as AMR. We investigate graph-structured recurrent networks for sentence encoding and semantic embedding. In particular, a special state transition mechanism is designed to handle cyclic structures in graphs. Results show that our network architecture is useful for both sentence embedding and AMR structure encoding, giving highly competitive accuracies.
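The core idea behind such a state transition mechanism can be illustrated with a minimal sketch: every node updates its state synchronously from its own state and its neighbours' states, so no topological order is needed and cycles are handled naturally. The update rule and weight names below are illustrative assumptions for exposition, not the exact formulation from the work described (which may use gated, LSTM-style updates).

```python
import numpy as np

def transition_step(states, adj, W_self, W_nbr, b):
    """One synchronous state-transition step over a graph.

    Each node's new state is computed from its own state and the sum of
    its neighbours' states. Because all nodes update in parallel, the
    update is well defined even when the graph contains cycles.
    """
    nbr_sum = adj @ states                       # aggregate neighbour states
    return np.tanh(states @ W_self + nbr_sum @ W_nbr + b)

# Toy cyclic graph: 0 -> 1 -> 2 -> 0 (hypothetical example data)
n, d = 3, 4
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [1, 0, 0]], dtype=float)
states = rng.standard_normal((n, d))
W_self = 0.1 * rng.standard_normal((d, d))
W_nbr = 0.1 * rng.standard_normal((d, d))
b = np.zeros(d)

# A few transition steps propagate information around the cycle.
for _ in range(5):
    states = transition_step(states, adj, W_self, W_nbr, b)
```

After enough steps, each node's state reflects information from every node on the cycle, which is the property that makes such networks attractive for graph-shaped meaning representations like AMR.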
Yue Zhang is currently an assistant professor at Singapore University of Technology and Design. Before joining SUTD in July 2012, he worked as a postdoctoral research associate at the University of Cambridge, UK. He received his DPhil and MSc degrees from the University of Oxford, UK, and his BEng degree from Tsinghua University, China. His research interests include natural language processing, machine learning and artificial intelligence. He has worked intensively on statistical parsing, text synthesis, machine translation, sentiment analysis and stock market analysis. He serves as a reviewer for top journals such as Computational Linguistics, Transactions of the Association for Computational Linguistics (standing review committee) and the Journal of Artificial Intelligence Research. He is an associate editor for ACM Transactions on Asian and Low-Resource Language Information Processing, and a PC member for conferences such as ACL, COLING, EMNLP, NAACL, EACL, AAAI and IJCAI. He was an area chair of COLING 2014/18, NAACL 2015, EMNLP 2015/17 and ACL 2017/18, and the TPC chair of IALP 2017.