Parsing With Compositional Vector Grammars
Richard Socher, John Bauer, Christopher D. Manning, Andrew Y. Ng

Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation captures neither the full syntactic nor the full semantic richness of linguistic phrases, and attempts to improve on this by lexicalizing phrases or splitting categories only partly address the problem, at the cost of huge feature spaces and sparseness. Instead,
we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations.
The CVG improves the PCFG of the Stanford Parser by 3.8% to obtain an F1 score of 90.4%.
It is fast to train and, implemented approximately as an efficient reranker, is about 20% faster than the current Stanford factored parser. The CVG learns a soft notion of head words and improves performance on the types of ambiguities that require semantic information, such as PP attachments.
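The composition step of the syntactically untied RNN can be sketched as follows. This is a minimal illustration with toy dimensions; the function name, matrix sizes, and random initialization are assumptions for the sketch, not the released implementation:

```python
import numpy as np

def compose(a, b, W, bias):
    """Syntactically untied RNN composition: the parent vector for
    children a and b is p = tanh(W [a; b] + bias), where W is chosen
    by the children's syntactic categories (hence "untied")."""
    children = np.concatenate([a, b])    # stack the two child vectors
    return np.tanh(W @ children + bias)  # nonlinear composition

n = 4                                    # toy vector size; the paper uses larger
rng = np.random.default_rng(0)
W_np_vp = rng.normal(scale=0.1, size=(n, 2 * n))  # untied matrix for an (NP, VP) pair
a, b = rng.normal(size=n), rng.normal(size=n)
p = compose(a, b, W_np_vp, np.zeros(n))
print(p.shape)  # (4,)
```

Because a different W is looked up per category pair, the model can learn, say, that the VP child dominates the parent of a (VP, NP) combination, which is the "soft head words" behavior mentioned above.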

Download Paper
Code
Further Analysis and Visualization
 Here are five of the learned W matrices, showing how the model learns a soft version of head words:
 VP-NP:
 DT-NP:
 ADJP-NP:
 PP-NP:
 QP-SBAR: an example of matrices that occur very rarely and hence stay close to their initialization
Bibtex
 Please cite the following paper when you use the code:
@incollection{SocherEtAl2013:CVG,
title = {{Parsing With Compositional Vector Grammars}},
author = {Richard Socher and John Bauer and Christopher D. Manning and Andrew Y. Ng},
booktitle = {{ACL}},
year = {2013}
}
Comments, Critique, Questions
ugur? — 27 September 2016, 21:27
Is there a video for the paper? Thanks.
I'm sorry, my email address in my last comment was wrong. The following is my correct email address:
c2010120422@gmail.com
Thank you very much!
Hello! Could you do me a favor? I would like to get the slides for the 2013 paper "Parsing with Compositional Vector Grammars". Could you send them to me by email? My email is c2010120422@gmai.com.
Thank you very much!
Hi Richard
Can you please give me some idea of how to extract the vector representation of a sentence using this tool? It would be really helpful if you could point me to where I should look to get them.
Best
@Manan, you have to go into the code to get them.
Manan? — 29 August 2014, 15:58
Hi,
The code you have shared gives the parsed output, but does it also return the computed scores / vector representations?
Manan
The tensor V is learned via backpropagation using the equations in the paper.
U is another set of parameters that we learn; it is an n-dimensional vector.
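As a small illustration of why V can be learned by backpropagation (the dimensions and indices here are toy assumptions, not the paper's full derivation): each tensor slice V[k] contributes the bilinear term x^T V[k] x with x = [b; c], whose gradient with respect to V[k] is the outer product x x^T, which we can verify numerically:

```python
import numpy as np

d = 3
rng = np.random.default_rng(2)
x = rng.normal(size=2 * d)                        # stacked child vectors [b; c]
Vk = rng.normal(scale=0.1, size=(2 * d, 2 * d))   # one tensor slice

# The bilinear term f(Vk) = x^T Vk x has gradient df/dVk = x x^T.
analytic = np.outer(x, x)

# Central finite difference on one entry (i, j) as a spot check.
eps = 1e-6
i, j = 1, 4
Vp = Vk.copy(); Vp[i, j] += eps
Vm = Vk.copy(); Vm[i, j] -= eps
numeric = ((x @ Vp @ x) - (x @ Vm @ x)) / (2 * eps)
print(np.isclose(numeric, analytic[i, j]))  # True
```

Since the term is linear in each entry of V[k], the finite difference agrees with the analytic gradient up to floating-point precision.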
Hi Sir,
Can you please clear up my doubt about the Recursive Neural Tensor Network? In the equation p = f([b;c]^T V^[1:d] [b;c] + W[b;c]), how is the term with V^[1:d] computed? And in the earlier score equation, score = U^T p, what exactly is U^T?
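For readers puzzling over the same equation: the RNTN composition is p = f([b;c]^T V^[1:d] [b;c] + W[b;c]), where each tensor slice V[k] yields one entry of the bilinear term, and score = U^T p is a dot product with a learned d-dimensional vector U. A minimal numpy sketch (dimensions and random initialization are illustrative assumptions):

```python
import numpy as np

def rntn_compose(b, c, V, W):
    """RNTN composition: p = tanh(x^T V^[1:d] x + W x) with x = [b; c].
    V has shape (d, 2d, 2d): slice V[k] produces the k-th entry of the
    bilinear term, so x^T V^[1:d] x is a d-dimensional vector."""
    x = np.concatenate([b, c])
    bilinear = np.einsum('i,kij,j->k', x, V, x)  # one scalar per tensor slice
    return np.tanh(bilinear + W @ x)

def score(p, u):
    """score = u^T p: a single number, the dot product of the learned
    scoring vector u with the parent vector p."""
    return u @ p

d = 3
rng = np.random.default_rng(1)
b, c = rng.normal(size=d), rng.normal(size=d)
V = rng.normal(scale=0.1, size=(d, 2 * d, 2 * d))
W = rng.normal(scale=0.1, size=(d, 2 * d))
u = rng.normal(size=d)
p = rntn_compose(b, c, V, W)
print(p.shape, score(p, u))
```

So U^T p is just a scalar score for the candidate parent node; V is a third-order tensor whose slices are learned, like W and U, by backpropagation.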