Parsing With Compositional Vector Grammars

Richard Socher, John Bauer, Christopher D. Manning, Andrew Y. Ng

Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation captures neither the full syntactic nor the full semantic richness of linguistic phrases, and attempts to improve on this by lexicalizing phrases or splitting categories only partly address the problem, at the cost of huge feature spaces and sparseness. Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations. The CVG improves the PCFG of the Stanford Parser by 3.8% to obtain an F1 score of 90.4%. It is fast to train, and implemented approximately as an efficient reranker it is about 20% faster than the current Stanford factored parser. The CVG learns a soft notion of head words and improves performance on the types of ambiguities that require semantic information, such as PP attachments.
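The abstract describes scoring each binary rule with a syntactically untied recursive neural network on top of a base PCFG. A minimal sketch of that idea, in which the function name `cvg_score`, the parameter names `W_AB` and `v_AB`, and all shapes are illustrative assumptions rather than the paper's actual code:

```python
import math

def tanh_vec(x):
    return [math.tanh(v) for v in x]

def matvec(W, x):
    # Plain matrix-vector product over Python lists.
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]

def cvg_score(b, c, W_AB, v_AB, pcfg_logprob):
    """Sketch of scoring one binary rule A -> B C in a CVG.

    b, c         : the two child vectors (length n each)
    W_AB         : n x 2n composition matrix, untied per child-category
                   pair (this is the "syntactically untied" part)
    v_AB         : length-n scoring vector for this category pair
    pcfg_logprob : log P(A -> B C) from the base PCFG
    """
    # Parent vector: p = tanh(W_AB [b; c])
    parent = tanh_vec(matvec(W_AB, b + c))
    # Neural score for this constituent, added to the PCFG log-probability.
    rnn_score = sum(v_i * p_i for v_i, p_i in zip(v_AB, parent))
    return parent, rnn_score + pcfg_logprob
```

The parent vector is then reused as a child at the next level of the tree, and the per-constituent scores are summed over the tree, which is what makes an efficient reranking implementation possible.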

Download Paper


Further Analysis and Visualization


Comments, Critique, Questions


Yun, 14 April 2017, 17:38

Hello! Can you do me a favour? I want to get the slides for the 2013 paper "Parsing with Compositional Vector Grammars". Can you send them to me through email? My email is  Thank you very much!

ugur, 27 September 2016, 21:27

Is there a video for the paper? Thanks!

Chen Along, 28 May 2016, 04:02

I'm sorry, the email address in my last comment was wrong. The following is my correct email address.

Thank you very much!

Chen Along, 28 May 2016, 02:41

Hello! Can you do me a favour? I want to get the slides for the 2013 paper "Parsing with Compositional Vector Grammars". Can you send them to me through email? My email is

Thank you very much!

Shujon Naha, 11 May 2015, 04:27

Hi Richard, can you please give me some idea of how to extract the vector representation of a sentence using this tool? It would be really helpful if you could point me to where I should look to get them.



Richard Socher, 14 March 2015, 05:32

@Manan, you have to go into the code to get them.

Manan, 29 August 2014, 15:58


The code which you have shared gives the parsed output, but does it also return the computed scores / vector representations?


Richard Socher, 13 April 2014, 23:48

The tensor V is learned via backpropagation using the equations in the paper. U is another set of parameters that we learn; it is an n-dimensional vector.
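To make the answer above concrete, here is a minimal sketch of the tensor composition p = f([b;c]^T V^{[1:n]} [b;c] + W[b;c]) and the score U^T p that the thread asks about. All function names and dimensions here are illustrative assumptions; the released code will differ:

```python
import math

def rntn_parent(b, c, V, W):
    """Sketch of the tensor composition from the paper's equations.

    V : list of n matrices, each 2n x 2n (one slice V^[k] per output
        dimension), learned by backpropagation
    W : n x 2n matrix for the standard (linear) composition term
    """
    x = b + c  # concatenation [b; c]
    parent = []
    for V_k, W_row_k in zip(V, W):
        # Bilinear form x^T V^[k] x gives output dimension k.
        bilinear = sum(x[i] * V_k[i][j] * x[j]
                       for i in range(len(x)) for j in range(len(x)))
        linear = sum(w * x_i for w, x_i in zip(W_row_k, x))
        parent.append(math.tanh(bilinear + linear))
    return parent

def score(U, p):
    # score = U^T p, where U is the learned n-dimensional scoring vector.
    return sum(u_i * p_i for u_i, p_i in zip(U, p))
```

So V^{[1:n]} is not computed from anything in closed form: each slice is a parameter tensor initialized and then updated by backpropagation, and U^T p is just the dot product of the learned vector U with the parent vector p.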

Pritpal, 20 March 2014, 13:29

Hi Sir,

Can you please clear up my doubt about the Recursive Neural Tensor Network? In the equation [b;c]^T V^{[1:2]} [b;c] + W[b;c], how do we compute V^{[1:2]}? And in the earlier equation for the score, score = U^T p, what exactly is U^T?