Abstract
In this paper, we propose a method that exploits full parsing information by representing it as features of argument classification models and as constraints in integer linear programs. In addition, to take advantage of SVM-based and Maximum Entropy-based argument classification models, we combine their scoring matrices and use the combined matrix in the above-mentioned integer linear programs. The experimental results show that full parsing information not only increases the F-score of argument classification models by 0.7%, but also effectively removes all labeling inconsistencies, which increases the F-score by 0.64%. The ensemble of SVM and ME also boosts the F-score by 0.77%. Our system achieves an F-score of 76.53% on the development set and 76.38% on Test WSJ.
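The abstract compresses the combination-and-ILP step. The sketch below (not the authors' code) illustrates the idea: two per-span label scoring matrices, one from an SVM classifier and one from a Maximum Entropy classifier, are combined (here by simple averaging, an assumption) and the combined matrix supplies the objective coefficients of an integer linear program whose constraints enforce a consistent labeling (here, one label per candidate span and no duplicate core arguments, an illustrative constraint set). The PuLP solver, the label set, and the numbers are placeholders.

```python
# Minimal sketch of ensemble scoring + ILP decoding for argument labeling.
# Assumptions: averaging as the combination rule, PuLP as the ILP solver,
# and a toy label set {A0, A1, NULL}; none of this is from the paper itself.
import numpy as np
import pulp

labels = ["A0", "A1", "NULL"]
svm_scores = np.array([[0.7, 0.2, 0.1],    # rows: candidate spans
                       [0.1, 0.6, 0.3]])   # cols: labels A0, A1, NULL
me_scores  = np.array([[0.6, 0.3, 0.1],
                       [0.2, 0.5, 0.3]])
scores = ((svm_scores + me_scores) / 2.0).tolist()  # combined scoring matrix

n_spans, n_labels = len(scores), len(labels)
prob = pulp.LpProblem("srl_ilp", pulp.LpMaximize)
x = [[pulp.LpVariable(f"x_{i}_{j}", cat="Binary")
      for j in range(n_labels)] for i in range(n_spans)]

# Objective: total combined score of the chosen labeling.
prob += pulp.lpSum(scores[i][j] * x[i][j]
                   for i in range(n_spans) for j in range(n_labels))

# Each candidate span receives exactly one label (possibly NULL).
for i in range(n_spans):
    prob += pulp.lpSum(x[i][j] for j in range(n_labels)) == 1

# Illustrative consistency constraint: each core label appears at most once.
for j in range(n_labels - 1):  # last column is NULL, left unconstrained
    prob += pulp.lpSum(x[i][j] for i in range(n_spans)) <= 1

prob.solve()
for i in range(n_spans):
    chosen = max(range(n_labels), key=lambda j: x[i][j].value())
    print(f"span {i} -> {labels[chosen]}")
```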
Original language | English |
---|---|
Pages | 233-236 |
Number of pages | 4 |
State | Published - 2005 |
Event | 9th Conference on Computational Natural Language Learning, CoNLL 2005 - Ann Arbor, MI, United States. Duration: 29 Jun 2005 → 30 Jun 2005 |
Conference
Conference | 9th Conference on Computational Natural Language Learning, CoNLL 2005 |
---|---|
Country/Territory | United States |
City | Ann Arbor, MI |
Period | 29/06/05 → 30/06/05 |