Nataliya Sokolovska, Thomas Lavergne, Olivier Cappé, François Yvon
Conditional Random Fields (CRFs) constitute a popular and efficient approach for supervised sequence labelling. CRFs can cope with large description spaces and can integrate some form of structural dependency between labels. In this contribution, we address the issue of efficient feature selection for CRFs based on imposing sparsity through an L1 penalty. We first show how sparsity of the parameter set can be exploited to significantly speed up training and labelling. We then introduce coordinate descent parameter update schemes for CRFs with L1 regularization. We finally provide some empirical comparisons of the proposed approach with state-of-the-art CRF training strategies. In particular, it is shown that the proposed approach is able to take advantage of the sparsity to speed up processing and hence potentially handle higher-dimensional models.
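The core mechanism the abstract alludes to, coordinate descent under an L1 penalty driving many parameters exactly to zero, can be illustrated with a minimal sketch. This is not the paper's CRF training code: a least-squares objective stands in for the CRF log-likelihood, and all function names here are illustrative. The per-coordinate soft-thresholding update is what produces the sparsity.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty: shrink z toward zero by t,
    # setting it exactly to zero when |z| <= t (this creates sparsity).
    return np.sign(z) * max(abs(z) - t, 0.0)

def l1_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - Xw||^2 + lam*||w||_1 one coordinate at a time.

    A quadratic surrogate replaces the CRF negative log-likelihood;
    the update structure (optimize one weight, soft-threshold, repeat)
    mirrors the L1-penalized scheme described in the abstract.
    """
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)      # per-coordinate curvature
    r = y - X @ w                      # current residual
    for _ in range(n_iter):
        for j in range(d):
            if col_sq[j] == 0.0:
                continue
            r += X[:, j] * w[j]        # remove coordinate j's contribution
            z = X[:, j] @ r            # gradient information for coordinate j
            w[j] = soft_threshold(z, lam) / col_sq[j]
            r -= X[:, j] * w[j]        # restore with the updated weight
    return w

# Synthetic problem: only 3 of 20 features are truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(100)

w_hat = l1_coordinate_descent(X, y, lam=5.0)
print("nonzero coefficients:", int((np.abs(w_hat) > 1e-8).sum()))
```

Because inactive coordinates are thresholded to exactly zero rather than merely shrunk, downstream computations can skip them entirely, which is the speed-up the abstract exploits for training and labelling.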
Xian Qian, Xiaoqian Jiang, Qi Zhang, Xuanjing Huang, Lide Wu
Romansha Chopra, Nivedita Singh, Zhenning Yang, N. Ch. S. N. Iyengar
Feng Jiao, Shaojun Wang, Chi‐Hoon Lee, Russell Greiner, Dale Schuurmans