JOURNAL ARTICLE

Efficient Learning of Sparse Conditional Random Fields for Supervised Sequence Labeling

Nataliya Sokolovska, Thomas Lavergne, Olivier Cappé, François Yvon

Year: 2010
Journal: IEEE Journal of Selected Topics in Signal Processing
Vol: 4 (6)
Pages: 953-964
Publisher: Institute of Electrical and Electronics Engineers

Abstract

Conditional Random Fields (CRFs) constitute a popular and efficient approach for supervised sequence labelling. CRFs can cope with large description spaces and can integrate some form of structural dependency between labels. In this contribution, we address the issue of efficient feature selection for CRFs based on imposing sparsity through an L1 penalty. We first show how sparsity of the parameter set can be exploited to significantly speed up training and labelling. We then introduce coordinate descent parameter update schemes for CRFs with L1 regularization. We finally provide some empirical comparisons of the proposed approach with state-of-the-art CRF training strategies. In particular, it is shown that the proposed approach is able to take advantage of the sparsity to speed up processing and hence potentially handle larger dimensional models.
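The mechanism the abstract relies on, coordinate descent with an L1 penalty driving many parameters exactly to zero, can be illustrated outside the CRF setting. The sketch below applies cyclic coordinate descent with soft-thresholding to a generic L1-penalized least-squares objective (the lasso); it is an illustrative example of the general technique, not the paper's CRF-specific update scheme, and the synthetic data and variable names are invented for the demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: closed-form minimizer of the
    # one-dimensional L1-penalized quadratic subproblem.
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=100):
    # Cyclic coordinate descent for: min_w 0.5 * ||y - X w||^2 + lam * ||w||_1.
    # Each pass updates one coordinate at a time, holding the others fixed.
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)  # per-coordinate curvature, precomputed
    for _ in range(n_iters):
        for j in range(d):
            # Partial residual with coordinate j removed from the fit.
            r = y - X @ w + X[:, j] * w[j]
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Tiny synthetic problem: only the first two of ten features matter,
# so the L1 penalty should zero out the remaining eight weights.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
true_w = np.zeros(10)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + 0.01 * rng.standard_normal(200)

w = lasso_coordinate_descent(X, y, lam=5.0)
print("nonzero coordinates:", np.flatnonzero(np.abs(w) > 1e-8))
```

The speed-up argument in the abstract follows the same logic: once most coordinates are exactly zero, subsequent passes and inference only need to touch the active set.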

Keywords:
Conditional random fields (CRFs), Sequence labeling, Regularization, Machine learning, Artificial intelligence, Natural language processing, Computer science

Metrics

Cited by: 15
FWCI (Field-Weighted Citation Impact): 1.60
References: 48
Citation Normalized Percentile: 0.85

Topics

Advanced Image and Video Retrieval Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Music and Audio Processing (Physical Sciences → Computer Science → Signal Processing)
Image Retrieval and Classification Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)