JOURNAL ARTICLE

Motion estimation block for HEVC encoder on FPGA

Abstract

The world of digital electronics and video compression has advanced to the point where the capture, display, and transmission of ultra-high-definition video have become commonplace across many related fields. Owing to limited bandwidth and the stringent requirements of real-time video playback, video coding is an indispensable process for many visual communication applications and always demands a very high compression ratio. To support this growing popularity and consumer demand, the ITU has introduced a new standard, HEVC (High Efficiency Video Coding). The new codec achieves the same video quality as the established H.264/AVC codec while using only half the bandwidth, which is notably its greatest advantage. Based on a study of motion vector distributions from several commonly used test image sequences, this paper proposes a new diamond search algorithm for fast Motion Estimation (ME) in the HEVC encoder. The algorithm is simple, robust, and efficient, ensuring low complexity together with high quality in finding the global minimum.
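The abstract names a diamond search algorithm for block-based motion estimation but gives no further detail, so the sketch below shows the classic two-pattern diamond search (large diamond pattern for coarse descent, small diamond pattern for refinement) with a sum-of-absolute-differences cost; the function and variable names are illustrative, not the paper's.

```python
import numpy as np

# Classic diamond search patterns. The abstract does not specify the paper's
# exact variant, so this is the textbook two-pattern form: a large diamond
# (LDSP) for coarse descent, then a small diamond (SDSP) for refinement.
LDSP = [(0, 0), (-2, 0), (2, 0), (0, -2), (0, 2),
        (-1, -1), (-1, 1), (1, -1), (1, 1)]
SDSP = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]

def sad(ref, cur, y, x, by, bx, bs):
    """SAD between the current bs x bs block at (by, bx) and the
    reference-frame candidate whose top-left corner is (y, x)."""
    h, w = ref.shape
    if y < 0 or x < 0 or y + bs > h or x + bs > w:
        return float("inf")  # candidate falls outside the frame
    diff = (ref[y:y + bs, x:x + bs].astype(np.float64)
            - cur[by:by + bs, bx:bx + bs].astype(np.float64))
    return float(np.abs(diff).sum())

def diamond_search(ref, cur, by, bx, bs=16, max_iters=64):
    """Integer-pel motion vector (dy, dx) for the block of `cur` at (by, bx)."""
    cy, cx = by, bx  # current search centre in the reference frame
    for _ in range(max_iters):  # iteration cap as a plateau safeguard
        _, dy, dx = min((sad(ref, cur, cy + dy, cx + dx, by, bx, bs), dy, dx)
                        for dy, dx in LDSP)
        if (dy, dx) == (0, 0):  # minimum at the centre: stop coarse search
            break
        cy, cx = cy + dy, cx + dx
    # Final one-step refinement with the small diamond around the last centre.
    _, dy, dx = min((sad(ref, cur, cy + dy, cx + dx, by, bx, bs), dy, dx)
                    for dy, dx in SDSP)
    return cy + dy - by, cx + dx - bx
```

The pattern checks only a handful of candidates per step instead of every position in the search window, which is what makes it attractive for low-complexity encoders; a hardware realisation such as the FPGA block discussed here would additionally pipeline and parallelise the per-candidate SAD computation.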

Keywords:
Computer science, Encoder, Codec, Quarter-pixel motion, Motion estimation, Motion compensation, Motion vector, Video compression, Picture types, Multiview Video Coding, Video quality, Data compression, Computer vision, Video processing, Video capture, Real-time computing, Video tracking, Artificial intelligence, Computer engineering, Computer hardware

Metrics

Cited by: 9
FWCI (Field-Weighted Citation Impact): 1.36
References: 11
Citation Normalized Percentile: 0.83 (in top 1% and top 10%)

Topics

Video Coding and Compression Technologies
Physical Sciences → Computer Science → Signal Processing
Advanced Vision and Imaging
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Advanced Data Compression Techniques
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
