JOURNAL ARTICLE

Efficient GPU-Based Query Processing with Pruned List Caching in Search Engines

Abstract

There are two inherent obstacles to effectively using Graphics Processing Units (GPUs) for query processing in search engines: (a) the highly restricted GPU memory space, and (b) the CPU-GPU transfer latency. Previously, Ao et al. presented a GPU method for lists intersection, an essential component in AND-based query processing. However, this work assumes the whole inverted index can be stored in GPU memory and does not address document ranking. In this paper, we describe and analyze a GPU query processing method which incorporates both lists intersection and top-k ranking. We introduce a parameterized pruned posting list GPU caching method where the parameter determines how much GPU memory is used for caching. This method allows list caching for large inverted indexes using the limited GPU memory, thereby making a qualitative improvement over previous work. We also give a mathematical model which can identify an approximately optimal choice of the parameter. Experimental results indicate that this GPU approach under the pruned list caching policy achieves better query throughput than its CPU counterpart, even when the inverted index size is much larger than the GPU memory space.
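To make the abstract's two processing stages and the caching idea concrete, here is a minimal CPU-side sketch (not the paper's GPU implementation): a merge-style intersection of sorted posting lists, top-k ranking of the result, and a pruned-list cache whose parameter `p` bounds how much memory is spent on caching. All names (`intersect_sorted`, `PrunedListCache`, the prefix-pruning rule, the scoring function) are illustrative assumptions, not the authors' API.

```python
from heapq import nlargest

def intersect_sorted(a, b):
    """Merge-style intersection of two sorted posting lists of doc IDs.
    This is the AND-based step the paper parallelizes on the GPU."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            out.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1
        else:
            j += 1
    return out

def top_k(doc_ids, score, k):
    """Rank the intersection result, keeping the k highest-scoring docs."""
    return nlargest(k, doc_ids, key=score)

class PrunedListCache:
    """Toy analogue of the parameterized pruned-list GPU cache: each posting
    list is cached only up to a fraction p of its length, so total cache
    size is roughly p times the index size. On a miss (a term whose full
    list is needed), fall back to the full index, which in the paper's
    setting corresponds to a CPU-to-GPU transfer."""

    def __init__(self, index, p):
        self.index = index  # term -> full sorted posting list
        self.cache = {t: lst[: max(1, int(len(lst) * p))]
                      for t, lst in index.items()}

    def postings(self, term, need_full=False):
        if need_full:
            return self.index[term]   # simulated transfer on cache miss
        return self.cache[term]
```

Usage under these assumptions: with `index = {"gpu": [1, 2, 3, 5, 8], "query": [2, 3, 5, 7]}`, the intersection of the two lists is `[2, 3, 5]`, and a `PrunedListCache(index, 0.5)` keeps only the two-element prefixes `[1, 2]` and `[2, 3]`, halving cache memory at the cost of possible misses.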

Keywords:
Parallel computing; General-purpose computing on graphics processing units (GPGPU); Graphics processing unit (GPU); Inverted index; Lists intersection; Search engine indexing; Latency; CUDA; Top-k ranking; Information retrieval

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.24
References: 37
Citation Normalized Percentile: 0.61

Topics

Caching and Content Delivery (Physical Sciences → Computer Science → Computer Networks and Communications)
Data Management and Algorithms (Physical Sciences → Computer Science → Signal Processing)
Web Data Mining and Analysis (Physical Sciences → Computer Science → Information Systems)