BOOK-CHAPTER

Assessing Logical Reasoning Capabilities of Encoder-Only Transformer Models

Keywords:
Computer science, Encoder, Transformer, Programming language, Electrical engineering, Operating system, Engineering

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 42
Citation Normalized Percentile: 0.45

Topics

Formal Methods in Verification (Physical Sciences → Computer Science → Computational Theory and Mathematics)
Fault Detection and Control Systems (Physical Sciences → Engineering → Control and Systems Engineering)
AI-based Problem Solving and Planning (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

A Simplified Query-Only Attention for Encoder-Based Transformer Models

Hong Gi Yeom, Kyung‐min An

Journal: Applied Sciences, Year: 2024, Vol: 14 (19), Pages: 8646
JOURNAL ARTICLE

Simplifying AI reasoning: unlocking logical capabilities in large language models (LLMs)

Poorani Subramanian

Journal: World Journal of Advanced Research and Reviews, Year: 2025, Vol: 26 (2), Pages: 1835-1841
JOURNAL ARTICLE

Simplifying AI reasoning: unlocking logical capabilities in large language models (LLMs)

Subramanian, Peraschi Selvan

Journal: Zenodo (CERN European Organization for Nuclear Research), Year: 2025
© 2026 ScienceGate Book Chapters — All rights reserved.