With the rapid growth of information technology and network applications/services, emerging multi-access edge computing (MEC) and in-network computing (INC) are regarded as key computing paradigms for supporting time-sensitive tasks and requests through task offloading. Existing studies on task offloading have seldom considered combining MEC, INC, and cloud computing. In this paper, we explore INC-enhanced task offloading in MEC networks and design a three-layer task offloading network architecture that comprises not only user equipment, edge servers, and the cloud, but also network elements such as routers/switches. We focus on reducing system latency and energy consumption (EC), and formulate the optimization problem as minimizing the weighted sum of these two indicators. To solve this problem, we propose a deep reinforcement learning-based framework that maps the agent's actions to offloading policies and resource allocation strategies, thereby determining both indicators simultaneously. Simulation results show that the proposed INC-enhanced task offloading framework converges quickly with double deep Q-network (DDQN) and outperforms baseline methods in reducing system latency and EC.
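As a minimal sketch of the two ideas named above, the snippet below shows a weighted-sum cost over latency and EC, and a flat discrete action index decoded into an (offloading target, resource-allocation level) pair as a DDQN agent might use. The constants, function names, and the particular action decomposition are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: the action decomposition and weight w are
# assumptions, not the paper's exact design.

NUM_TARGETS = 3       # offload to: 0 = edge server, 1 = in-network element, 2 = cloud
NUM_ALLOC_LEVELS = 4  # discretized resource-allocation levels

def decode_action(action_index: int) -> tuple[int, int]:
    """Map a flat DDQN action index to (offload target, allocation level)."""
    target = action_index // NUM_ALLOC_LEVELS
    alloc_level = action_index % NUM_ALLOC_LEVELS
    return target, alloc_level

def weighted_cost(latency: float, energy: float, w: float = 0.5) -> float:
    """Weighted sum of system latency and energy consumption (EC);
    the agent's reward would be the negative of this cost."""
    return w * latency + (1.0 - w) * energy
```

With this encoding, a single Q-network output of size `NUM_TARGETS * NUM_ALLOC_LEVELS` lets one action determine the offloading policy and the resource allocation simultaneously.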