TransCLGA: Combining Local and Global Attention in Transformer for Temporal Action Detection

Keywords

Temporal action detection
Vision Transformer
Gated mechanism
Self-attention
Action recognition

How to Cite

Zhang, B., Fang, Y., Zhang, X., & Zhang, Y. (2026). TransCLGA: Combining Local and Global Attention in Transformer for Temporal Action Detection. Instrumentation, 13(1). https://doi.org/10.15878/j.instr.202600298

Abstract

Temporal action detection is an important task in computer vision that aims to identify and localize specific actions or events occurring in a video. How to fully extract feature information and perform efficient, accurate feature fusion remains a central problem in temporal action detection. To address it, this paper proposes TransCLGA, a Transformer model Combining Local and Global Attention, which extracts rich feature information at each time scale and performs regulated multi-scale fusion. In the backbone network, a Hybrid Attention Module combines local multi-head attention with global multi-head attention, exploiting the Transformer's strength in modeling global information while compensating for its weakness in local feature extraction. A Channel Convolutional Block is introduced in this part to suppress the interference brought by global-local information extraction and to enhance the feature representation. In the neck of the model, we design a gated feature fusion pyramid that selectively retains key information to integrate features across time scales, enabling accurate prediction by the action detection head. The model was evaluated on two datasets (THUMOS14 and ActivityNet-1.3) with excellent results.
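To illustrate the idea behind the gated feature fusion described above, the following is a minimal NumPy sketch, not the authors' implementation: a learned sigmoid gate decides, per time step, how much of the coarse-scale and fine-scale features to retain before fusing them. The function and weight names (`gated_fuse`, `w`, `b`) and all shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fuse(coarse, fine, w, b):
    """Fuse two (T, C) feature maps with a learned per-position gate.

    The gate g in (0, 1) is computed from the concatenated features,
    so the fused output is a convex combination of the two scales --
    a simplified stand-in for a gated feature-pyramid fusion step.
    """
    g = sigmoid(np.concatenate([coarse, fine], axis=-1) @ w + b)  # (T, 1)
    return g * coarse + (1.0 - g) * fine

rng = np.random.default_rng(0)
T, C = 8, 4
coarse = rng.normal(size=(T, C))  # e.g. upsampled coarse-scale features
fine = rng.normal(size=(T, C))    # fine-scale features at the same length
w = rng.normal(size=(2 * C, 1))   # hypothetical gate weights
b = np.zeros(1)

fused = gated_fuse(coarse, fine, w, b)
print(fused.shape)  # (8, 4)
```

Because the gate is a convex weight, each fused value stays between the corresponding coarse and fine values, which is one way "selectively retaining key information" can be realized without discarding either scale outright.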


This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2026 Bin Zhang, Yinfeng Fang, Xuguang Zhang, Yun Zhang


Publication Facts

Metric                 This article    Other articles
Peer reviewers         2               2.4
Reviewer profiles      N/A

Author statements      This article    Other articles
Data availability      N/A             16%
External funding       Funders: Yes    32%
Competing interests    N/A             11%

Metric                 This journal    Other journals
Articles accepted      77%             33%
Days to publication    328             145
