IJSMT Journal

International Journal of Science, Strategic Management and Technology

An International, Peer-Reviewed, Open Access Scholarly Journal. Indexed in recognized academic databases · DOI via Crossref. The journal adheres to established scholarly publishing, peer-review, and research ethics guidelines set by the UGC.

ISSN: 3108-1762 (Online)

Plagiarism Passed
Peer reviewed
Open Access

AN ATTENTION U-NET BASED ON NOISE INCONSISTENCY FOR ROBUST IMAGE FORGERY DETECTION AND PRECISE PIXEL-LEVEL LOCALIZATION

AUTHORS:
P. Muniasamy
Mentor
Dr. B. Srinivasan
Affiliation
Department of Information Technology, Sri Ramakrishna Mission Vidyalaya College of Arts and Science (Autonomous and affiliated to Bharathiar University), Coimbatore
CC BY 4.0 License:
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract

With the rapid advancement of image editing tools, digitally manipulated images have become increasingly common, raising serious concerns about the authenticity and reliability of visual information across domains such as social media, journalism, healthcare, and legal investigations. Although recent deep learning approaches, especially U-Net and attention-based models, have demonstrated promising results in forgery detection and localization, many of these methods mainly depend on visual and texture-based features. As a result, they often struggle to maintain performance when images undergo post-processing operations like compression, resizing, or contrast adjustments.


To address these challenges, this work introduces a Noise-Inconsistency Guided Attention U-Net (NIGA-UNet) for robust image forgery detection and precise pixel-level localization. The proposed model focuses on capturing noise-related inconsistencies by extracting noise residuals and analyzing local variations, which are then incorporated into a noise-guided attention mechanism. This enables the network to better highlight manipulated regions. In addition, a dual-encoder U-Net architecture with multi-scale feature fusion is employed to effectively combine both visual and noise-domain information.
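The paper's full NIGA-UNet is not reproduced here, but the core idea described above, extracting a noise residual and analyzing its local variation to guide attention toward manipulated regions, can be sketched in a few lines. This is a minimal illustration, assuming a median-filter denoiser and a sliding-window variance map; the function names (`noise_residual`, `noise_attention_map`) and all parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def noise_residual(gray: np.ndarray, size: int = 3) -> np.ndarray:
    """High-frequency noise residual: image minus a median-filtered copy."""
    return gray - median_filter(gray, size=size)

def local_variance(residual: np.ndarray, win: int = 8) -> np.ndarray:
    """Per-pixel variance of the residual over a sliding window."""
    mean = uniform_filter(residual, size=win)
    mean_sq = uniform_filter(residual ** 2, size=win)
    return np.clip(mean_sq - mean ** 2, 0.0, None)

def noise_attention_map(gray: np.ndarray) -> np.ndarray:
    """Normalize local residual variance into a [0, 1] attention map."""
    var = local_variance(noise_residual(gray))
    span = var.max() - var.min()
    return (var - var.min()) / span if span > 0 else np.zeros_like(var)

# Synthetic demo: paste a patch whose noise statistics differ from the host image.
rng = np.random.default_rng(0)
img = rng.normal(0.5, 0.02, (64, 64))
img[20:40, 20:40] += rng.normal(0.0, 0.15, (20, 20))  # simulated spliced region
attn = noise_attention_map(img)  # high values concentrate on the pasted patch
```

In the full model such a map would not be hand-crafted: the noise-domain encoder learns its own residual features, and the attention weights are produced by trainable gates that modulate the visual-stream features at each scale of the U-Net decoder.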

HOW TO CITE

APA
Muniasamy, P. (2026). An Attention U-Net Based on Noise Inconsistency for Robust Image Forgery Detection and Precise Pixel-Level Localization. International Journal of Science, Strategic Management and Technology, 02(04). https://doi.org/10.55041/ijsmt.v2i3.445

MLA
Muniasamy, P. "An Attention U-Net Based on Noise Inconsistency for Robust Image Forgery Detection and Precise Pixel-Level Localization." International Journal of Science, Strategic Management and Technology, vol. 02, no. 04, 2026. doi:10.55041/ijsmt.v2i3.445.

Chicago
Muniasamy, P. "An Attention U-Net Based on Noise Inconsistency for Robust Image Forgery Detection and Precise Pixel-Level Localization." International Journal of Science, Strategic Management and Technology 02, no. 04 (2026). https://doi.org/10.55041/ijsmt.v2i3.445.

Ethics and Compliance
✓ All ethical standards met
This article has undergone plagiarism screening and double-blind peer review. Editorial policies have been followed. Authors retain copyright under CC BY-NC 4.0 license. The research complies with ethical standards and institutional guidelines.