
Enhancing visual quality of spatial image steganography using SqueezeNet deep learning network

Bibliographic Details
Published in: Multimedia Tools and Applications, 2021-11, Vol. 80 (28-29), p. 36093-36109
Main Authors: Hamid, Nagham; Sumait, Balasem Salem; Bakri, Bilal Ibrahim; Al-Qershi, Osamah
Format: Article
Language:English
Description
Summary: The aims of improving a steganographic method fall into two groups: the first is to make the hiding capacity as high as possible; the second is to make the visible distortion as low as possible. The higher the visual quality of the stego-image, the less suspicious it becomes, which can increase security. However, the distortion caused by embedding data into images is not predictable and is typically image dependent. If the user has a database of possible cover images, finding a suitable cover image that can sustain high visual quality after embedding is challenging. Thus, an automatic cover selection method is needed. In this paper, the problem of visual quality of the stego-image is tackled as a classification problem, where a CNN-based classifier is employed to select images that retain high imperceptibility after the embedding process. To achieve that, a CNN was trained to classify images into "High Quality" and "Low Quality". The CNN was based on the SqueezeNet architecture and was trained in two scenarios: transfer learning and learning from scratch. The two classifiers achieved very high classification performance, with F1 scores of 0.926 and 0.904, respectively.
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-021-11315-y
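
Note: As an illustration of the classification approach described in the summary, the following is a minimal sketch of adapting torchvision's SqueezeNet for a two-class "High Quality" / "Low Quality" cover-selection task. It is not the authors' implementation; the SqueezeNet variant, pretrained-weight choice, input size, and all hyperparameters are assumptions for demonstration only.

    # Hypothetical sketch: SqueezeNet as a binary cover-selection classifier.
    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_CLASSES = 2  # "High Quality" vs "Low Quality" cover-image candidates

    def build_squeezenet(transfer_learning: bool = True) -> nn.Module:
        """Return a SqueezeNet classifier: pretrained weights for the
        transfer-learning scenario, random initialization for training
        from scratch."""
        weights = (models.SqueezeNet1_1_Weights.IMAGENET1K_V1
                   if transfer_learning else None)
        model = models.squeezenet1_1(weights=weights)
        # Replace the final 1x1 convolution so the network outputs two logits.
        model.classifier[1] = nn.Conv2d(512, NUM_CLASSES, kernel_size=1)
        model.num_classes = NUM_CLASSES
        return model

    if __name__ == "__main__":
        model = build_squeezenet(transfer_learning=True)
        dummy = torch.randn(1, 3, 224, 224)  # one RGB cover-image candidate
        logits = model(dummy)                # shape: (1, 2)
        print(logits.shape)

Passing transfer_learning=False corresponds to the paper's from-scratch scenario, where the same architecture is trained with randomly initialized weights rather than ImageNet-pretrained ones.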