Watermark and data hiding evaluation: The development of a statistical analysis framework

Hyung Cook Kim, Purdue University

Abstract

Digital watermarking is the practice of hiding a message in an image, audio, video, or other digital media element. Since the late 1990s, there has been an explosion in the number of published digital watermarking algorithms, but there are few widely accepted tools and metrics for validating the performance claims asserted by members of the research community. Robust image watermarks are watermarks designed to survive attacks, including signal processing operations and spatial transformations. To evaluate robust watermarks, we need to evaluate how attacks affect the fidelity of an image. The mean square error (MSE) is the most popular fidelity metric, but as defined it cannot measure fidelity for images subjected to geometric attacks such as rotation, pixel loss attacks such as cropping, or valumetric attacks such as gamma correction. We therefore evaluate attacks using MSE after compensating for valumetric, pixel loss, and geometric attacks using the conditional mean, error concealment, and motion compensation, respectively. Robust watermarks are evaluated in terms of fidelity and robustness. To measure robustness, the literature currently uses the bit error rate, the message error rate, and the receiver operating characteristic (ROC) of the watermark decoder and detector. We extend this framework by adopting reliability testing. We define reliability as the probability that a watermarking algorithm will correctly detect or decode a watermark for a specified fidelity requirement under a given set of attacks and images. We evaluate three watermarking algorithms in terms of quality (fidelity), load (watermark strength, payload), capacity (minimum watermark strength, maximum payload), and performance (robustness). To summarize fidelity and performance, we adopt the Taguchi loss function, which is a compromise between an average and a percentile.
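The two fidelity measures named above can be sketched briefly. The following is a minimal illustration, not the testbed's actual implementation: it computes the MSE between an original image and an attacked image (after any compensation has already been applied, so the arrays are aligned and equal in size), and summarizes a set of per-image scores with a quadratic loss in the Taguchi style, which penalizes large deviations more heavily than a plain average. Function names and the choice of NumPy arrays are illustrative assumptions.

```python
import numpy as np

def mse(original, attacked):
    """Mean square error between two equal-size images.

    Assumes the attacked image has already been compensated
    (e.g., rotation undone, cropped pixels concealed) so that
    it is pixel-aligned with the original.
    """
    original = np.asarray(original, dtype=np.float64)
    attacked = np.asarray(attacked, dtype=np.float64)
    return float(np.mean((original - attacked) ** 2))

def taguchi_loss(scores, target=0.0, k=1.0):
    """Quadratic (Taguchi-style) loss summary of per-image scores.

    Returns k times the mean squared deviation from the target value.
    Because deviations are squared, outliers dominate more than in a
    plain average but less than in a worst-case percentile, which is
    the compromise the abstract refers to.
    """
    scores = np.asarray(scores, dtype=np.float64)
    return float(k * np.mean((scores - target) ** 2))
```

For example, `mse([[0, 0], [0, 0]], [[2, 2], [2, 2]])` yields `4.0`, and `taguchi_loss([1.0, 3.0])` yields `5.0`, larger than the mean score of `2.0` because the deviation of `3.0` is weighted quadratically.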
To facilitate the use of a watermark evaluation method, we need a watermark evaluation benchmark that implements that method. To meet this need, we developed the watermark evaluation testbed (WET). The system consists of reference software that includes watermark embedders and detectors, attacks, evaluation modules, and an image database.

Degree

Ph.D.

Advisors

Delp, Purdue University.

Subject Area

Electrical engineering
