Abstract

Many proposed image watermarking techniques are sensitive to geometric distortions such as rotation, scaling, and translation. Geometric distortion, even by a slight amount, can disable a watermark decoder. In this study, a geometric distortion-invariant watermarking technique is designed by utilizing Tschebycheff moments of the original image to estimate the geometric distortion parameters of corrupted watermarked images. The Tschebycheff moments of the original image can serve as a private key for watermark extraction. The embedding process is a closed-loop system that modifies the embedding intensity according to the results of the performance analysis, and the convergence of this closed-loop system is proved. Unlike earlier heuristic methods, the optimal blind watermark detector is designed with the introduction of dual-channel detection, combining high-order spectra detection and likelihood detection. Even with a small signal-to-noise ratio (SNR), the detector can still achieve a satisfactory detection probability if there is enough high-order spectra information; when the high-order spectra are small, the dual-channel detection system reduces to a likelihood detection system. The watermark decoder extracts the watermark blindly utilizing independent component analysis (ICA). The computational aspects of the proposed watermarking technique are also discussed in detail. Experimental results demonstrate that the proposed watermarking technique is robust with respect to attacks performed by the popular watermark benchmark, StirMark.

© 2007 Optical Society of America

1. Introduction

Digital watermarking is an effective means of resolving copyright protection and information security problems by embedding watermark information into protected digital media [1]. In watermarking applications, robustness against geometric attacks is essential. Geometric distortions inevitably damage the synchronization that is crucial for correct detection: even a minor geometric manipulation can dramatically reduce the detector's effectiveness. Despite the efforts and progress made in this direction, watermark robustness against geometric distortions has not been well addressed and remains an open problem. The effect of geometric attacks can be better understood by drawing an analogy between a watermarking system and a communication system: the principle of synchronization between the encoder and the decoder in a communication system holds for the watermarking system as well.

Research has been done to address the watermark's vulnerability to geometric distortions. Various methods have been proposed [2,3,4], which can be summarized and categorized as follows. The first category is based on distortion inversion. A registration pattern is inserted into the host signal along with the watermark [5], or the watermark is designed with a special structure [6], so that in the watermark detection stage the involved geometric distortions can be identified and measured; the geometric distortions can then be removed by an inversion process. The second category is based on image normalization [7]. An image can be normalized to a certain position, orientation, and size that are invariant to image translation, rotation, and scaling. The host image is normalized prior to watermark insertion, and the watermarked image is denormalized back to its original appearance after the watermark insertion. At the watermark detector, the test image has to undergo the same normalization process before watermark detection. The third category is based on invariance properties of certain image features. Image features that have been used for invariant watermarking include Fourier-Mellin coefficients [8], geometric moment invariants [9], and Zernike moments [10].

Moments of orthogonal basis functions, such as Legendre and Zernike polynomials [11], can be used to represent an image by a set of mutually independent descriptors with a minimal amount of information redundancy. However, these moments present several problems, the most important of which is their inaccuracy due to quantization errors introduced by the discrete approximation of continuous integrals and of the coordinate space. Tschebycheff moments were proposed [12] to address these problems. They are discrete orthogonal moments and present a number of advantages over moments of continuous orthogonal bases [13]. Mukundan's study showed that the implementation of Tschebycheff moments does not involve any numerical approximation, since the basis set is orthogonal in the discrete domain of the image coordinate space. This property makes Tschebycheff moments superior to conventional continuous orthogonal moments in terms of preserving the analytical properties needed to ensure minimal information redundancy in a moment set. Yap also proposed a local watermarking technique based on Krawtchouk moments [14], which is a private method.

Observing the traditional watermark embedding process shown in Fig. 1, one finds that it is an open-loop system from the viewpoint of system structure. To ensure invisibility, the embedding intensity of traditional watermarking must be limited. The watermarking technique proposed in this paper is the closed-loop system shown in Fig. 2.

 

Fig. 1. Traditional watermark embedding system.


 

Fig. 2. Closed-loop watermark embedding system.


Recently, blind source separation by independent component analysis (ICA) has received much attention because of its potential applications in signal processing, such as speech recognition systems, medical signal processing, and telecommunications. ICA is a general-purpose statistical technique that extracts the linear transformation of a given set of observed data such that the resulting variables are as statistically independent as possible.

Higher-order spectra retain both the amplitude and phase information of the Fourier transform of a signal. They are translation-invariant because the linear phase terms cancel in the products of Fourier coefficients that define them. Functions defined from higher-order spectra can serve as features for pattern recognition and can satisfy other desirable invariance properties such as scaling, amplification, and rotation invariance. Higher-order spectra are zero for Gaussian noise and thus give such features high noise immunity.

A geometric distortion-invariant watermarking scheme is proposed that utilizes the Tschebycheff moments of the original image to estimate the geometric distortion parameters. This estimation method can be used as a preprocessing step of watermark detection to recover the synchronization of the watermarking system. The watermark is generated randomly, independently of the original image, and embedded by modifying perceptual Tschebycheff moments of the original image. The embedding intensity is modified according to the results of the performance analysis to obtain optimal embedding; the initial embedding intensity can be selected randomly. The convergence of the closed-loop system is proved. An optimal blind watermark detector is designed utilizing dual-channel detection, with the introduction of high-order spectra detection and likelihood detection. ICA is used to extract the watermark blindly. Computational aspects of the proposed watermarking are discussed in detail. Experimental results demonstrate that the proposed watermarking is robust against the attacks of the watermark benchmark StirMark.

2. Background

This section describes in detail how to estimate geometric distortion parameters utilizing one or more Tschebycheff moments of the original image.

2.1 Definition of Tschebycheff moments

Discrete Tschebycheff moments with order p + q of image f(x, y) are defined as [12]

T_{pq} = \frac{1}{\rho(p,N)\,\rho(q,N)} \sum_{x=0}^{N-1}\sum_{y=0}^{N-1} t_p(x)\,t_q(y)\,f(x,y), \qquad p,q = 0,1,2,\ldots,N-1,

where \rho(n,N) = \sum_{x=0}^{N-1}\{t_n(x)\}^2 = \frac{N(N^2-1)(N^2-2^2)\cdots(N^2-n^2)}{2n+1} = (2n)!\binom{N+n}{2n+1}.

The discrete Tschebycheff polynomials are expressed as

t_n(x) = n!\sum_{k=0}^{n}(-1)^{n-k}\binom{N-1-k}{n-k}\binom{n+k}{n}\binom{x}{k}.

The inverse moment transform can be defined as

f(x,y) = \sum_{p=0}^{N-1}\sum_{q=0}^{N-1} T_{pq}\,t_p(x)\,t_q(y).
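As a concrete check of these definitions, the following sketch (in Python with NumPy; the function names are ours, not from the paper) builds the discrete Tschebycheff polynomials, verifies the orthogonality relation \sum_x t_m(x)t_n(x) = \rho(n,N)\delta_{mn}, and confirms that the forward and inverse moment transforms are exact inverses on a random image:

```python
import numpy as np
from math import comb, factorial

def tcheb_matrix(N):
    """Rows are the discrete Tschebycheff polynomials t_0 .. t_{N-1} on x = 0..N-1."""
    P = np.zeros((N, N))
    for n in range(N):
        for x in range(N):
            s = sum((-1)**(n - k) * comb(N - 1 - k, n - k) * comb(n + k, n) * comb(x, k)
                    for k in range(n + 1))
            P[n, x] = factorial(n) * s
    return P

def rho(n, N):
    """Squared norm rho(n, N) = (2n)! * C(N+n, 2n+1)."""
    return factorial(2 * n) * comb(N + n, 2 * n + 1)

N = 8
P = tcheb_matrix(N)
R = np.array([rho(n, N) for n in range(N)], dtype=float)

# orthogonality: sum_x t_m(x) t_n(x) = rho(n, N) * delta_mn
assert np.allclose(P @ P.T, np.diag(R))

# forward moment transform and inverse transform are exact inverses
rng = np.random.default_rng(1)
f = rng.random((N, N))
T = (P @ f @ P.T) / np.outer(R, R)   # T_pq = (1/(rho_p rho_q)) sum t_p t_q f
f_rec = P.T @ T @ P                  # f(x,y) = sum_pq T_pq t_p(x) t_q(y)
assert np.allclose(f, f_rec)
```

Because the basis is orthogonal on the discrete grid itself, the reconstruction is exact up to floating-point error, with no quantization of a continuous integral.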

2.2 Relationship of geometric moments and Tschebycheff moments

If geometric moments with p + q order of an image f(x, y) are expressed as [15]

m_{pq} = \sum_{x=0}^{N-1}\sum_{y=0}^{N-1} x^p y^q f(x,y),

Tschebycheff moments can be simplified as

T_{pq} = \frac{1}{\tilde\rho(p,N)\,\tilde\rho(q,N)} \sum_{i=0}^{p}\sum_{j=0}^{q} c_{j,q,N}\,c_{i,p,N}\,m_{i,j}.

It is seen that Tschebycheff moments depend on geometric moments up to the same order. The explicit expressions of Tschebycheff moments in terms of geometric moments up to second order [for β(n,N) = N^n] are as follows:

T_{00} = \frac{m_{00}}{N^2},\qquad T_{10} = \frac{6m_{10}+3(1-N)m_{00}}{N(N^2-1)},\qquad T_{01} = \frac{6m_{01}+3(1-N)m_{00}}{N(N^2-1)},
T_{20} = \frac{30m_{20}+30(1-N)m_{10}+5(1-N)(2-N)m_{00}}{(N^2-1)(N^2-2^2)},\qquad T_{02} = \frac{30m_{02}+30(1-N)m_{01}+5(1-N)(2-N)m_{00}}{(N^2-1)(N^2-2^2)},
T_{11} = \frac{36m_{11}+18(1-N)(m_{10}+m_{01})+9(1-N)^2 m_{00}}{(N^2-1)^2}.
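The low-order closed forms can be verified numerically against the direct definition. The sketch below (our own illustration, using the β(n,N) = N^n scaling, under which the scaled polynomials are t̃_0 = 1 and t̃_1(x) = (2x+1−N)/N with norms ρ̃(0,N) = N and ρ̃(1,N) = (N²−1)/(3N)) checks T_00 and T_10 on a random image:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16
f = rng.random((N, N))
x = np.arange(N)

m00 = f.sum()
m10 = (x[:, None] * f).sum()          # x indexes rows here

# closed forms in terms of geometric moments (beta(n,N) = N^n scaling)
T00 = m00 / N**2
T10 = (6*m10 + 3*(1 - N)*m00) / (N * (N**2 - 1))

# direct computation with the scaled polynomials and norms
t1 = (2*x + 1 - N) / N
T00_direct = f.sum() / (N * N)
T10_direct = (t1[:, None] * f).sum() / (((N**2 - 1) / (3*N)) * N)

assert np.isclose(T00, T00_direct)
assert np.isclose(T10, T10_direct)
```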

2.3 Property of geometric transform

Geometric attacks are common techniques applied to images that do not actually remove the embedded watermark itself but distort the synchronization of the watermark detector. The transformation sequences applied to images are important, since a translation followed by a rotation is not necessarily equivalent to the converse. Geometric distortions can be written as:

\begin{bmatrix} x_2 \\ y_2 \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} + \begin{bmatrix} e \\ f \end{bmatrix},

where (x_1, y_1) is a pixel of the input image and (x_2, y_2) is the corresponding pixel of the output image.

Here, some properties that hold for geometric distortions are described:

Property 1: A geometric distortion is defined uniquely by three pairs of points. That is, if a, b, and c are non-collinear points and a′, b′, and c′ are their corresponding points, then there exists a unique geometric distortion mapping each point to its corresponding point. According to this property, given three pairs of corresponding points, the geometric distortion parameters can be computed by solving linear equations.

Property 2: Ratios of triangle areas are preserved. That is, given two sets of non-collinear points, {a, b, c} and {d, e, f} (not necessarily distinct from a, b, and c), if T is a geometric distortion, then \frac{\Delta_{abc}}{\Delta_{def}} = \frac{\Delta_{T(a)T(b)T(c)}}{\Delta_{T(d)T(e)T(f)}}, where \Delta_x denotes the area of triangle x.

These properties are important for our method to estimate the parameters of geometric distortions.

3. Methodology of geometric distortion parameters estimation by Tschebycheff moments

3.1 Rotation angle estimation

The rotation operator performs a geometric transform that maps the position of a picture element in an input image onto a position in the corresponding output image by rotating it through a user-specified angle θ about an origin. We define f(x′, y′) as the rotated watermarked image; that is, pixel (x′, y′) is obtained by rotating pixel (x, y) by θ degrees. We assume that the watermarked image is rotated about the center, normalized as (0,0). The rotation operator can be expressed as

\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}.

Note that m_{00} is rotation-invariant. The relationships of the second-order geometric moments are

m'_{20} = \frac{1+\cos 2\theta}{2}\,m_{20} - (\sin 2\theta)\,m_{11} + \frac{1-\cos 2\theta}{2}\,m_{02},\qquad m'_{02} = \frac{1-\cos 2\theta}{2}\,m_{20} + (\sin 2\theta)\,m_{11} + \frac{1+\cos 2\theta}{2}\,m_{02},
m'_{11} = \frac{\sin 2\theta}{2}\,m_{20} + (\cos 2\theta)\,m_{11} - \frac{\sin 2\theta}{2}\,m_{02},

where m_{20}, m_{02}, and m_{11} are geometric moments of the original image and m'_{20}, m'_{02}, and m'_{11} are geometric moments of the rotated watermarked image. It can easily be deduced that

m'_{20} + m'_{02} = m_{20} + m_{02}.

That is, m_{20} + m_{02} is rotation-invariant. The relationships of the first-order Tschebycheff moments are

T'_{10} = T_{10}\cos\theta - T_{01}\sin\theta + \frac{3N}{N+1}\,T_{00}(\cos\theta - \sin\theta - 1),
T'_{01} = T_{10}\sin\theta + T_{01}\cos\theta + \frac{3N}{N+1}\,T_{00}(\cos\theta + \sin\theta - 1).

So it can be obtained that

T'_{10}\cos\theta + T'_{01}\sin\theta = T_{10} + \frac{3(1-N)N^2}{N(N^2-1)}(\cos\theta + \sin\theta - 1)\,T_{00},

where T_{00}, T_{10} and T'_{10}, T'_{01} are Tschebycheff moments of the original and corrupted images, respectively. Set \Delta = \left| T'_{10}\cos\theta + T'_{01}\sin\theta - T_{10} - \frac{3(1-N)N^2}{N(N^2-1)}(\cos\theta + \sin\theta - 1)\,T_{00} \right|. Then θ can be computed by numerical analysis: once θ satisfies Δ < ε, the estimate is obtained. Here ε is a sufficiently small positive value, set to ε = 10^{-5} in our experiments.
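Assuming the first-order rotation relations above, the angle search can be sketched as a brute-force scan for the θ that drives Δ toward zero. The moment values below are hypothetical stand-ins (not measurements); the rotated-image moments are synthesized from the forward relations and then θ is recovered:

```python
import numpy as np

N = 64
T00, T10, T01 = 0.5, 0.12, -0.07          # hypothetical moments of the original
theta_true = np.deg2rad(23.0)

C = 3 * N / (N + 1) * T00
# first-order Tschebycheff moments after rotation by theta (relations of Sec. 3.1)
T10p = (T10*np.cos(theta_true) - T01*np.sin(theta_true)
        + C*(np.cos(theta_true) - np.sin(theta_true) - 1))
T01p = (T10*np.sin(theta_true) + T01*np.cos(theta_true)
        + C*(np.cos(theta_true) + np.sin(theta_true) - 1))

def delta(th):
    rhs = T10 + 3*(1 - N)*N**2 / (N*(N**2 - 1)) * (np.cos(th) + np.sin(th) - 1) * T00
    return np.abs(T10p*np.cos(th) + T01p*np.sin(th) - rhs)

# numerical search: scan a fine grid and take the angle minimizing Delta
thetas = np.linspace(-np.pi/2, np.pi/2, 200001)
theta_est = thetas[np.argmin(delta(thetas))]
assert abs(theta_est - theta_true) < 1e-3
```

In practice a root-finding method (e.g., bisection on Δ) would replace the grid scan, but the grid makes the single-root behavior on (−π/2, π/2) easy to see.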

3.2 Scaling factor estimation

The scale operator performs a geometric transformation that can be used to shrink or enlarge the size of an image (or part of an image). Scaling can be divided into two categories. One is symmetric scaling, in which the scaling factor is the same in all directions. The other is asymmetric scaling, sometimes called shearing, in which the scaling factors differ between directions. Since the Tschebycheff transform computes Tschebycheff moments for images of the same size in both directions, only symmetric scaling is considered. Suppose f(x/a, y/a) is the scaled watermarked image, f(x, y) is the original image, and a is the scaling factor. The relationship of geometric moments of order p + q can be expressed as

m'_{pq} = a^{p+q+2}\, m_{pq}.

It can be deduced that

m'_{00} = a^2 m_{00}.

It can be obtained that

\frac{m'_{pq}}{(m'_{00})^{(p+q+2)/2}} = \frac{m_{pq}}{(m_{00})^{(p+q+2)/2}}.

We define \eta_{pq} = \mu_{pq}/(\mu_{00})^{(p+q+2)/2}, which is translation- and scaling-invariant, where \mu_{pq} is the central moment [15]. The original image size is N × N and the scaled watermarked image size is N′ × N′. So the scaling factor can be estimated by

a = \sqrt[3]{\frac{N'(N'^2-1)\,T'_{10} - 3(1-N')N'^2\, T'_{00}}{N(N^2-1)\,T_{10} - 3(1-N)N^2\, T_{00}}},
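The estimator can be exercised on synthetic moments that obey m′_pq = a^{p+q+2} m_pq exactly (a numerical sketch of the algebra rather than of a resampled image, since pixel interpolation would add error on top of the identity being checked):

```python
import numpy as np

rng = np.random.default_rng(2)
N, a = 16, 2.0
Np = int(a * N)                       # N' = aN, size of the scaled image
f = rng.random((N, N))
x = np.arange(N)

m = lambda p, q: ((x[:, None]**p) * (x[None, :]**q) * f).sum()

# geometric moments transform as m'_pq = a**(p+q+2) * m_pq under scaling
m00, m10 = m(0, 0), m(1, 0)
m00p, m10p = a**2 * m00, a**3 * m10

# Tschebycheff moments of both images from their geometric moments
T00, T10 = m00 / N**2, (6*m10 + 3*(1 - N)*m00) / (N*(N**2 - 1))
T00p, T10p = m00p / Np**2, (6*m10p + 3*(1 - Np)*m00p) / (Np*(Np**2 - 1))

# cube-root ratio estimator: both bracketed terms reduce to 6*m10 and 6*m'10
num = Np*(Np**2 - 1)*T10p - 3*(1 - Np)*Np**2*T00p
den = N*(N**2 - 1)*T10 - 3*(1 - N)*N**2*T00
a_est = (num / den) ** (1/3)
assert np.isclose(a_est, a)
```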

where T_{00}, T_{10} and T'_{00}, T'_{10} are Tschebycheff moments of the original and corrupted images, respectively.

3.3 Translation parameter estimation

The translate operator performs a geometric transformation that maps the position of each picture element in an input image into a new position in an output image. Under the translation, an image element located at (x_1, y_1) in the original image is shifted to a new position in the corresponding output image by displacing it through a user-specified translation (c, d). The translation operator can be expressed as

\begin{cases} x' = x + c \\ y' = y + d, \end{cases}

where c and d are the translation parameters in the x and y directions, respectively. The Tschebycheff moments of the original image can be computed as

T_{10} = \frac{6\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} x f(x,y) + 3(1-N)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)}{N(N^2-1)},\qquad T_{01} = \frac{6\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} y f(x,y) + 3(1-N)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)}{N(N^2-1)}.

The Tschebycheff moments of the translated watermarked image can be expressed as

T'_{10} = \frac{6\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} (x+c) f(x,y) + 3(1-N)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)}{N(N^2-1)},\qquad T'_{01} = \frac{6\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} (y+d) f(x,y) + 3(1-N)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)}{N(N^2-1)}.

So it can be deduced that

T'_{10} = \frac{6\sum\sum x f(x,y) + 3(1-N)\sum\sum f(x,y)}{N(N^2-1)} + \frac{6\sum\sum c\, f(x,y)}{N(N^2-1)} = T_{10} + \frac{6cN\,T_{00}}{N^2-1},
T'_{01} = \frac{6\sum\sum y f(x,y) + 3(1-N)\sum\sum f(x,y)}{N(N^2-1)} + \frac{6\sum\sum d\, f(x,y)}{N(N^2-1)} = T_{01} + \frac{6dN\,T_{00}}{N^2-1}.

From Eq. (20), c and d can be estimated by Tschebycheff moments

c = \frac{(N^2-1)(T'_{10} - T_{10})}{6N\,T_{00}},\qquad d = \frac{(N^2-1)(T'_{01} - T_{01})}{6N\,T_{00}}.
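A minimal sketch of the translation estimator, assuming image content surrounded by a zero border so that a circular shift acts as a true translation (the helper function below is ours):

```python
import numpy as np

def first_order_tcheb(f):
    """T00, T10, T01 under the beta(n,N) = N^n scaling, from geometric moments."""
    N = f.shape[0]
    x = np.arange(N)
    m00 = f.sum()
    m10 = (x[:, None] * f).sum()      # x indexes rows
    m01 = (x[None, :] * f).sum()
    T00 = m00 / N**2
    T10 = (6*m10 + 3*(1 - N)*m00) / (N*(N**2 - 1))
    T01 = (6*m01 + 3*(1 - N)*m00) / (N*(N**2 - 1))
    return T00, T10, T01

N, c, d = 32, 3, 5
rng = np.random.default_rng(3)
f = np.zeros((N, N))
f[8:16, 8:16] = rng.random((8, 8))            # content away from the borders
g = np.roll(f, shift=(c, d), axis=(0, 1))     # translate by (c, d); no wrap-around here

T00, T10, T01 = first_order_tcheb(f)
_, T10p, T01p = first_order_tcheb(g)

c_est = (N**2 - 1) * (T10p - T10) / (6 * N * T00)
d_est = (N**2 - 1) * (T01p - T01) / (6 * N * T00)
assert round(c_est) == c and round(d_est) == d
```

Because m′_10 = m_10 + c·m_00 holds exactly for an in-frame shift, the recovery is exact up to rounding.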

3.4 Combined distortion parameter estimation: scaling and rotation

Suppose (x_1, y_1), (x′, y′), and (x_2, y_2) are pixels of the original image, the scaled watermarked image, and the corrupted watermarked image, respectively, with a as the scaling factor. The corrupted image is obtained by scaling the watermarked image by a and then rotating the scaled image by θ degrees counterclockwise. That is,

\begin{cases} x_2 = x'\cos\theta - y'\sin\theta = a x_1\cos\theta - a y_1\sin\theta \\ y_2 = x'\sin\theta + y'\cos\theta = a x_1\sin\theta + a y_1\cos\theta. \end{cases}

Suppose T_{10}, T_{01}; T'_{10}, T'_{01}; and T''_{10}, T''_{01} are Tschebycheff moments of the original image, the corrupted watermarked image, and the scaled watermarked image, respectively. The sizes of the original image, the corrupted watermarked image, and the scaled watermarked image are N × N, N′ × N′, and N″ × N″, respectively. From the previous sections, it can be deduced that

\begin{cases}
N''(N''^2-1)T''_{10} - 3(1-N'')N''^2 T''_{00} = a^3\left[N(N^2-1)T_{10} - 3(1-N)N^2 T_{00}\right] \\
N''(N''^2-1)T''_{01} - 3(1-N'')N''^2 T''_{00} = a^3\left[N(N^2-1)T_{01} - 3(1-N)N^2 T_{00}\right] \\
N'(N'^2-1)T'_{10} - 3(1-N')N'^2 T'_{00} = N''(N''^2-1)(T''_{10}\cos\theta - T''_{01}\sin\theta) - 3(1-N'')N''^2 T''_{00}(\cos\theta - \sin\theta) \\
N'(N'^2-1)T'_{01} - 3(1-N')N'^2 T'_{00} = N''(N''^2-1)(T''_{10}\sin\theta + T''_{01}\cos\theta) - 3(1-N'')N''^2 T''_{00}(\cos\theta + \sin\theta).
\end{cases}

The scaling factor and rotation angle can be estimated as

\begin{cases}
a = \sqrt{\dfrac{N'^2\, T'_{00}}{N^2\, T_{00}}} \\
N'(N'^2-1)(T'_{10}\cos\theta + T'_{01}\sin\theta) - 3(1-N')N'^2 T'_{00}(\cos\theta + \sin\theta) = a^3\left[N(N^2-1)T_{10} - 3(1-N)N^2 T_{00}\right].
\end{cases}

3.5 Combined distortion parameter estimation: scaling and translation

Suppose (x_1, y_1) and (x_2, y_2) are pixels of the original image and of the watermarked image translated by c and d in the x and y directions, respectively. Suppose scaling followed by translation is applied to the watermarked image as in the previous section; it can be deduced that

\begin{cases} x_2 = a x_1 + c \\ y_2 = a y_1 + d. \end{cases}

Suppose T_{10}, T_{01} and T'_{10}, T'_{01} are Tschebycheff moments of the original image and the corrupted watermarked image, respectively. The sizes of the original image and the corrupted watermarked image are N × N and N′ × N′, respectively. From the previous sections, it can be deduced that

a = \sqrt{\frac{N'^2\, T'_{00}}{N^2\, T_{00}}},
c = \frac{N'(N'^2-1)T'_{10} - 3(1-N')N'^2 T'_{00} - a^3\left[N(N^2-1)T_{10} - 3(1-N)N^2 T_{00}\right]}{6N'^2\, T'_{00}},
d = \frac{N'(N'^2-1)T'_{01} - 3(1-N')N'^2 T'_{00} - a^3\left[N(N^2-1)T_{01} - 3(1-N)N^2 T_{00}\right]}{6N'^2\, T'_{00}}.

4. Methodology of geometric distortion invariant watermarking

In this section, the proposed geometric invariant watermarking scheme based on Tschebycheff moments and dual channel detection is described.

4.1 Watermark embedding process

The watermark is imperceptibly embedded into the Tschebycheff moments of the original image, and the embedding intensity is modified according to the results of the performance analysis. The embedding steps of the proposed watermarking system are described as follows:

Step 1: Creation of the watermark. The watermark is generated independently of the original image and has the same size as the original image.

Step 2: The watermark embedding process. Sets of low-order Tschebycheff moments T_o = \{T^o_{m_1,n_1}, \ldots, T^o_{m_k,n_k}\} and T_w = \{T^w_{m_1,n_1}, \ldots, T^w_{m_k,n_k}\} are constructed from the original image f(x, y) and the watermark W, respectively. The watermark is embedded by adjusting the Tschebycheff moments of f(x, y) with

\hat T_{m_i,n_i} = T^o_{m_i,n_i} + N_{wk}(m_i,n_i)\, T^w_{m_i,n_i},

where N_{wk}(m_i,n_i) is the embedding intensity.

Then a set of adjusted Tschebycheff moments T̂ = {T̂m1,n1, ⋯T̂mk,nk} is obtained. The closed-loop watermark embedding process is as follows:

  1. Embed the watermark with an initial intensity N_{w0}, selected randomly. Suppose the initial step for modification of the embedded intensity is q_0. Set q_0 = ρN_{w0}, where 0 < ρ < 1. Set k = 1.
  2. Detect whether or not the embedded watermark renders a visible artifact. Both subjective and objective evaluation can be used to estimate the quality of the embedded watermark. Since a subjective evaluation cannot be performed by a computer automatically, an objective evaluation, the peak signal-to-noise ratio (PSNR), is utilized to describe the difference between the watermarked image y^{(k)}_{i,j} and the original image x_{i,j}, defined as

    \mathrm{PSNR}_k = 10\log\left\{ x_{\max}^2 \Big/ \left[ \frac{1}{N^2} \sum_{i=1}^{N}\sum_{j=1}^{N} \big(x_{i,j} - y^{(k)}_{i,j}\big)^2 \right] \right\},

  3. where x_max is the maximum luminance of the original image pixels. Suppose the threshold for PSNR is PSNR_0. If PSNR_k ≥ PSNR_0, then the embedded watermark is invisible. Otherwise the embedded watermark is visible; that is, it renders a visible artifact.

    If PSNR_k < PSNR_0, find an integer n_k > 0 such that the watermark embedded with intensity N_{wk} = N_{wk-1} − n_k q_{k-1} is invisible while the watermark embedded with intensity N_{wk-1} − (n_k − 1)q_{k-1} is visible; N_{wk} is then the maximal probable embedded intensity. Set q_k = ρq_{k-1}, k = k + 1, and go to (4). If PSNR_k ≥ PSNR_0, find an integer n_k > 0 such that the watermark embedded with intensity N_{wk} = N_{wk-1} + n_k q_{k-1} is invisible while the watermark embedded with intensity N_{wk-1} + (n_k + 1)q_{k-1} is visible; N_{wk} is the maximal probable embedded intensity. Set q_k = ρq_{k-1}, k = k + 1.

  4. Calculate the detection probability p_d and the false alarm probability p_f to test whether the embedded watermark meets the requirements of different applications. Suppose the thresholds of p_d and p_f are p_{d0} and p_{f0}, respectively. The relationship between the probabilities and the embedded watermark is shown in Table 1.

Table 1. The relation between the probabilities and the embedded watermark

p_d < p_{d0} or p_f > p_{f0} means that the watermarking does not meet the requirements. Increase the embedded intensity until it does. That is, find an integer m_{k-1} > 0 such that the watermark embedded with intensity N_{wk-1} + m_{k-1}ρq_{k-1} yields p_d < p_{d0} or p_f > p_{f0} while the embedded intensity N_{wk} = N_{wk-1} + ρ(m_{k-1} + 1)q_{k-1} yields p_d ≥ p_{d0} and p_f ≤ p_{f0}; set q_k = ρq_{k-1}, k = k + 1, and go to (2). If the embedded watermark meets the requirements, that is, p_d ≥ p_{d0} and p_f ≤ p_{f0}, find an integer m_{k-1} > 0 such that the watermark embedded with intensity N_{wk} = N_{wk-1} − ρm_{k-1}q_{k-1} yields p_d ≥ p_{d0} and p_f ≤ p_{f0} while the embedded intensity N_{wk-1} − ρ(m_{k-1} + 1)q_{k-1} yields p_d < p_{d0} or p_f > p_{f0}; then N_{wk} is the final value of the embedded intensity of the invisible watermark.

From the above, it can be seen that if k is an even number, N_{wk} can be written as

N_{wk} = N_{w0} - \sum_{i=0}^{k/2} \rho^{2(i-1)} q_0 \big[(n_{2i-1}+1) - \rho(m_{2i-1}+1)\big],
N_{w(k+1)} - N_{wk} = \rho^{k} q_0 (m_{k}+1).

If k is an odd number, N wk can be written as

N_{wk} = N_{w0} - \sum_{i=0}^{(k-1)/2} \rho^{2(i-1)} q_0 \big[(n_{2i-1}+1) - \rho(m_{2i-1}+1)\big] - \rho^{k-1} q_0 (n_k+1),
N_{w(k+1)} - N_{wk} = \rho^{k} q_0 (m_{k+1}+1).

Since 0 < ρ < 1, it can be concluded that

\lim_{k\to\infty} \big(N_{w(k+1)} - N_{wk}\big) = 0.

From Eq. (33), a conclusion can be drawn:

Conclusion 1: The closed-loop watermarking system shown in Fig. 2 converges.

Proof: The system can be divided into two processes. One decides whether the embedded watermark is invisible, namely the P1 process. The other decides whether the embedded watermark meets the requirements of the application, namely the P2 process. The embedded watermark intensity satisfies N_w ∈ (0, N × N), where the size of the image is N × N.

Step 1: Select the initial watermark embedding intensity randomly. Suppose the initial step size for modification of the embedded intensity is q_0. Set q_0 = ρN_{w0}, where 0 < ρ < 1, and set k = 1.

Step 2: If the P2 process does not meet the requirements, modify the embedded intensity: N_{wk+1} = N_{wk} + ρq_k. Test the P2 process again; if it still does not meet the requirements, modify the embedded intensity again. Suppose the final embedded intensity N_{wk+1} = N_{wk} + ρn_k q_k satisfies the P2 process, where n_k is an integer. At this point the P1 process may no longer be satisfied.

Step 3: If the P1 process does not meet the requirements, decrease the intensity with a smaller step. Set q_{k+1} = ρq_k and N_{wk+1} = N_{wk} + ρn_k q_k − ρ²q_k. Test the P1 process again. Suppose that after m_k steps the P1 process meets the requirements; then N_{wk+1} = N_{wk} + ρ(n_k q_k − ρm_k q_k), with m_k < ⌈1/ρ⌉, where ⌈·⌉ is the ceiling function.

Step 4: Repeat Step 2 and Step 3 until both the P1 process and the P2 process meet requirements. So it will be that

N_{w(k+1)} = N_{wk} + \rho(n_k q_k - \rho m_k q_k) = N_{w0} + (n_0 - \rho m_0)q_0 + \rho(n_1 - \rho m_1)q_1 + \cdots + \rho(n_k - \rho m_k)q_k
= N_{w0} + \big[(n_0 - \rho m_0) + \rho^2 (n_1 - \rho m_1) + \cdots + \rho^{2k}(n_k - \rho m_k)\big]\, q_0.

Set \max_k (n_k - \rho m_k) = u. Then

N_{w(k+1)} - N_{wk} = \rho^{2k}(n_k - \rho m_k)\, q_0 < \rho^{2k} u\, q_0 \to 0.

It can be concluded that N_{w(k+1)} − N_{wk} → 0 as k → ∞. The closed-loop system converges.

Step 5: Perform the inverse Tschebycheff transform to retrieve the watermarked image.
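The closed-loop search can be illustrated with a toy simulation in which the visibility test (PSNR ≥ PSNR_0) and the detection test (p_d, p_f requirements) are replaced by hypothetical thresholds V and D on the intensity itself; the geometric shrinking of the step, q_k = ρq_{k−1}, is what forces convergence. Everything below (function name, threshold values) is our own illustration, not the paper's implementation:

```python
def closed_loop_intensity(Nw0, rho=0.5, V=7.0, D=3.0, iters=60):
    """Toy closed loop: 'invisible' for Nw <= V, 'detectable enough' for Nw >= D."""
    q = rho * Nw0                 # initial step q0 = rho * Nw0
    Nw = Nw0
    for _ in range(iters):
        while Nw < D:             # P2 process: raise intensity until detection is met
            Nw += rho * q
        while Nw > V:             # P1 process: lower intensity until it is invisible
            Nw -= rho * q
        q *= rho                  # shrink the step; updates decay like rho**k
    return Nw

Nw = closed_loop_intensity(Nw0=10.0)
assert 3.0 <= Nw <= 7.0           # both requirements met after convergence
```

Since each correction is bounded by a geometric factor of the previous one, the successive intensities form a Cauchy sequence, mirroring the limit N_{w(k+1)} − N_{wk} → 0 proved above.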

4.2 Watermark detection

Many existing watermark detectors assume that a portion of the watermark is known in advance. In this paper, the detector is assumed to have no information regarding the watermark or the attacks. The watermark detection process first decides whether the watermark is present by dual-channel detection. If the watermark is detected, ICA is utilized to extract it blindly. The extraction steps of the proposed watermarking system are described as follows:

Step 1: Dual-channel detection. The proposed dual-channel watermark detector is shown in Fig. 3. The decision logic of the dual-channel detector is as follows:

  • If both the SNR and the high-order spectra are large enough, both the likelihood channel and the high-order spectra channel will detect the watermark.
  • If the high-order spectra are small while the SNR is large enough, the likelihood channel will detect the watermark while the high-order spectra channel will not detect the watermark.
  • If the SNR is small while the high-order spectra are large enough, the high-order spectra channel will detect the watermark while the likelihood channel will not detect the watermark.
  • If both the SNR and the high-order spectra are small, the likelihood channel and the high-order spectra channel will not detect the watermark.
 

Fig. 3 Watermark detection with dual-channel detection.


Step 2: Watermark extraction. If a watermark is detected, ICA is utilized to extract the watermark blindly. The ICA process is the core of the watermark detector accomplished by the FASTICA algorithm [16]. Before applying the ICA model, it is helpful to conduct preprocessing such as centering and whitening to simplify the ICA process. The process of the watermark extractor is described in detail as follows:

  • Preprocessing of the test image by centering and whitening. The observed variable x is centered by subtracting the mean vector m = E{x}; this makes x a zero-mean variable and simplifies the ICA algorithm. After estimating the mixing matrix A with the centered data, the estimation is completed by adding the mean vector of the original source signal back to the centered estimates of the source data. The other preprocessing step whitens the observed variables. Whitening transforms the variable x linearly so that the new variable x̃ is white, i.e., its components are uncorrelated and their variances equal unity. Whitening can be computed by the eigenvalue decomposition of the covariance matrix E{xxᵀ} = EDEᵀ, where E is the orthogonal matrix of eigenvectors of E{xxᵀ} and D is the diagonal matrix of its eigenvalues. Note that E{xxᵀ} can be estimated in a standard way from the available samples of x.
  • Perform ICA on the signal that has been centered and whitened; that is, find the separating vector L:
  • Choose an initial (e.g., random) weight vector L; let L⁺ = E{yG(Lᵀy)} − E{G′(Lᵀy)}L, L = L⁺/∥L⁺∥, where E(·) is the expectation operator, G(·) is a nonlinear function with derivative G′(·), and the following choices of G(·) have proved very useful: G₁(u) = tanh(a₁u), G₂(u) = u exp(−u²/2). If the difference between iterative results is less than the threshold, that is, ∣L⁺ − L∣ < ε, the process has converged and the cycle terminates; otherwise, go back to (2) until the result converges. The threshold ε can be defined by the user; ε = 10⁻⁶ is used in our experiments. If the result has still not converged after 3000 cycles, the process is forced to terminate and it is concluded that there is no independent component in the corrupted watermarked image.
  • If there are multiple watermarks in the tested image, the extracted watermark must be subtracted before extracting the next one.

Step 3: Recover the exact watermark by the secret key used in the watermark embedding process.
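The one-unit fixed-point iteration described above can be sketched as follows. This is a self-contained toy with two synthetic sources (a binary, watermark-like signal and a uniform signal) and an assumed mixing matrix, not the full watermark extractor; it performs the centering, whitening, and the update L⁺ = E{yG(Lᵀy)} − E{G′(Lᵀy)}L with G = tanh:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
s = np.vstack([rng.choice([-1.0, 1.0], n), rng.uniform(-1, 1, n)])  # independent sources
A = np.array([[2.0, 1.0], [1.0, 1.5]])                              # assumed mixing matrix
xobs = A @ s

# centering and whitening via eigenvalue decomposition of the covariance
xobs = xobs - xobs.mean(axis=1, keepdims=True)
eigval, E = np.linalg.eigh(np.cov(xobs))
y = E @ np.diag(eigval**-0.5) @ E.T @ xobs      # now E{y y^T} = I

# one-unit FastICA fixed point with G = tanh, G' = 1 - tanh^2
L = rng.standard_normal(2)
L /= np.linalg.norm(L)
for _ in range(200):
    u = L @ y
    Lp = (y * np.tanh(u)).mean(axis=1) - (1 - np.tanh(u)**2).mean() * L
    Lp /= np.linalg.norm(Lp)
    done = np.abs(np.abs(Lp @ L) - 1) < 1e-10   # converged up to sign
    L = Lp
    if done:
        break

# the recovered component should match one of the sources up to sign/scale
est = L @ y
corr = max(abs(np.corrcoef(est, s[0])[0, 1]), abs(np.corrcoef(est, s[1])[0, 1]))
assert corr > 0.9
```

In the watermarking setting, the rows of the mixture would be the test image and side information rather than synthetic signals, and deflation would remove each extracted component before the next is sought.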

5. Performance analysis of proposed watermarking process

The performance of the watermarking includes invisibility, robustness, and correct detection, which are measured in this paper by the PSNR between the watermarked and original images and by the probabilities of detection and false alarm. The performance analysis of the watermark is done by calculating the probabilities of false alarm and detection based on the distribution of the likelihood function, and the result is used to control the embedding process.

5.1 Likelihood channel detection

To do the performance analysis, the detection process can be formulated as a binary hypothesis test.

H_0: Y(i,j) = X(i,j), \quad \text{watermark absent}, \qquad H_1: Y(i,j) = X(i,j) + W(i,j), \quad \text{watermark present}.

In order to compute the probabilities of false alarm and detection, a normalized correlation (NC) sim(w, w′) between the original watermark w(n) and the extracted watermark w′(n) is used as the likelihood function, expressed as

\mathrm{sim}(w,w') = \frac{\sum w(n)\,w'(n)}{\sqrt{\sum w^2(n)\,\sum w'^2(n)}}.

If the likelihood function exceeds the decision threshold η, H_1 is accepted. Define the probability of detection as

p_{dE} = \mathrm{Prob}\{\mathrm{sim}(w,w') \ge \eta \mid H_1\},

where η is the decision threshold. Define the probability of false alarm as

p_{fE} = \mathrm{Prob}\{\mathrm{sim}(w,w') \ge \eta \mid H_0\}.

Suppose w(n) and w′(n) take the values 1 or −1; then w²(n) = w′²(n) = 1. Denote k(n) = w(n)w′(n); then sim(w, w′) can be rewritten as

\mathrm{sim}(w,w') = \frac{\sum w(n)\,w'(n)}{N_w} = \frac{\sum k(n)}{N_w}.

Then Eq. (39) can be rewritten as

p_{fE} = P\big(\mathrm{sim}(w,w') > \eta \mid H_0\big) = P\Big(\sum k(n) > N_w \eta \,\Big|\, H_0\Big).

As k(n) ∈ {−1, 1}, Σk(n) can only take discrete values from the set {−N_w, −N_w + 2, −N_w + 4, …, N_w − 4, N_w − 2, N_w}. Eq. (41) can be rewritten as

p_{fE} = P\Big(\sum k(n) > N_w\eta \,\Big|\, H_0\Big) = \sum_{m = \lceil N_w(\eta+1)/2 \rceil}^{N_w} P\Big(\sum k(n) = -N_w + 2m \,\Big|\, H_0\Big),

where P(Σk(n) = −N_w + 2m | H_0) is the probability that the series {k(n)} contains m ones and N_w − m negative ones. Suppose the probability of k(n) = −1 is p_0; then

P\Big(\sum k(n) = -N_w + 2m \,\Big|\, H_0\Big) = \frac{N_w!}{m!\,(N_w-m)!}\, p_0^{\,N_w - m} (1 - p_0)^m.

If no watermark is present in the test image, the extracted watermark consists of a series of random, independent, equally probable values from the set {−1, 1}; thus p_0 = 0.5. It can be deduced that

p_{fE} = \sum_{m=\lceil N_w(\eta+1)/2 \rceil}^{N_w} \frac{N_w!}{m!\,(N_w-m)!}\, 0.5^{N_w}.

For p_{fE}, a conclusion can be drawn:

Conclusion 2: Increasing the decision threshold η will decrease p_{fE}. Increasing the intensity of the watermark N_w will decrease p_{fE} for fixed η.

Proof: Mathematical induction is used to prove it. Define a function as

F(N_w) = \sum_{m=\lceil (N_w+1)(\eta+1)/2 \rceil}^{N_w+1} \frac{(N_w+1)!}{m!\,(N_w+1-m)!}\, 0.5^{N_w+1} - \sum_{m=\lceil N_w(\eta+1)/2 \rceil}^{N_w} \frac{N_w!}{m!\,(N_w-m)!}\, 0.5^{N_w}.

When N_w = 2, F(2) = 0.25 − 0.5 < 0. Suppose that for N_w = k,

F(k) = \sum_{m=\lceil (k+1)(\eta+1)/2 \rceil}^{k+1} \frac{(k+1)!}{m!\,(k+1-m)!}\, 0.5^{k+1} - \sum_{m=\lceil k(\eta+1)/2 \rceil}^{k} \frac{k!}{m!\,(k-m)!}\, 0.5^{k} < 0.

When Nw = k + 1

F(k+1) = \sum_{m=\lceil (k+2)(\eta+1)/2 \rceil}^{k+2} \frac{(k+2)!}{m!\,(k+2-m)!}\, 0.5^{k+2} - \sum_{m=\lceil (k+1)(\eta+1)/2 \rceil}^{k+1} \frac{(k+1)!}{m!\,(k+1-m)!}\, 0.5^{k+1}
= \sum_{m=\lceil (k+2)(\eta+1)/2 \rceil}^{k+1} \frac{(k+2)!}{m!\,(k+2-m)!}\, 0.5^{k+2} - \sum_{m=\lceil (k+1)(\eta+1)/2 \rceil}^{k} \frac{(k+1)!}{m!\,(k+1-m)!}\, 0.5^{k+1}
< \sum_{m=\lceil (k+1)(\eta+1)/2 \rceil}^{k+1} \frac{(k+1)!}{m!\,(k+1-m)!}\cdot\frac{k+1}{k+1-m}\, 0.5^{k+2} - \sum_{m=\lceil k(\eta+1)/2 \rceil}^{k} \frac{k!}{m!\,(k-m)!}\cdot\frac{k+1}{k+1-m}\, 0.5^{k+1}
< \sum_{m=\lceil (k+1)(\eta+1)/2 \rceil}^{k+1} \frac{(k+1)!}{m!\,(k+1-m)!}\, 0.5^{k+1} - \sum_{m=\lceil k(\eta+1)/2 \rceil}^{k} \frac{k!}{m!\,(k-m)!}\, 0.5^{k} < 0.

So it can be concluded that if the decision threshold η is fixed, increasing the embedded intensity N_w decreases the probability of false alarm p_{fE}.

The detection probability p_{dE} can be deduced from the central limit theorem. Under hypothesis H_0, the mean of ww′ is E(ww′) = 0 and the variance is D(ww′) = 1/9. It follows that Z = \sum_{i=1}^{N_w} ww' \big/ \sqrt{N_w/9} = N_w\,\mathrm{sim}(w,w') \big/ \sqrt{N_w/9} \sim N(0,1); that is, Z obeys the normal distribution N(0,1), and hence sim(w,w′) obeys N\big(0,\, 1/(9N_w)\big). Under hypothesis H_1, the mean of ww′ is E(ww′) = 1/3 and the variance is D(ww′) = 4/45, so z = \sum_{i=1}^{N_w}\big(ww' - \tfrac13\big) \big/ \sqrt{4N_w/45} = \big(N_w\,\mathrm{sim}(w,w') - \tfrac{N_w}{3}\big) \big/ \sqrt{4N_w/45} \sim N(0,1). It follows that sim(w,w′) obeys the normal distribution N\big(\tfrac13,\, 4/(45N_w)\big). p_{dE} is defined as

p_{dE} = \mathrm{Prob}\{\mathrm{sim}(w,w') \ge \eta \mid H_1\} = \int_{\eta}^{\infty} \frac{1}{\sqrt{8\pi/(45N_w)}}\, \exp\!\left[-\frac{(t - 1/3)^2}{8/(45N_w)}\right] dt.

It can be seen that for a fixed threshold η < 1/3, increasing N_w will increase p_{dE}.
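Both conclusions can be checked numerically from the closed forms, using the exact binomial sum for p_{fE} and the normal approximation N(1/3, 4/(45N_w)) for p_{dE} (the function names are ours):

```python
from math import comb, erf, sqrt, ceil

def p_false_alarm(Nw, eta):
    """Exact p_fE = sum_{m >= Nw(eta+1)/2} C(Nw, m) * 0.5**Nw."""
    m0 = ceil(Nw * (eta + 1) / 2)
    return sum(comb(Nw, m) for m in range(m0, Nw + 1)) * 0.5**Nw

def p_detect(Nw, eta):
    """p_dE under sim ~ N(1/3, 4/(45*Nw)): upper-tail probability above eta."""
    sigma = sqrt(4 / (45 * Nw))
    return 0.5 * (1 - erf((eta - 1/3) / (sigma * sqrt(2))))

# Conclusion 2: raising eta, or raising Nw at fixed eta, lowers the false alarm rate
assert p_false_alarm(50, 0.4) < p_false_alarm(50, 0.2)
assert p_false_alarm(100, 0.2) < p_false_alarm(50, 0.2)
# and for eta < 1/3, a larger Nw raises the detection probability
assert p_detect(100, 0.2) > p_detect(50, 0.2)
```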

5.2 High-order spectra channel detection

Since a traditional likelihood detector is sensitive to noise, another watermark detection channel based on high-order spectra detection is proposed to suppress noise. The detector model utilizing a bispectrum can be expressed as a hypothesis test, too:

H_0: B_y(w_1,w_2) = B_x(w_1,w_2), \quad \text{watermark absent}, \qquad H_1: B_y(w_1,w_2) = B_w(w_1,w_2) + B_x(w_1,w_2), \quad \text{watermark present},

where B_y(w_1, w_2), B_x(w_1, w_2), and B_w(w_1, w_2) are the bispectra of the test image, the original image, and the watermark, respectively. Even with a small SNR, a satisfactory detection probability can be obtained if there is enough bispectrum information. For large N (e.g., N > 256), the statistic can be constructed as

T = \frac{2P\,\big|B^{(N)}(w_i,w_j)\big|^2}{\left[\dfrac{N}{KL^2}\, p_y(w_i)\, p_y(w_j)\, p_y(w_i + w_j)\right]^{1/2}}.

Here B^{(N)}(w_i, w_j) is the bispectrum estimated from N points, and p_y(w) is the power spectrum of the test image. The data are divided into K segments, each containing M points. The statistic obeys a Gaussian distribution [17]. The conditional probability density of {T(i)} can be expressed as

p(T|H_0) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left[-\sum_{i=1}^{N}\frac{(T(i)-\mu_0)^2}{2\sigma^2}\right], \qquad p(T|H_1) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left[-\sum_{i=1}^{N}\frac{(T(i)-\mu_1)^2}{2\sigma^2}\right].

Here μ_1 and μ_0 are the means under H_1 and H_0, and σ² is the variance. The likelihood ratio can be constructed as

\lambda(T) = \frac{p(T|H_1)}{p(T|H_0)} = \exp\!\left[\frac{\mu_1 - \mu_0}{\sigma^2}\sum_{i=1}^{N} T(i) - \frac{N(\mu_1^2 - \mu_0^2)}{2\sigma^2}\right] \underset{H_0}{\overset{H_1}{\gtrless}} \eta.

In order to calculate the probabilities efficiently, the likelihood ratio is rewritten in logarithmic form:

\sum_{i=1}^{N} T(i) \underset{H_0}{\overset{H_1}{\gtrless}} \frac{\sigma^2}{\mu_1 - \mu_0}\left[\ln\eta + \frac{N(\mu_1^2 - \mu_0^2)}{2\sigma^2}\right].

Set \tilde T = \frac{1}{N}\sum_{i=1}^{N} T(i) \underset{H_0}{\overset{H_1}{\gtrless}} \frac{\sigma^2 \ln\eta}{N(\mu_1 - \mu_0)} + \frac{\mu_1 + \mu_0}{2} \triangleq \tilde T_c. The false alarm probability can be expressed as

p_{fB} = \int_{\eta}^{\infty} p(\tilde T \mid H_0)\, d\tilde T = \int_{\eta}^{\infty} \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left[-\sum_{i=1}^{N}\frac{(\tilde T(i)-\mu_0)^2}{2\sigma^2}\right] d\tilde T.

The detection probability pdB can be expressed as

p_{dB} = \int_{\eta}^{\infty} p(\tilde T \mid H_1)\, d\tilde T = \int_{\eta}^{\infty} \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left[-\sum_{i=1}^{N}\frac{(\tilde T(i)-\mu_1)^2}{2\sigma^2}\right] d\tilde T.

Since the estimates of the bispectrum of the received image are independent, by the central limit theorem the bispectrum likelihood statistic can be approximated by a normal distribution N\big(\sqrt{4N-1},\, 1\big) under H_0. Under H_1, it can be approximated by a normal distribution N\big(\sqrt{4N-1+2NB^2},\, (N+B)/N\big). Set p_{fB} = α:

p_{fB} = 1 - \mathrm{erf}\!\left(\sqrt{2\eta} - \sqrt{4N - 1}\right) = \alpha.

So η can be determined by the required pfB

\eta = \frac{1}{2}\left(\mathrm{erf}^{-1}(1 - \alpha) + \sqrt{4N - 1}\right)^2.

pdB can be expressed as

p_{dB} = 1 - \mathrm{erf}\!\left(\sqrt{2\eta} - \sqrt{\frac{N B^2}{N + B}}\right).

From the computation of the false alarm and detection probabilities, it can be seen that they do not depend on the SNR; even at a very low SNR, the detector performs well.
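Under these normal approximations, the threshold η and p_dB follow mechanically from the target false alarm rate α. The sketch below uses only the standard library (erf⁻¹ is obtained by bisection); the p_dB expression follows the hedged reading of the equation above and is clamped to [0, 1]:

```python
import math

def erf_inv(y, lo=-6.0, hi=6.0, iters=80):
    """Invert math.erf by bisection (erf is strictly increasing)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if math.erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def bispectrum_threshold(alpha, N):
    """eta = (1/2) * (erf^-1(1 - alpha) + sqrt(4N - 1))^2."""
    return 0.5 * (erf_inv(1.0 - alpha) + math.sqrt(4.0 * N - 1.0)) ** 2

def detection_probability(eta, N, B):
    """p_dB = 1 - erf(sqrt(2*eta) - sqrt(N*B^2/(N + B))), clamped to [0, 1]."""
    p = 1.0 - math.erf(math.sqrt(2.0 * eta) - math.sqrt(N * B * B / (N + B)))
    return max(0.0, min(1.0, p))

# Choosing alpha fixes eta; the false alarm formula then recovers alpha.
eta = bispectrum_threshold(0.01, 256)
p_f = 1.0 - math.erf(math.sqrt(2.0 * eta) - math.sqrt(4.0 * 256 - 1.0))
```

Note that p_dB grows with the bispectrum term B rather than with the SNR, which is the point made in the text above.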

5.3 Dual-channel detection

The detection probability pD of the dual channel is:

p_D = p_{dE} + p_{dB} - p_{dE}\, p_{dB}.
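This is simply the probability that at least one of the two independent channels detects the watermark. A minimal illustration with hypothetical channel probabilities:

```python
def dual_channel_pd(p_de, p_db):
    """Combined detection probability of two independent channels:
    the watermark is detected if either channel fires."""
    return p_de + p_db - p_de * p_db

# Two moderately reliable channels combine into a stronger detector.
p_combined = dual_channel_pd(0.9, 0.8)  # -> 0.98
```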

6. Computational aspects of the watermarking system

The symmetry property can be used to considerably reduce the time required for computing the Tschebycheff moments. This property suggests subdividing the domain of an N × N image (N even) into four equal parts and computing the polynomials only in the first quadrant, where 0 ≤ x, y ≤ (N/2 − 1). If N is odd, the image can be zero-padded to achieve an even N. The Tschebycheff moments can then be modified with the help of Eq. (1):

T_{pq} = \frac{1}{\tilde{\rho}(p,N)\,\tilde{\rho}(q,N)} \sum_{x=0}^{N/2-1} \sum_{y=0}^{N/2-1} \tilde{t}_p(x)\, \tilde{t}_q(y) \left\{ f(x,y) + (-1)^p f(N-1-x,\, y) + (-1)^q f(x,\, N-1-y) + (-1)^{p+q} f(N-1-x,\, N-1-y) \right\}.

Set β(n,N) = ρ(n,N) to further simplify the computation. The direct calculation of a two-dimensional Tschebycheff moment of order p + q from Eq. (1) requires \frac{1}{6}N(p+q+2)(p+q+1)(4(p+q)+3N-9) multiplications and \frac{1}{6}(p+q+2)(p+q+1)(2N(p+q)+3N^2-6N-9) additions. The term \tilde{\rho}(n,N) needs to be calculated at most once per moment. A Tschebycheff moment of order p + q then requires \frac{1}{24}(p+q+2)(p+q+1)((p+q)^2+7(p+q)+24) multiplications and \frac{1}{3}(p+q)^3+(p+q)^2+\frac{2}{3}(p+q) additions.
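The quadrant folding in the symmetric moment formula can be sketched as follows. The polynomial factors are omitted here (t̃_p ≡ 1), so for p = q = 0 the folded sum must reproduce the plain sum over the whole image; this is an illustrative sketch, not the paper's implementation:

```python
import numpy as np

def fold_image(f):
    """Split an N x N image (N even) into the four reflected quadrant blocks
    f(x,y), f(N-1-x,y), f(x,N-1-y), f(N-1-x,N-1-y) over the first quadrant."""
    N = f.shape[0]
    h = N // 2
    a = f[:h, :h]                      # f(x, y)
    b = f[N-1:h-1:-1, :h]              # f(N-1-x, y)
    c = f[:h, N-1:h-1:-1]              # f(x, N-1-y)
    d = f[N-1:h-1:-1, N-1:h-1:-1]      # f(N-1-x, N-1-y)
    return a, b, c, d

def folded_sum(f, p, q):
    """Inner sum of the symmetric moment formula with the (-1)^p, (-1)^q signs
    (polynomial factors dropped), computed over the first quadrant only."""
    a, b, c, d = fold_image(f)
    return float((a + (-1)**p * b + (-1)**q * c + (-1)**(p + q) * d).sum())
```

For an image symmetric in x, every odd-p folded sum vanishes, which is the cancellation the sign pattern encodes.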

Because the bispectrum is third order in the Fourier transform of the intensity, and thus sixth order in the electric field of the source, it is extremely demanding of computational resources. To avoid the complexity of computing and interpolating the bispectrum of two-dimensional data, a simple and effective calculation method is introduced. Let f(x), x = (x, y), x, y = 0, 1, …, N − 1, be the 2D digital image data of N × N pixels. The Fourier transform F(w_1) of the image data is calculated with a 2N × 2N point FFT using the Gaussian window exp(−(x − μ)·(x − μ)/(2σ²)), μ = (N/2, N/2), σ = N/3, with zero padding outside the N × N data. For each frequency pair (w_1, w_2), the triple product F(w_1)F(w_2)F(−w_1 − w_2) is calculated. [The value of F(−w_1 − w_2) is obtained by bilinear interpolation.] The calculation runs in O(N^4) time and O(N^2) space.
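The windowing, zero-padded FFT, and triple product described above can be sketched as follows. As a simplification, a nearest-grid lookup with the conjugate identity F(−w) = F*(w) for real images stands in for the bilinear interpolation of F(−w_1 − w_2):

```python
import numpy as np

def windowed_fft(f):
    """2N x 2N FFT of a Gaussian-windowed N x N image, zero-padded outside the data."""
    N = f.shape[0]
    x = np.arange(N)
    mu, sigma = N / 2.0, N / 3.0
    w = np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
    g = f * np.outer(w, w)                 # separable 2D Gaussian window
    return np.fft.fft2(g, s=(2 * N, 2 * N))  # zero-padding to 2N x 2N

def bispectrum_point(F, w1, w2):
    """Triple product F(w1) F(w2) F(-w1-w2) at integer frequency pairs,
    using F(-w) = conj(F(w)) for a real-valued image."""
    M = F.shape[0]
    i1, j1 = w1
    i2, j2 = w2
    return F[i1 % M, j1 % M] * F[i2 % M, j2 % M] * np.conj(F[(i1 + i2) % M, (j1 + j2) % M])
```

Evaluating the triple product over all O(N^4) frequency pairs, while storing only the 2N × 2N transform, matches the stated O(N^4) time and O(N^2) space costs.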

7. Experimental results

The normalized correlation (NC) between the embedded watermark w(n) and the extracted watermark w′(n) quantifies their similarity: the higher the NC, the more similar the extracted watermark is to the original watermark. The NC is defined as

NC = \frac{\sum_{i,j} w_{ij}\, w'_{ij}}{\sum_{i,j} (w_{ij})^2}.
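The NC is a one-line computation; a minimal sketch:

```python
import numpy as np

def normalized_correlation(w, w_ext):
    """NC between the embedded watermark w and the extracted watermark w'."""
    w = np.asarray(w, dtype=float)
    w_ext = np.asarray(w_ext, dtype=float)
    return float((w * w_ext).sum() / (w ** 2).sum())

# A perfectly extracted bipolar watermark gives NC = 1; a sign flip gives -1.
nc_same = normalized_correlation([1, -1, 1, 1], [1, -1, 1, 1])    # -> 1.0
nc_flip = normalized_correlation([1, -1, 1, 1], [-1, 1, -1, -1])  # -> -1.0
```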

Set PD ≥ 0.96, PF ≤ 0.01. All attacks are performed by the popular watermark benchmark, StirMark.

7.1 Experimental results with rotation angle estimation

Figure 4 compares the rotation-angle estimation results with those of Ref. [3], where ‘*’ marks the results of our method and ‘Δ’ marks the results of Ref. [3]. The x-axis shows the real rotation angles and the y-axis the estimated rotation angles. Our estimates are clearly closer to the real rotation angles.

7.2 Experimental results with scaling factor estimation

Figure 5 compares the scaling-factor estimation results with those of Ref. [3], where ‘Δ’ marks the results of Ref. [3] and ‘*’ marks the results of our proposed method. The x-axis shows the real scaling factors and the y-axis the estimated scaling factors. The comparison shows that our method performs better.

Fig. 4. Results of rotation angle estimation.


Fig. 5. Results of scaling factor estimation and comparison.

7.3 Experimental results with translation parameters estimation

The translation parameters estimation results are listed in Table 2.

Table 2. Translation parameter estimation (translations produced by StirMark)

7.4 Experimental results with rotation angle and scaling factor combined estimation

Regardless of the order in which rotation and scaling distortions are applied to the watermarked images, the geometric distortion parameters are estimated in the same way; the estimation results are listed in Table 3.

Table 3. Rotation angle and scaling factor estimation in different domains

7.5 Experimental results with translation parameter and scaling factor estimation

Regardless of the order in which translation and scaling distortions are applied to the watermarked image, the translation parameters and scaling factors are estimated with the Tschebycheff moments of the original image. The estimation results are listed in Table 4.

Table 4. Translation parameter and scaling factor estimation (distortions produced by StirMark)

7.6 Robustness against additive noise

Generally, the lower-order Tschebycheff moments of an image capture its low spatial-frequency features, while the higher-order moments capture its high-frequency features. Since noise appears mainly in the high-frequency features of an image, reconstructing an image from only a partial set of its lower-order moments removes much of the noise. The experimental results in Table 5 show that even for watermarked images degraded by additive noise, including Gaussian, salt-and-pepper, and speckle noise, the detector still obtains a high NC. The NC with closed-loop embedding is achieved by modifying the watermark intensity according to the watermarking performance; the NC with open-loop embedding is obtained by embedding the watermark at the initial watermark intensity of the closed-loop watermarking system.

Table 5. Robustness against additive noise (where μ and σ² are the mean and variance, respectively)

7.7 Robustness against JPEG compression

JPEG compression attenuates the high-frequency components of an image. Experiments examined the robustness of the proposed watermarking technique to JPEG compression produced by StirMark with quality factors Q from 90 down to 10, for both closed-loop and open-loop watermark embedding. Table 6 lists the results.

Table 6. Robustness against JPEG compression produced by StirMark

7.8 Robustness against other attacks performed by StirMark

Experiments were performed to test robustness against the attacks produced by StirMark, with both closed-loop and open-loop embedding. The experimental results are listed in Table 7.

Table 7. Robustness against attacks produced by StirMark

7.9 Performance of the proposed watermarking detector

The average false alarm probability and detection probability, measured over experiments with 1000 embedded watermarks, are listed in Table 8.

Table 8. The probabilities of the detector

7.10 Experimental results comparison

Experimental results are compared with Ref. [6] and with commercial watermarking techniques. Blank squares in the table indicate that no results are available for that technique.

Table 9. Experimental results compared to Ref. [6] and two commercial watermarking techniques

7.11 Closed-loop watermark embedding experimental results

The PSNR and the embedded intensity of the traditional (open-loop) and closed-loop watermarking processes are listed in Table 10. The experimental results show that the embedded intensity of closed-loop watermarking is much greater than that of traditional watermarking.

Table 10. Experimental data

8. Conclusions

A novel geometric-invariant blind watermarking technique has been proposed based on Tschebycheff moments, which estimate the geometric distortion parameters of a corrupted watermarked image from the Tschebycheff moments of the original image. The Tschebycheff moments of the original image can be used as a private key for the watermark extraction process. The watermark is generated randomly, independent of the original image, and is embedded by modifying the Tschebycheff moments of the original image. The embedding intensity of the watermark is modified according to the results of performance analysis so as to obtain optimal watermark embedding; the initial embedding intensity is selected randomly, and the convergence of the closed-loop technique is proved. An optimal blind watermarking detector is proposed based on dual-channel detection: likelihood detection and high-order spectra detection are utilized by the detector to increase the credibility of the watermarking system. Independent component analysis is utilized to extract the watermarks blindly, not merely to detect them. The mathematical description is given and simulations are performed. The proposed approach withstands rotation, scaling, and translation by estimating the parameters of these transformations, which avoids an exhaustive search for the embedded watermark in a complicated multidimensional space. The computational aspects of the proposed watermarking system are also described. Experimental results show that the proposed image-watermarking technique is robust against the watermark benchmark StirMark.

Acknowledgments

This work was partly supported by the National Natural Science Foundation of China under grant 60502027.

References and links

1. S. Voloshynovskiy, S. Pereira, T. Pun, J. J. Eggers, and J. K. Su, “Attacks on digital watermarks: classification, estimation-based attacks, and benchmarks,” IEEE Commun. Mag. 8, 2–10 (2001).

2. L. Zhang, G.-B. Qian, X.W.-W. Xiao, and Z. Ji, “Geometric invariant blind image watermarking by invariant Tschebycheff moments,” Opt. Express 15, 2251–2261 (2007).

3. P. Dong, J. G. Brankov, N. P. Galatsanos, Y. Yang, and F. Davoine, “Digital watermarking robust to geometric distortions,” IEEE Trans. Image Process. 14, 2140–2150 (2005).

4. M. Alghoniemy and A. H. Tewfik, “Geometric invariants in image watermarking,” IEEE Trans. Image Process. 13, 145–153 (2004).

5. Y. Xin, S. Liao, and M. Pawlak, “A multibit geometrically robust image watermark based on Zernike moments,” in International Conference on Pattern Recognition 4, 861–864 (2004).

6. S. Pereira and T. Pun, “Robust template matching for affine resistant image watermarks,” IEEE Trans. Image Process. 9, 1123–1129 (2000).

7. M. Kutter, “Performance improvement of spread spectrum based image watermarking schemes through M-ary modulation,” Lect. Notes Comput. Sci. 1728, 238–250 (1999).

8. P. Dong and N. P. Galasanos, “Affine transform resistant watermarking based on image normalization,” in Proceedings of IEEE International Conference on Image Processing, 3, 489–492 (2002).

9. P. Bas, J.-M. Chassery, and B. Macq, “Geometrically invariant watermarking using feature points,” IEEE Trans. Image Process. 11, 1014–1028 (2002).

10. J. O’Ruanaidh and T. Pun, “Rotation, scale, and translation invariant spread spectrum digital image watermarking,” Signal Process. 66, 303–317 (1998).

11. H. S. Kim and H. K. Lee, “Invariant image watermark using Zernike moments,” IEEE Trans. Circuits Syst. Vid. Technol. 13, 766–775 (2003).

12. R. Mukundan, S. H. Ong, and P. A. Lee, “Image analysis by Tschebycheff moments,” IEEE Trans. Image Process. 10, 1357–1364 (2001).

13. R. Mukundan, “Some computational aspects of discrete orthonormal moments,” IEEE Trans. Image Process. 13, 1055–1059 (2004).

14. P.-T. Yap and R. Paramesran, “Local watermarks based on Krawtchouk moments,” in Proceedings of IEEE Region 10 Conference, pp. 73–76 (2002).

15. M. K. Hu, “Visual pattern recognition by moment invariants,” IRE Trans. Information Theory IT-8, 179–187, (1962).

16. A. Hyvarinen and E. Oja, “Independent component analysis: a tutorial,” in Notes for International Joint Conference on Neural Networks (1999), http://www.cis.hut.fi/projects/ica/.

17. G. Sundaramoorthy, M. R. Raghuveer, and S. A. Dianat, “Bispectral reconstruction of signals in noise: amplitude reconstruction issues,” IEEE Trans. Acoust. Speech Signal Process. 38, 1297–1300 (1990).
