## Abstract

We design receivers to detect a known pattern or reference signal in the presence of very general, non-Gaussian noise. Three sources of input-noise degradation are considered: additive, multiplicative, and disjoint background. The detection process involves two steps: (1) estimation of the relevant noise parameters within the framework of hypothesis testing and (2) maximization of a metric that measures the likelihood of the target being at a given location. The parameter estimation is carried out by moment-matching techniques. Because of the number of unknown parameters and because the various input-noise processes are non-Gaussian, the estimation methods differ from the standard approach of maximizing the likelihood function. To verify the presence of the target at a given location, we use an ${l}_{p}$-norm metric with $p \geqslant 0$ to measure the likelihood that the target is present at the location of interest. Computer simulations show that, for the images tested here, the receivers designed herein outperform some existing receivers.

© 2001 Optical Society of America

### Equations (36)


(1)
$$s(t)=g[r(t-{t}_{j})w(t-{t}_{j})],$$
(2)
$$s(t)={n}_{r}(t-{t}_{j})r(t-{t}_{j})w(t-{t}_{j})+{n}_{b}(t)\times [1-w(t-{t}_{j})]+{n}_{d}(t).$$
(3)
$$s(i)=[{n}_{r}(i-j)r(i-j)+{n}_{d}(i)]w(i-j)+[{n}_{b}(i)+{n}_{d}(i)][1-w(i-j)].$$
(4)
$${g}_{j}(i)=[{n}_{r}(i-j)r(i-j)+{n}_{d}(i)]w(i-j)+[{n}_{b}(i)+{n}_{d}(i)][1-w(i-j)],$$
(5)
$${e}_{j}={G}_{j}-E[{G}_{j}|S].$$
(6)
$$E|{e}_{j}(i){|}^{2}=E[|{G}_{j}(i)-s(i){|}^{2}].$$
(7)
$$\Vert C{\Vert}_{p}={\left[\frac{1}{J}\sum _{j}|{c}_{j}{|}^{p}\right]}^{1/p}.$$
(8)
$$\Vert C{\Vert}_{0}=\exp \left[\frac{1}{J}\sum _{j=0}^{J-1}\log (|{c}_{j}|)\right].$$
(9)
$${\lambda}_{j}=\sum _{i=1}^{M}\{E[|{G}_{j}(i)-s(i){|}^{2}]{\}}^{p},$$
(10)
$${\lambda}_{j}=\sum _{i=1}^{M}\log \{E[|{G}_{j}(i)-s(i){|}^{2}]\}.$$
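Eqs. (7)-(10) define a family of $l_p$ metrics, with the $p\to 0$ case reducing to the geometric-mean (log-average) form of Eq. (8). A minimal numpy sketch of this family (the function name `lp_norm` is ours, not the paper's):

```python
import numpy as np

def lp_norm(c, p):
    """Generalized l_p "norm" of Eq. (7): [ (1/J) * sum_j |c_j|^p ]^(1/p).

    For p = 0 we use the limit form of Eq. (8): the geometric mean of
    |c_j|, computed via the log-average for numerical stability.
    """
    c = np.asarray(c, dtype=float)
    if p == 0:
        return np.exp(np.mean(np.log(np.abs(c))))   # Eq. (8)
    return np.mean(np.abs(c) ** p) ** (1.0 / p)     # Eq. (7)
```

For example, `lp_norm([1, 2, 4], 0)` returns the geometric mean 2.0; plugging the expected squared errors of Eq. (6) into this family gives the metrics of Eqs. (9) and (10) up to monotone rescaling.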
(11)
$$s(i)=[{n}_{r}(i-j)r(i-j)+{n}_{d}(i)]w(i-j)+[{n}_{b}(i)+{n}_{d}(i)][1-w(i-j)]={G}_{j}(i),$$
(12)
$${n}_{1}(i)={n}_{r}(i)-1.$$
(13)
$$s(i)=[{n}_{1}(i-j)r(i-j)+r(i-j)+{n}_{d}(i)]w(i-j)+[{n}_{b}(i)+{n}_{d}(i)][1-w(i-j)].$$
(14)
$${\mathit{SW}}_{j}=\{s(i){\}}_{[i:w(i-j)=1]},$$
(15)
$${\mathit{SW}}_{j}^{2}=\{{s}^{2}(i){\}}_{[i:w(i-j)=1]},$$
(16)
$${\mathit{XW}}_{j}=\{r(i-j)+r(i-j)w(i-j){n}_{1}+{n}_{d}{\}}_{[i:w(i-j)=1]},$$
(17)
$$E(\mathit{XW}{)}_{j}=\{r(i-j)+r(i-j)w(i-j){m}_{1}+{m}_{d}{\}}_{[i:w(i-j)=1]},$$
(18)
$$E[({\mathit{XW}}_{j}{)}^{2}]=\{[r(i-j)+{m}_{1}r(i-j)+{m}_{d}{]}^{2}+{\sigma}_{1}^{2}{r}^{2}(i-j)+{\sigma}_{d}^{2}{\}}_{[i:w(i-j)=1]},$$
(19)
$$\left[\begin{array}{cc}\Vert r{\Vert}_{2}^{2}& \overline{r}\\ \overline{r}& {n}_{w}\end{array}\right]\left[\begin{array}{c}{\hat{m}}_{1}(j)\\ {\hat{m}}_{d}(j)\end{array}\right]=\left[\begin{array}{c}{\tilde{s}}_{j}\circledast \mathit{wr}(j)\\ {\tilde{s}}_{j}\circledast w(j)\end{array}\right],$$
(20)
$${\tilde{s}}_{j}(i)=[s(i)-r(i-j)]w(i-j),$$
(21)
$$\overline{r}=\sum _{i:r(i)\ne 0}r(i)\hspace{1em}\hspace{1em}\Vert r{\Vert}_{p}={\left[\sum _{i:r(i)\ne 0}{\left|r(i)\right|}^{p}\right]}^{1/p},$$
(22)
$${\tilde{s}}_{j}\circledast \mathit{wr}(j)=\sum _{i:w(i-j)=1}{\tilde{s}}_{j}(i)r(i-j)w(i-j),$$
(23)
$${\tilde{s}}_{j}\circledast w(j)=\sum _{i:w(i-j)=1}{\tilde{s}}_{j}(i)w(i-j).$$
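Eqs. (19)-(23) amount to a 2×2 normal-equation solve at each shift $j$ for the mean estimates. A sketch with hypothetical 1-D arrays `s`, `r`, `w`, assuming the window support coincides with the reference support and fits inside the scene at shift `j` (names are ours):

```python
import numpy as np

def estimate_means(s, r, w, j):
    """Solve the 2x2 system of Eq. (19) for (m1_hat, md_hat) at shift j.

    s: observed scene, r: reference, w: 0/1 window, all 1-D numpy arrays.
    """
    idx = np.nonzero(w)[0]            # positions i - j where w = 1
    s_tilde = s[idx + j] - r[idx]     # Eq. (20) restricted to the window
    r_w = r[idx]
    r_bar = r_w.sum()                 # r-bar of Eq. (21)
    gram = np.array([[np.sum(r_w ** 2), r_bar],
                     [r_bar, idx.size]])        # left-hand matrix, Eq. (19)
    rhs = np.array([np.sum(s_tilde * r_w),      # correlation of Eq. (22)
                    np.sum(s_tilde)])           # correlation of Eq. (23)
    m1_hat, md_hat = np.linalg.solve(gram, rhs)
    return m1_hat, md_hat
```

In a noiseless check with $s = (1 + m_1)r + m_d$ on the window, the solve recovers $m_1$ and $m_d$ exactly whenever the Gram matrix is nonsingular (i.e., $r$ is not constant on the window).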
(24)
$$E[({\mathit{XW}}_{j}{)}^{2}]=\{[r(i-j)+{\hat{m}}_{1}(j)r(i-j)+{\hat{m}}_{d}(j){]}^{2}+{\sigma}_{1}^{2}{r}^{2}(i-j)+{\sigma}_{d}^{2}{\}}_{[i:w(i-j)=1]}.$$
(25)
$$\left[\begin{array}{cc}\Vert r{\Vert}_{4}^{4}& \Vert r{\Vert}_{2}^{2}\\ \Vert r{\Vert}_{2}^{2}& {n}_{w}\end{array}\right]\left[\begin{array}{c}{\hat{\sigma}}_{1}^{2}(j)\\ {\hat{\sigma}}_{d}^{2}(j)\end{array}\right]=\left[\begin{array}{c}A(j)\\ B(j)\end{array}\right],$$
(26)
$$A(j)=\sum _{i:w(i-j)=1}[{\tilde{s}}_{j}(i)-{\hat{m}}_{1}(j)r(i-j)-{\hat{m}}_{d}(j){]}^{2}{r}^{2}(i-j),$$
(27)
$$B(j)=\sum _{i:w(i-j)=1}[{\tilde{s}}_{j}(i)-{\hat{m}}_{1}(j)r(i-j)-{\hat{m}}_{d}(j){]}^{2}.$$
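Eqs. (25)-(27) give a second 2×2 solve, now matching second moments to obtain the variance estimates. In the sketch below the window residual is centered as $\tilde{s}_j - \hat{m}_1 r - \hat{m}_d$, the sign that makes the squared residual's expectation equal ${\sigma}_1^2 r^2 + {\sigma}_d^2$ per Eq. (18); arrays and names are ours:

```python
import numpy as np

def estimate_variances(s, r, w, j, m1_hat, md_hat):
    """Solve the 2x2 system of Eq. (25) for (sigma1^2, sigma_d^2) at shift j.

    m1_hat, md_hat are the mean estimates from the Eq. (19) solve.
    """
    idx = np.nonzero(w)[0]
    s_tilde = s[idx + j] - r[idx]                 # Eq. (20) on the window
    resid = s_tilde - m1_hat * r[idx] - md_hat    # centered window samples
    A = np.sum(resid ** 2 * r[idx] ** 2)          # Eq. (26)
    B = np.sum(resid ** 2)                        # Eq. (27)
    M = np.array([[np.sum(r[idx] ** 4), np.sum(r[idx] ** 2)],
                  [np.sum(r[idx] ** 2), idx.size]])   # matrix of Eq. (25)
    s1_sq, sd_sq = np.linalg.solve(M, np.array([A, B]))
    return s1_sq, sd_sq
```

As a sanity check, a noiseless scene with exact mean estimates drives both residual moments, and hence both variance estimates, to zero.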
(28)
$$\hat{m}(j)=\frac{1}{{n}_{0}}\sum _{i:w(i-j)=0}s(i),$$
(29)
$${\hat{\sigma}}^{2}(j)=\frac{1}{{n}_{0}}\sum _{i:w(i-j)=0}[s(i)-\hat{m}(j){]}^{2},$$
(30)
$${\lambda}_{j}=-{n}_{0}[{\hat{\sigma}}^{2}(j){]}^{p}-\sum _{i:w(i-j)=1}|{r}^{2}(i-j)w(i-j){\hat{\sigma}}_{1}^{2}(j)+{\hat{\sigma}}_{d}^{2}(j){|}^{p},$$
(31)
$${\lambda}_{j}=-{n}_{0}\log [{\hat{\sigma}}^{2}(j)]-\sum _{i:w(i-j)=1}\log [{r}^{2}(i-j){\hat{\sigma}}_{1}^{2}(j)+{\hat{\sigma}}_{d}^{2}(j)].$$
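Eqs. (28)-(31) combine the off-window background statistics with the per-pixel window variances into the decision metric ${\lambda}_{j}$, maximized over $j$. A 1-D sketch (names ours), assuming the shifted window fits inside the scene, using the log form for $p = 0$:

```python
import numpy as np

def lambda_metric(s, r, w, j, s1_sq, sd_sq, p=0):
    """Decision metric lambda_j of Eqs. (30) and (31) at shift j.

    s1_sq, sd_sq are the estimated multiplicative- and detector-noise
    variances (e.g., from the Eq. (25) solve at this shift).
    """
    idx = np.nonzero(w)[0]
    on = idx + j                                   # scene indices, w(i-j) = 1
    mask = np.ones(s.size, dtype=bool)
    mask[on] = False
    bg = s[mask]                                   # samples with w(i-j) = 0
    n0 = bg.size
    sigma_bg_sq = np.mean((bg - bg.mean()) ** 2)   # Eqs. (28)-(29)
    var_on = r[idx] ** 2 * s1_sq + sd_sq           # per-pixel window variance
    if p == 0:
        return -n0 * np.log(sigma_bg_sq) - np.sum(np.log(var_on))  # Eq. (31)
    return -n0 * sigma_bg_sq ** p - np.sum(np.abs(var_on) ** p)    # Eq. (30)
```

In a full receiver one would evaluate this at every candidate shift $j$ and declare the target at the maximizing location.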
(32)
$${Y}_{b}=0.06{X}_{1}^{2}+0.12\sqrt{3}{X}_{2}+0.07,$$
(33)
$${Y}_{m}=0.64+0.16{X}_{3}^{2},$$
(34)
$${Y}_{a}=0.2+0.06({X}_{4}^{2}-1)+0.12\sqrt{3}{X}_{5},$$
(35)
$${Y}_{m}=0.5+0.2{X}_{1}+0.8\sqrt{3}{X}_{2},$$
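Eqs. (32)-(35) specify the noise models used in the simulations (background, multiplicative, additive, and a second multiplicative model). The distribution of the ${X}_{k}$ is not stated in this excerpt; the sketch below assumes i.i.d. zero-mean, unit-variance variates (standard normal here, though the $\sqrt{3}$ factors suggest unit-variance uniform variates may have been intended):

```python
import numpy as np

def noise_samples(n, rng):
    """Draw n samples from each of the noise models of Eqs. (32)-(35).

    Assumption: X_k are i.i.d. standard normal. Any zero-mean,
    unit-variance choice gives the same first two moments.
    """
    X = rng.standard_normal((5, n))
    Yb = 0.06 * X[0] ** 2 + 0.12 * np.sqrt(3) * X[1] + 0.07        # Eq. (32)
    Ym = 0.64 + 0.16 * X[2] ** 2                                   # Eq. (33)
    Ya = 0.2 + 0.06 * (X[3] ** 2 - 1) + 0.12 * np.sqrt(3) * X[4]   # Eq. (34)
    Ym2 = 0.5 + 0.2 * X[0] + 0.8 * np.sqrt(3) * X[1]               # Eq. (35)
    return Yb, Ym, Ya, Ym2
```

Under this assumption the model means are $E[{Y}_{b}] = 0.13$, $E[{Y}_{m}] = 0.8$, $E[{Y}_{a}] = 0.2$, and $E[{Y}_{m}] = 0.5$ for the Eq. (35) variant.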