## Abstract

We develop algorithms to detect a known pattern or reference signal in the presence of additive, disjoint-background, and multiplicative white Gaussian noise with unknown statistics. The presence of three different types of noise process with unknown statistics makes estimating the unknown parameters difficult. Standard methods such as expectation-maximization-type algorithms are iterative and, in the framework of hypothesis testing, time-consuming, because a set of parameters must be estimated for each hypothesis. Other standard methods, such as setting the gradient of the likelihood function with respect to the unknown parameters to zero, lead to a nonlinear system of equations that has no closed-form solution and requires iterative methods. We develop an approach that overcomes these handicaps and derive algorithms to detect a known object. We present new methods to estimate the unknown parameters within the framework of hypothesis testing. These methods are direct and provide closed-form estimates of the unknown parameters. Computer simulations show that, for the images tested, the receivers we have designed perform better than existing receivers.
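To make the setting concrete, here is a minimal NumPy sketch of the discrete observation model of Eq. (2), with the multiplicative noise written as $n_r = 1 + n_1$ as in Eq. (3). The signal shape, the window, the noise statistics, and the helper name `observe` are all illustrative assumptions; in the paper these statistics are unknown to the receiver.

```python
import numpy as np

# Hypothetical 1-D reference signal r and its support window w
# (w = 1 on the object, 0 elsewhere). Values are made up for illustration.
n = 64
r = np.zeros(n)
r[20:30] = np.linspace(0.5, 1.5, 10)
w = (r != 0).astype(float)

# Illustrative noise statistics (unknown to the receiver in the paper):
m1, s1 = 0.2, 0.1   # multiplicative noise n_1: mean, std
mb, sb = 0.5, 0.3   # disjoint background noise n_b
md, sd = 0.1, 0.1   # additive detector noise n_d

def observe(j, rng):
    """Draw one realization of s(i) under hypothesis H_j (model of Eq. (2))."""
    rj, wj = np.roll(r, j), np.roll(w, j)   # r(i - j), w(i - j)
    n_r = 1.0 + rng.normal(m1, s1, n)       # n_r = 1 + n_1, as in Eq. (3)
    n_b = rng.normal(mb, sb, n)
    n_d = rng.normal(md, sd, n)
    return n_r * rj * wj + n_b * (1.0 - wj) + n_d

s = observe(5, np.random.default_rng(0))
```

The background noise enters only where the shifted window is zero, and the multiplicative noise only on the object support, which is what makes the three noise processes separable in the derivation that follows.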

© 2001 Optical Society of America

### Equations (32)


(1)
$${H}_{j}:s(t)=[{n}_{r}(t-{t}_{j})r(t-{t}_{j})]w(t-{t}_{j})+{n}_{b}(t)[1-w(t-{t}_{j})]+{n}_{d}(t).$$
(2)
$${H}_{j}:s(i)=[{n}_{r}(i-j)r(i-j)]w(i-j)+{n}_{b}(i)[1-w(i-j)]+{n}_{d}(i).$$
(3)
$$s(i)=[{n}_{1}(i-j)r(i-j)+r(i-j)]w(i-j)+{n}_{b}(i)[1-w(i-j)]+{n}_{d}(i).$$
(5)
$$=\frac{1}{(2\pi {)}^{n/2}}\prod _{i:w(i-j)=1}[{\sigma}_{1}^{2}{r}^{2}(i-j)+{\sigma}_{d}^{2}]^{-1/2}$$
(6)
$$\times \exp\left\{-\sum _{i:w(i-j)=1}\frac{[s(i)-r(i-j){m}_{1}-r(i-j)-{m}_{d}]^{2}w(i-j)}{2[{r}^{2}(i-j){\sigma}_{1}^{2}+{\sigma}_{d}^{2}]}\right\}\times \prod _{i:w(i-j)=0}({\sigma}_{b}^{2}+{\sigma}_{d}^{2})^{-1/2}\times \exp\left\{-\sum _{i:w(i-j)=0}\frac{[s(i)-{m}_{b}-{m}_{d}]^{2}}{2({\sigma}_{b}^{2}+{\sigma}_{d}^{2})}\right\},$$
(7)
$$s(i)w(i-j)=[r(i-j){n}_{1}(i-j)+r(i-j)+{n}_{d}(i)]\times w(i-j),$$
(9)
$${X}_{j}=\{r(i-j){n}_{1}(i-j)+r(i-j)+{n}_{d}(i){\}}_{[i:w(i-j)=1]}.$$
(10)
$$s(i)={n}_{b}(i)+{n}_{d}(i),\hspace{1em}\hspace{1em}w(i-j)=0.$$
(11)
$${Y}_{j}=\{{n}_{b}(i)+{n}_{d}(i){\}}_{[i:w(i-j)=0]}.$$
(12)
$${\widehat{X}}_{j}={S}_{j}=\{s(i){\}}_{[i:w(i-j)=1]},$$
(13)
$$E({X}_{j})=\{r(i-j){m}_{1}+r(i-j)+{m}_{d}{\}}_{[i:w(i-j)=1]},$$
(14)
$$\left[\begin{array}{cc}\Vert r{\Vert}_{2}^{2}& \overline{r}\\ \overline{r}& {n}_{w}\end{array}\right]\left[\begin{array}{c}{\widehat{m}}_{1}(j)\\ {\widehat{m}}_{d}(j)\end{array}\right]=\left[\begin{array}{c}{\tilde{s}}_{j}*\mathit{wr}(j)\\ {\tilde{s}}_{j}*w(j)\end{array}\right],$$
(15)
$${\tilde{s}}_{j}(i)=[s(i)-r(i-j)]w(i-j),$$
(16)
$$\overline{r}=\sum _{i:r(i)\ne 0}r(i),\hspace{1em}\hspace{1em}\Vert r{\Vert}_{p}={\left[\sum _{i:r(i)\ne 0}|r(i){|}^{p}\right]}^{1/p},$$
(17)
$${\tilde{s}}_{j}*\mathit{wr}(j)=\sum _{i:w(i-j)=1}{\tilde{s}}_{j}(i)r(i-j)w(i-j),$$
(18)
$${\tilde{s}}_{j}*w(j)=\sum _{i:w(i-j)=1}{\tilde{s}}_{j}(i)w(i-j).$$
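The 2×2 linear system of Eq. (14), with the quantities defined in Eqs. (15)-(18), yields the mean estimates in closed form. A minimal NumPy sketch follows; the helper name `estimate_means` and the shift convention via `np.roll` are my assumptions, not the paper's notation. Note the system is singular if $r$ is constant on its support.

```python
import numpy as np

def estimate_means(s, r, w, j):
    """Closed-form estimates (m1_hat(j), md_hat(j)) from the 2x2 system of Eq. (14)."""
    rj, wj = np.roll(r, j), np.roll(w, j)       # r(i - j), w(i - j)
    s_tilde = (s - rj) * wj                     # Eq. (15)
    r_bar = (rj * wj).sum()                     # Eq. (16): sum of r over its support
    r2 = (rj**2 * wj).sum()                     # ||r||_2^2
    n_w = wj.sum()                              # number of pixels in the window
    A = np.array([[r2, r_bar],
                  [r_bar, n_w]])
    b = np.array([(s_tilde * rj * wj).sum(),    # Eq. (17)
                  (s_tilde * wj).sum()])        # Eq. (18)
    return np.linalg.solve(A, b)                # [m1_hat(j), md_hat(j)]
```

On noiseless data generated from the model with known means, the estimator recovers them exactly, since the normal equations are then solved by the true values.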
(19)
$$\hat{m}(j)=\frac{1}{{n}_{o}}\sum _{i:w(i-j)=0}s(i).$$
(20)
$$\widehat{\mathrm{Var}}({x}_{i})=[{\tilde{s}}_{j}(i)-{\widehat{m}}_{1}(j)r(i-j)-{\widehat{m}}_{d}(j)]^{2}w(i-j),$$
(22)
$$\mathrm{Var}({X}_{j})=\{{\sigma}_{1}^{2}{r}^{2}(i-j)w(i-j)+{\sigma}_{d}^{2}{\}}_{[i:w(i-j)=1]}.$$
(23)
$$\left[\begin{array}{cc}\Vert r{\Vert}_{4}^{4}& \Vert r{\Vert}_{2}^{2}\\ \Vert r{\Vert}_{2}^{2}& {n}_{w}\end{array}\right]\left[\begin{array}{c}{\widehat{\sigma}}_{1}^{2}(j)\\ {\widehat{\sigma}}_{d}^{2}(j)\end{array}\right]=\left[\begin{array}{c}A(j)\\ B(j)\end{array}\right],$$
(24)
$$A(j)=\sum _{i:w(i-j)=1}[{\tilde{s}}_{j}(i)-{\widehat{m}}_{1}(j)r(i-j)-{\widehat{m}}_{d}(j)]^{2}{r}^{2}(i-j),$$
(25)
$$B(j)=\sum _{i:w(i-j)=1}[{\tilde{s}}_{j}(i)-{\widehat{m}}_{1}(j)r(i-j)-{\widehat{m}}_{d}(j)]^{2}.$$
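The variance estimates come from the analogous 2×2 system of Eq. (23), with right-hand side given by Eqs. (24)-(25). A hedged NumPy sketch, continuing the conventions above (`estimate_variances` is my name; the residuals follow the sign convention of Eq. (29)):

```python
import numpy as np

def estimate_variances(s, r, w, j, m1_hat, md_hat):
    """Closed-form estimates (sigma1^2_hat(j), sigmad^2_hat(j)) from Eq. (23)."""
    rj, wj = np.roll(r, j), np.roll(w, j)
    s_tilde = (s - rj) * wj                                  # Eq. (15)
    # Squared residuals on the window, as in Eqs. (20), (24), (25):
    resid2 = ((s_tilde - m1_hat * rj - md_hat) ** 2) * wj
    r2 = (rj**2 * wj).sum()                                  # ||r||_2^2
    r4 = (rj**4 * wj).sum()                                  # ||r||_4^4
    n_w = wj.sum()
    M = np.array([[r4, r2],
                  [r2, n_w]])
    rhs = np.array([(resid2 * rj**2).sum(),                  # A(j), Eq. (24)
                    resid2.sum()])                           # B(j), Eq. (25)
    return np.linalg.solve(M, rhs)   # [sigma1^2_hat(j), sigmad^2_hat(j)]
```

When the squared residuals happen to lie exactly in the span $\{r^2(i-j),\,1\}$, the system is solved exactly, which gives a convenient sanity check.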
(26)
$${\widehat{\sigma}}^{2}(j)=\frac{1}{{n}_{o}}\sum _{i:w(i-j)=0}[s(i)-\hat{m}(j)]^{2},$$
(27)
$$\log P(s|{H}_{j})=-({C}_{j}+{D}_{j}+{E}_{j}),$$
(28)
$${C}_{j}=\sum _{i:w(i-j)=1}\log[{r}^{2}(i-j){\widehat{\sigma}}_{1}^{2}(j)+{\widehat{\sigma}}_{d}^{2}(j)],$$
(29)
$${D}_{j}=\sum _{i:w(i-j)=1}\frac{[s(i)-r(i-j){\widehat{m}}_{1}(j)-r(i-j)-{\widehat{m}}_{d}(j)]^{2}}{{r}^{2}(i-j){\widehat{\sigma}}_{1}^{2}(j)+{\widehat{\sigma}}_{d}^{2}(j)},$$
(30)
$${E}_{j}=\frac{{n}_{o}}{2}\log[{\widehat{\sigma}}^{2}(j)].$$
(31)
$${\lambda}_{j}=-{C}_{j}-{D}_{j}-{E}_{j},$$
(32)
$${\lambda}_{j}=-{C}_{j}-{E}_{j},$$
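Putting Eqs. (28)-(31) together, the receiver evaluates the statistic $\lambda_j$ for each candidate shift $j$ and declares the hypothesis with the largest value; Eq. (32) is the simplified variant that drops $D_j$. A minimal sketch, assuming the parameter estimates have already been computed (the function name `lambda_j` is mine):

```python
import numpy as np

def lambda_j(s, r, w, j, m1, md, v1, vd, vo):
    """Detection statistic of Eq. (31): lambda_j = -(C_j + D_j + E_j).

    m1, md, v1, vd, vo stand for the estimates m1_hat(j), md_hat(j),
    sigma1^2_hat(j), sigmad^2_hat(j), sigma^2_hat(j) of Eqs. (14)-(26).
    """
    rj, wj = np.roll(r, j), np.roll(w, j)
    inside = wj == 1
    var_in = rj[inside] ** 2 * v1 + vd                        # per-pixel variance on the window
    C = np.log(var_in).sum()                                  # Eq. (28)
    D = ((s[inside] - rj[inside] * m1 - rj[inside] - md) ** 2
         / var_in).sum()                                      # Eq. (29)
    E = 0.5 * (~inside).sum() * np.log(vo)                    # Eq. (30), n_o = #{w(i-j)=0}
    return -(C + D + E)

# Decision rule: compute lambda_j for every shift j and pick the argmax.
```

The simplified receiver of Eq. (32) is obtained by omitting the `D` term in the return value.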