Effect of Time Delay on Binary Signal Detection via a Bistable System
Abstract
The effect of time delay on binary signal detection via a bistable system in the presence of white or colored Gaussian noise is investigated. By defining the bit error rate based on the solution of the approximate Fokker–Planck equation, the detector performance is analyzed theoretically and verified by Monte Carlo simulation. It is shown that, when the system parameter or the noise intensity is optimally chosen, an increasing time delay generally improves the system performance. It is also shown that the system performance becomes more difficult to predict accurately as the time delay and the noise correlation time grow. This may inspire more thorough investigations into the cooperative effects of a nonlinear system and time delay on signal processing.
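To illustrate the kind of Monte Carlo verification described in the abstract, the sketch below simulates a delayed overdamped bistable system driven by a bipolar binary signal plus white Gaussian noise and estimates the bit error rate empirically. The model form dx/dt = a·x(t−τ) − b·x³(t−τ) + s(t) + ξ(t), all parameter values, and the sign-of-the-mean decision rule are illustrative assumptions common in the stochastic-resonance literature; the paper's exact model, parameters, and detector may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative parameters (not taken from the paper) ---
a, b = 1.0, 1.0          # bistable potential parameters
A = 0.3                  # binary signal amplitude (sub-threshold)
D = 0.5                  # noise intensity
tau = 0.5                # feedback time delay
dt = 1e-3                # integration step
T_bit = 2.0              # bit duration
n_bits = 2000            # number of transmitted bits

steps_per_bit = int(T_bit / dt)
delay_steps = int(tau / dt)

bits = rng.integers(0, 2, n_bits)                     # random binary sequence
s = np.repeat(2.0 * bits - 1.0, steps_per_bit) * A    # bipolar signal s(t)

n = s.size
x = np.zeros(n + 1)
noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(n)

# Euler-Maruyama integration of the assumed delayed bistable SDE:
#   dx = [a*x(t - tau) - b*x(t - tau)^3 + s(t)] dt + sqrt(2D) dW
for k in range(n):
    xd = x[k - delay_steps] if k >= delay_steps else 0.0  # history assumed zero
    x[k + 1] = x[k] + (a * xd - b * xd**3 + s[k]) * dt + noise[k]

# decide each bit from the sign of the time-averaged output over its interval
y = x[1:].reshape(n_bits, steps_per_bit).mean(axis=1)
decided = (y > 0).astype(int)
ber = np.mean(decided != bits)
print(f"empirical bit error rate: {ber:.4f}")
```

Sweeping the assumed noise intensity D or delay tau in this sketch and plotting the resulting bit error rate is one way to reproduce, qualitatively, the kind of performance curves the paper compares against its Fokker–Planck-based prediction.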
Article Text
About This Article
Cite this article:
ZENG Ling-Zao, LIU Bing-Yang, XU Yi-Da, LI Jian-Long. Effect of Time Delay on Binary Signal Detection via a Bistable System[J]. Chin. Phys. Lett., 2014, 31(2): 020501. DOI: 10.1088/0256-307X/31/2/020501