Effect of Time Delay on Binary Signal Detection via a Bistable System

The effect of time delay on binary signal detection via a bistable system in the presence of white or colored Gaussian noise is investigated. By defining the bit error rate based on the solution of the approximated Fokker–Planck equation, the detector performance is analyzed theoretically and verified by Monte Carlo simulation. It is shown that, when the system parameter or noise intensity is chosen optimally, increasing the time delay generally improves the system performance. It is also shown that accurately predicting the system performance becomes more difficult as the time delay and correlation time grow. This may inspire more thorough investigations into the cooperative effects of nonlinear systems and time delay on signal processing.
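As a rough illustration of the Monte Carlo verification mentioned above, the sketch below simulates one common form of a time-delayed bistable detector driven by an antipodal binary signal plus Gaussian white noise, and estimates the bit error rate by thresholding the state at the end of each bit interval. The delayed drift a·x(t−τ) − b·x³(t−τ), all parameter values, and the end-of-bit sign decision are assumptions chosen for illustration, not the paper's exact model or settings; the colored-noise case is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not taken from the paper)
a, b = 1.0, 1.0        # bistable drift coefficients
A = 0.3                # binary signal amplitude
D = 0.5                # white Gaussian noise intensity
tau = 0.5              # time delay
dt = 1e-3              # Euler-Maruyama time step
T_bit = 2.0            # duration of one bit
n_steps = int(T_bit / dt)
n_delay = max(int(tau / dt), 1)
n_bits = 500           # Monte Carlo sample size

bits = rng.integers(0, 2, n_bits)      # random binary message
s = np.where(bits == 1, A, -A)         # antipodal signal levels

# Circular buffer holding the trajectory over the last tau seconds,
# initialised at rest (x = 0 for t <= 0).
buf = np.zeros(n_delay)
ptr = 0
x = 0.0
errors = 0
for k in range(n_bits):
    for _ in range(n_steps):
        x_del = buf[ptr]               # x(t - tau): oldest stored sample
        drift = a * x_del - b * x_del**3 + s[k]
        x += drift * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        buf[ptr] = x                   # overwrite oldest sample with x(t)
        ptr = (ptr + 1) % n_delay
    # Decide the bit from the sign of x at the end of the bit interval.
    errors += int((x > 0.0) != (bits[k] == 1))

print(f"estimated bit error rate: {errors / n_bits:.4f}")
```

Sweeping tau, D, or a in this sketch and re-estimating the bit error rate would mimic, in spirit, the kind of performance curves the paper compares against its Fokker–Planck-based theory.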