A stochastic control problem

Date

2004-11-23

Authors

Margulies, William
Zes, Dean

Journal Title

Electronic Journal of Differential Equations
Publisher

Texas State University-San Marcos, Department of Mathematics

Abstract

In this paper, we study a specific stochastic differential equation depending on a parameter and obtain a representation of its probability density function in terms of Jacobi functions. The equation arose in a control problem with a quadratic performance criterion. The quadratic criterion is used to eliminate the control in the standard Hamilton-Jacobi variational technique. The resulting stochastic differential equation has a noise amplitude that complicates the solution. We then solve Kolmogorov's partial differential equation for the probability density function by using Jacobi functions. A particular value of the parameter makes the solution a martingale, and in this case we prove that the solution goes to zero almost surely as time tends to infinity.
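As general background for the density representation mentioned above (a generic sketch only; the paper's specific drift b and parameter-dependent noise amplitude sigma are not reproduced here), the probability density p(t, x) of a scalar diffusion dX_t = b(X_t) dt + sigma(X_t) dW_t satisfies the forward Kolmogorov (Fokker-Planck) equation

\[
\frac{\partial p}{\partial t}(t,x)
  = -\frac{\partial}{\partial x}\bigl[\,b(x)\,p(t,x)\,\bigr]
    + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[\,\sigma^{2}(x)\,p(t,x)\,\bigr],
\]

and it is this type of equation that the paper solves by expanding in Jacobi functions.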

Keywords

Stochastic differential equations, Control problems, Jacobi functions

Citation

Margulies, W., & Zes, D. (2004). A stochastic control problem. Electronic Journal of Differential Equations, 2004(135), pp. 1-10.

Rights

Attribution 4.0 International
