Stochastic differential games have been used as models for a wide variety of physical systems. These games arise as a natural evolution of certain stochastic control problems. Two well-known methods for finding optimal control strategies in a stochastic differential game are solving the Hamilton-Jacobi-Isaacs equations, which are nonlinear partial differentia...
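As context for the abstract above, a minimal sketch of the Hamilton-Jacobi-Isaacs equation for a two-player zero-sum stochastic differential game is given below; the dynamics $f$, running cost $L$, diffusion $\sigma$, and terminal cost $g$ are generic placeholders, not taken from the talk itself.

```latex
% State dynamics (controls u for the minimizer, v for the maximizer):
%   dX_t = f(t, X_t, u_t, v_t)\,dt + \sigma(t, X_t)\,dW_t
% The upper value function V(t,x) formally satisfies the HJI equation
\[
\frac{\partial V}{\partial t}
+ \min_{u} \max_{v} \Big\{ f(t,x,u,v) \cdot \nabla_x V + L(t,x,u,v) \Big\}
+ \tfrac{1}{2} \operatorname{tr}\!\big( \sigma \sigma^{\top} D_x^2 V \big) = 0,
\qquad V(T,x) = g(x),
\]
% with the Isaacs condition (min-max = max-min) ensuring a game value exists.
```

This nonlinear PDE is one of the two solution methods named in the abstract; the truncated text presumably goes on to describe the second.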
Creator:
Duncan, Tyrone (University of Kansas)
Created:
2018-05-11
Contributed By:
University of Minnesota, Institute for Mathematics and its Applications.