Learning mean field games with discounted and average costs

buir.contributor.authorSaldı, Naci
buir.contributor.orcidSaldı, Naci|0000-0002-2677-7366
dc.citation.epage17-59
dc.citation.spage17-1
dc.citation.volumeNumber24
dc.contributor.authorAnahtarcı, B.
dc.contributor.authorKarıksız, C. D.
dc.contributor.authorSaldı, Naci
dc.contributor.editorAlekh Agarwal
dc.date.accessioned2024-03-14T06:50:22Z
dc.date.available2024-03-14T06:50:22Z
dc.date.issued2023-12-16
dc.departmentDepartment of Mathematics
dc.description.abstractWe consider learning approximate Nash equilibria for discrete-time mean-field games with stochastic nonlinear state dynamics subject to both average and discounted costs. To this end, we introduce a mean-field equilibrium (MFE) operator, whose fixed point is a mean-field equilibrium, i.e., an equilibrium in the infinite-population limit. We first prove that this operator is a contraction, and we propose a learning algorithm that computes an approximate mean-field equilibrium by approximating the MFE operator with a random one. Moreover, using the contraction property of the MFE operator, we provide an error analysis of the proposed learning algorithm. We then show that the learned mean-field equilibrium constitutes an approximate Nash equilibrium for finite-agent games.
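As a rough illustration of the contraction-based error analysis mentioned in the abstract (the symbols H, Ĥ, ρ, ε, and x* below are generic placeholders, not the paper's notation): if the MFE operator H is a ρ-contraction with fixed point x*, and each iteration applies an approximation Ĥ satisfying ‖Ĥx − Hx‖ ≤ ε for all x, then the iterates x_{k+1} = Ĥ x_k satisfy
\[
\|x_{k} - x^{*}\| \;\le\; \rho^{k}\,\|x_{0} - x^{*}\| \;+\; \frac{\varepsilon}{1-\rho},
\]
so the per-step approximation error accumulates only up to a factor of 1/(1 − ρ). This is a standard sketch of how such bounds are typically derived, not the paper's exact statement or constants.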
dc.identifier.eissn1533-7928
dc.identifier.issn1532-4435
dc.identifier.urihttps://hdl.handle.net/11693/114716
dc.language.isoen
dc.publisherJournal of Machine Learning Research
dc.rightsCC BY 4.0
dc.source.titleJournal of Machine Learning Research
dc.subjectMean-field games
dc.subjectApproximate Nash equilibrium
dc.subjectFitted Q-iteration algorithm
dc.subjectDiscounted-cost
dc.subjectAverage-cost
dc.titleLearning mean field games with discounted and average costs
dc.typeArticle
Files

Original bundle
Name: Learning_mean_field_games_with_discounted_and_average
Size: 656.71 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2.01 KB
Description: Item-specific license agreed upon to submission