Title:
From POMDP executions to policy specifications
Contributors:
Meli, Daniele, Mazzi, Giulio, Castellini, Alberto, Farinelli, Alessandro
Publication Year:
2022
Collection:
Università degli Studi di Verona: Catalogo dei Prodotti della Ricerca (IRIS)
Document Type:
Conference object
Language:
English
Relation:
In: CEUR Workshop Proceedings, vol. 3311: 4th Workshop on Artificial Intelligence and Formal Verification, Logic, Automata, and Synthesis, hosted by the 21st International Conference of the Italian Association for Artificial Intelligence (AIxIA 2022), pp. 93–98 (6 pages); https://hdl.handle.net/11562/1093986
Rights:
info:eu-repo/semantics/openAccess
Accession Number:
edsbas.5F98BAD
Database:
BASE
Abstract:
Partially Observable Markov Decision Processes (POMDPs) model systems with uncertain state by maintaining probability distributions over states, called beliefs. In complex domains, however, POMDP solvers must explore large belief spaces, which quickly becomes computationally intractable. One remedy is to guide exploration with domain knowledge expressed as logic specifications, but defining effective specifications can be challenging even for domain experts. We propose an approach based on inductive logic programming that learns specifications, together with associated confidence levels, from observed POMDP executions. We show that the learning approach converges to robust specifications as the number of examples increases.
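To make the notion of a belief concrete, the following is a minimal sketch of the standard Bayesian belief update used in POMDPs: after taking an action and receiving an observation, the belief b(s) is revised as b'(s') ∝ O(o | s') · Σ_s T(s' | s) · b(s). The two-state transition and observation matrices below are invented purely for illustration and are not taken from the paper.

```python
def belief_update(belief, T, O, obs):
    """Bayesian POMDP belief update:
    b'(s') ∝ O[s'][obs] * sum_s T[s][s'] * b[s], then normalize."""
    n = len(belief)
    new_b = [O[s2][obs] * sum(T[s][s2] * belief[s] for s in range(n))
             for s2 in range(n)]
    total = sum(new_b)
    return [x / total for x in new_b]

# Toy two-state model (all numbers assumed, for illustration only):
# T[s][s'] is the transition probability, O[s'][o] the observation probability.
T = [[0.9, 0.1],
     [0.2, 0.8]]
O = [[0.7, 0.3],   # state 0 mostly emits observation 0
     [0.2, 0.8]]   # state 1 mostly emits observation 1

b = [0.5, 0.5]                      # uniform prior belief
b = belief_update(b, T, O, obs=0)   # observing 0 shifts belief toward state 0
```

Each execution step of a POMDP policy performs one such update; the learning approach described above mines logic specifications from traces of these belief/action pairs.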