
Title:
Cross-Entropy and Relative Entropy of Basic Belief Assignments
Contributors:
DTIS, ONERA, Université Paris Saclay [Palaiseau], ONERA-Université Paris-Saclay
Source:
International Conference on Information Fusion (FUSION 2023), pp. 1-8
Publisher Information:
HAL CCSD; IEEE, 2023.
Publication Year:
2023
Collection:
collection:ONERA
collection:UNIV-PARIS-SACLAY
collection:UNIVERSITE-PARIS-SACLAY
collection:GS-COMPUTER-SCIENCE
collection:DTIS_ONERA
Original Identifier:
HAL: hal-04628203
Document Type:
Conference paper
Language:
English
Relation:
info:eu-repo/semantics/altIdentifier/doi/10.23919/FUSION52260.2023.10224207
DOI:
10.23919/FUSION52260.2023.10224207
Rights:
info:eu-repo/semantics/OpenAccess
Accession Number:
edshal.hal.04628203v1
Database:
HAL

Further Information

This paper introduces the concepts of cross-entropy and relative entropy of two basic belief assignments, building on the entropy measure of basic belief assignments introduced recently. We prove that this cross-entropy satisfies a generalized Gibbs-like inequality, from which a generalized Kullback-Leibler divergence measure can be established in the framework of belief functions. We show, on a simple illustrative example, how these concepts can be used for decision-making under uncertainty.
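
For orientation, the classical probabilistic quantities that the paper generalizes to basic belief assignments are recalled below; the exact belief-function counterparts are defined in the paper itself and are not reproduced here. For discrete distributions p and q on the same frame:

H(p, q) = -\sum_i p_i \log q_i
D_{KL}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i} = H(p, q) - H(p) \ge 0

The non-negativity of D_{KL} is Gibbs' inequality, with equality if and only if p = q. The paper establishes an analogous inequality and divergence measure directly for basic belief assignments, based on its recently introduced entropy measure.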