Title:
Topological Inference via Meshing
Contributors:
Toyota Technological Institute at Chicago (TTIC); Computer Science Department, Carnegie Mellon University; University of Pittsburgh (PITT), Pennsylvania Commonwealth System of Higher Education (PCSHE); Geometric computing (GEOMETRICA), Centre Inria d'Université Côte d'Azur; Centre Inria de Saclay; Institut National de Recherche en Informatique et en Automatique (Inria)
Source:
[Research Report] RR-7125, INRIA. 2009
Publisher Information:
CCSD, 2009.
Publication Year:
2009
Collection:
collection:INRIA
collection:INRIA-SOPHIA
collection:INRIA-RRRT
collection:INRIA-SACLAY
collection:INRIASO
collection:INRIA_TEST
collection:TESTALAIN1
collection:INRIA2
collection:LARA
collection:UNIV-COTEDAZUR
collection:INRIA-ETATSUNIS
Original Identifier:
HAL:
Document Type:
Report
Language:
English
Rights:
info:eu-repo/semantics/OpenAccess
Accession Number:
edshal.inria.00436891v3
Database:
HAL

Further Information

We apply ideas from mesh generation to improve the time and space complexities of computing the full persistent homological information associated with a point cloud $P$ in Euclidean space $\mathbb{R}^d$. Classical approaches rely on the Čech, Rips, $\alpha$-complex, or witness complex filtrations of $P$, whose complexities scale very badly with $d$. For instance, the $\alpha$-complex filtration incurs the $n^{\Omega(d)}$ size of the Delaunay triangulation, where $n$ is the size of $P$. The common alternative is to truncate the filtrations when the sizes of the complexes become prohibitive, possibly before discovering the most relevant topological features. In this paper we propose a new collection of filtrations, based on the Delaunay triangulation of a carefully chosen superset of $P$, whose sizes are reduced to $2^{O(d^2)}n$. A key property of these filtrations is that they are multiplicatively interleaved with the family of offsets of $P$, so that the persistence diagram of $P$ can be approximated in $2^{O(d^2)}n^3$ time in theory, with a near-linear observed running time in practice (ignoring the constant factors depending exponentially on $d$). Thus, our approach remains tractable in medium dimensions, say 4 to 10.
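To make the filtration-and-persistence pipeline mentioned in the abstract concrete, the sketch below computes only the simplest case: the 0-dimensional persistence diagram of the Rips filtration of a point cloud. It is not the paper's method (which builds Delaunay-based filtrations on a superset of $P$); it just illustrates what a persistence diagram of $P$ records. For the Rips filtration, connected components are all born at scale 0 and die when a minimum-spanning-tree edge merges them, so Kruskal's algorithm with a union-find suffices. The function name `rips_h0_diagram` and the example points are illustrative choices, not from the report.

```python
# 0-dimensional persistence of the Rips filtration of a point cloud.
# H0 pairs are (0, w) for each MST edge weight w, plus one essential
# class (0, inf) for the component that never dies.
import math
from itertools import combinations

def rips_h0_diagram(points):
    n = len(points)
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # Process all pairwise edges in order of increasing length (Kruskal).
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )
    diagram = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # this edge merges two components:
            parent[ri] = rj     # one of them dies at scale w
            diagram.append((0.0, w))
    diagram.append((0.0, math.inf))  # the last component persists forever
    return diagram

# Two well-separated clusters: the short bars reflect intra-cluster
# merges, and one long finite bar records the scale at which the
# clusters join -- the "relevant topological feature" at dimension 0.
pts = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
diagram = rips_h0_diagram(pts)
```

The full pipeline would also track higher-dimensional classes (loops, voids) via boundary-matrix reduction over the whole filtration; the abstract's point is that the size of that filtration, not the reduction, is the bottleneck in high dimension.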