Result: From open-vocabulary to vocabulary-free semantic segmentation

Title:
From open-vocabulary to vocabulary-free semantic segmentation
Source:
Pattern Recognition Letters. 198:14-21
Publication Status:
Preprint
Publisher Information:
Elsevier BV, 2025.
Publication Year:
2025
Document Type:
Academic Journal Article
Language:
English
ISSN:
0167-8655
DOI:
10.1016/j.patrec.2025.08.025
DOI (arXiv):
10.48550/arxiv.2502.11891
Rights:
CC BY
arXiv Non-Exclusive Distribution
Accession Number:
edsair.doi.dedup.....a7fdafcb4ae42a8096f3b83459f5f8f6
Database:
OpenAIRE

Further Information

Open-vocabulary semantic segmentation enables models to identify novel object categories beyond their training data. While this flexibility represents a significant advancement, current approaches still rely on manually specified class names as input, creating an inherent bottleneck in real-world applications. This work proposes a Vocabulary-Free Semantic Segmentation pipeline that eliminates the need for predefined class vocabularies. Specifically, we address the chicken-and-egg problem in which users must already know all potential objects in a scene to identify them, even though the purpose of segmentation is often to discover those objects. The proposed approach leverages Vision-Language Models to automatically recognize objects and generate appropriate class names, addressing both class specification and naming quality. Through extensive experiments on several public datasets, we highlight the crucial role of the text encoder in model performance, particularly when image class names are paired with generated descriptions. Although the segmentation text encoder's sensitivity to false negatives in the class-tagging process adds complexity to the task, we demonstrate that our fully automated pipeline significantly improves vocabulary-free segmentation accuracy across diverse real-world scenarios.
Submitted to: Pattern Recognition Letters. Klara Reichard and Giulia Rizzoli contributed equally to this work.
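The two-stage pipeline the abstract describes (a Vision-Language Model proposes class names from the image itself, and an open-vocabulary segmenter is then prompted with those names) can be sketched as below. This is a minimal structural sketch, not the authors' implementation: `tagger` and `segmenter` are placeholder callables standing in for a real VLM tagger and open-vocabulary segmentation model.

```python
from typing import Any, Callable, Dict, List


def vocabulary_free_segment(
    image: Any,
    tagger: Callable[[Any], List[str]],
    segmenter: Callable[[Any, List[str]], Dict[str, Any]],
) -> Dict[str, Any]:
    """Vocabulary-free segmentation as two stages (sketch):
    1. a vision-language tagger proposes class names from the image,
    2. an open-vocabulary segmenter is prompted with those names,
    so no user-supplied vocabulary is required."""
    class_names = tagger(image)  # e.g. a VLM captioner/tagger (assumption)
    if not class_names:
        raise ValueError("tagger produced no class names")
    # Normalize and de-duplicate tags while preserving order; noisy tags
    # (false negatives/positives) are a failure mode noted in the abstract.
    seen, vocab = set(), []
    for name in class_names:
        key = name.strip().lower()
        if key and key not in seen:
            seen.add(key)
            vocab.append(key)
    # The open-vocabulary segmenter consumes the generated vocabulary.
    return segmenter(image, vocab)
```

With stub functions in place of real models, e.g. `tagger = lambda img: ["cat", "grass"]` and a segmenter returning one mask per class name, the function returns a class-name-to-mask mapping without any predefined vocabulary.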