Semantics-driven best view of 3D shapes

Michela Mortara; Michela Spagnuolo
2009

Abstract

The problem of automatically selecting the pose of a 3D object that yields the most informative and intuitive view of its shape is known as the best view problem. In this paper we address best view selection driven by the meaningful features of the shape, in order to maximize the visibility of the components that are salient in the given context or from the application point of view. Meaningful features can be detected automatically by means of semantics-oriented segmentations: we tested several approaches, with very promising results in the automatic generation of thumbnails for large 3D model databases.
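The core idea of the abstract — choosing the viewpoint that maximizes the visibility of salient shape components — can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual method: it assumes per-face saliency weights are already available from a segmentation, scores each candidate view direction by the saliency-weighted area of front-facing triangles, and ignores occlusion entirely. All function and variable names are illustrative.

```python
# Hypothetical sketch: score candidate viewpoints by the visible area of
# salient mesh regions. Occlusion is ignored; back-facing triangles are
# culled via the dot product of face normal and view direction.

def view_score(view_dir, faces):
    """faces: list of (normal, area, saliency) tuples with unit normals.
    A face contributes its saliency-weighted projected area when it
    faces the camera (normal . view_dir > 0)."""
    return sum(area * sal * max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
               for normal, area, sal in faces)

def best_view(candidate_dirs, faces):
    """Return the candidate direction with the highest saliency score."""
    return max(candidate_dirs, key=lambda d: view_score(d, faces))

# Toy example: a highly salient patch facing +z, a low-saliency patch facing -z.
faces = [((0.0, 0.0, 1.0), 2.0, 1.0),   # salient patch, normal +z
         ((0.0, 0.0, -1.0), 2.0, 0.1)]  # background patch, normal -z
dirs = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0), (1.0, 0.0, 0.0)]
print(best_view(dirs, faces))  # the +z viewpoint, which sees the salient patch
```

In practice candidate directions would be sampled densely on a sphere around the model, and visibility would be computed with actual occlusion (e.g. by rendering), but the scoring structure stays the same.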
Istituto di Matematica Applicata e Tecnologie Informatiche - IMATI -
Keywords: Viewpoint selection; Semantics; Segmentation; 3D shapes
Files in this record:
  • prod_31420-doc_19994.pdf — published article, 916.73 kB, Adobe PDF (authorized users only)
  • prod_31420-doc_30122.pdf — special issue preface, 133.7 kB, Adobe PDF (authorized users only)

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/40719
Citations
  • Scopus: 54
  • Web of Science: 33