Please use this identifier to cite this record: https://hdl.handle.net/10316/40485
DC Field | Value | Language
dc.contributor.advisor | Gonçalves, Nuno Miguel Mendonça da Silva | -
dc.contributor.author | Ferreira, Rodrigo Miguel Belo Leal Toste | -
dc.date.accessioned | 2017-04-04T16:00:23Z | -
dc.date.available | 2017-04-04T16:00:23Z | -
dc.date.issued | 2016-02-24 | -
dc.identifier.uri | https://hdl.handle.net/10316/40485 | -
dc.description | Integrated Master's dissertation in Electrical and Computer Engineering presented to the Faculdade de Ciências e Tecnologia da Universidade de Coimbra | pt
dc.description.abstract | Light field cameras capture a scene's multi-directional light field in a single image, allowing the depth of the captured scene to be estimated and the image to be refocused after it has been taken. In this thesis, we introduce a fully automatic method for depth estimation from a single plenoptic image, running a RANSAC-like algorithm for feature matching. We filter the estimated depth points on both a global and a fine scale, allowing more accurate depth estimation. The novelty of our approach is the global method that back-projects correspondences found using photometric similarity to obtain a 3D virtual point cloud. We use a smart mixture of lenses with different focal lengths in a multiple-depth-map refining phase, generating a dense depth map. This depth map is then used to generate very high quality all-in-focus renders. We also introduce a new method for detecting and correcting highly blurred areas, which greatly improves the depth estimation of the scene and, consequently, the all-in-focus rendering as well. As far as the author knows, our algorithm is the first fully automatic (zero-intervention) method to process multi-focus plenoptic images. In previous work, a plenoptic data simulator was introduced that allows us to create plenoptic datasets with specific parameters. Knowing the depth ground truth of these datasets, we are able to test and improve our algorithm and provide guidelines for future work. Tests with simulated datasets and real images are presented and show very good accuracy of the presented method. We also compare our results with other methods, achieving results comparable to the state of the art with substantially less processing time. A short paper was submitted and accepted to Eurographics 2016, the 37th Annual Conference of the European Association for Computer Graphics, and a full paper was also submitted to ICCP 2016, the International Conference on Computational Photography. | pt
dc.language.iso | eng | pt
dc.rights | openAccess | pt
dc.subject | Câmeras Plenópticas | pt
dc.subject | campo de luz | pt
dc.subject | estimação de profundidade | pt
dc.subject | all in focus | pt
dc.subject | dados plenópticos simulados | pt
dc.subject | Raytrix | pt
dc.subject | Lytro | pt
dc.subject | Plenoptic cameras | pt
dc.subject | light field | pt
dc.subject | depth estimation | pt
dc.subject | all in focus | pt
dc.subject | synthetic plenoptic data | pt
dc.subject | Raytrix | pt
dc.subject | Lytro | pt
dc.title | A fully automatic depth estimation algorithm for multi-focus plenoptic cameras: coarse and dense approaches | pt
dc.type | masterThesis | pt
degois.publication.location | Coimbra | pt
dc.date.embargo | 2016-02-24 | *
dc.identifier.tid | 201673746 | pt
thesis.degree.name | Mestrado Integrado em Engenharia Electrotécnica e de Computadores | pt
uc.degree.grantorUnit | 0501 - Faculdade de Ciências e Tecnologia | por
uc.rechabilitacaoestrangeira | no | pt
uc.date.periodoEmbargo | 0 | pt
uc.controloAutoridade | Sim | -
item.openairetype | masterThesis | -
item.fulltext | Com Texto completo | -
item.languageiso639-1 | en | -
item.grantfulltext | open | -
item.cerifentitytype | Publications | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
crisitem.advisor.researchunit | ISR - Institute of Systems and Robotics | -
crisitem.advisor.parentresearchunit | University of Coimbra | -
crisitem.advisor.orcid | 0000-0002-1854-049X | -
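The abstract above mentions a RANSAC-like algorithm for feature matching. As a purely illustrative sketch of the RANSAC idea (iteratively fit a model to random minimal samples and keep the one with the most inliers) — not the thesis's actual depth-estimation pipeline, and with all function and parameter names hypothetical — a minimal 2D line-fitting version could look like:

```python
import random

def ransac_line(points, iters=200, thresh=0.1, seed=0):
    """Generic RANSAC sketch: fit y = m*x + b to 2D points.

    Repeatedly samples a minimal set (two points), fits a candidate
    line, counts inliers within `thresh`, and keeps the candidate with
    the most inliers. Illustrative only; the thesis applies the same
    hypothesize-and-verify idea to photometric feature matches that are
    back-projected into a 3D point cloud.
    """
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:  # vertical sample, cannot fit y = m*x + b
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (m * x + b)) < thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers

# Ten points on y = 2x + 1 plus two gross outliers.
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40.0), (7, -5.0)]
model, inliers = ransac_line(pts)
```

Because the consensus step counts inliers rather than minimizing a global residual, the two outliers never drag the fitted line away from the ten collinear points.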
Appears in collections: UC - Dissertações de Mestrado
FCTUC Eng.Electrotécnica - Teses de Mestrado

Page views: 482 (checked 16 Jul 2024)
Downloads: 64 (checked 16 Jul 2024)


All records in the repository are protected by copyright law, with all rights reserved.