
You searched for subject:(long video analysis). Showing records 1 – 3 of 3 total matches.


1. Hua, Yang. Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning.

Degree: Docteur es, Mathématiques et Informatique, 2016, Grenoble Alpes

This dissertation addresses the problem of visual object tracking, the goal of which is to localize an object and determine its trajectory over time. In particular, we focus on challenging scenarios in which objects undergo significant deformations and occlusions, or leave the field of view. To this end, we propose two robust methods that learn a model for the object of interest and update it in order to reflect its changes over time. Our first method addresses the tracking problem in the case where objects undergo significant geometric transformations, such as rotation or a change in scale. We present a novel proposal-selection algorithm, which extends the traditional tracking-by-detection approach. This method proceeds in two stages: proposal, then selection. In the proposal stage, we build a set of candidates representing the potential locations of the object by robustly estimating geometric transformations. The best proposal is then selected from this candidate set to precisely localize the object, using appearance and motion cues. Second, we address the problem of model update in visual tracking, that is, determining when the target model needs to be updated, given that the target may become occluded or leave the field of view. To solve this, we use motion cues to automatically identify the state of an object and update the model only when the object is fully visible. In particular, we use long-term trajectories together with a graph-cut-based technique to estimate which parts of the object are visible. We evaluated both approaches extensively on several tracking benchmarks, in particular the recent online tracking benchmark and the visual object tracking challenge dataset. Both approaches compare favorably with the state of the art and show significant improvements over several other recent trackers. Our submission to the 2015 visual object tracking challenge also won one of its competitions.
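As a rough illustration of the proposal-selection idea described in this abstract, the sketch below generates translated and scaled candidate boxes around the previous estimate and keeps the candidate that best matches an appearance template. The function names, the candidate grid, and the correlation-based scoring are hypothetical simplifications and not the thesis implementation, which also estimates rotations robustly, combines appearance with motion cues, and updates the model only when graph-cut-based occlusion reasoning indicates the object is fully visible.

```python
# Minimal sketch of a two-stage proposal-selection tracker (hypothetical
# simplification; the thesis method uses a discriminative detector, robustly
# estimated geometric transformations, and motion cues).
import numpy as np

def make_proposals(box, scales=(0.9, 1.0, 1.1), shifts=(-8, 0, 8)):
    """Proposal stage: candidate boxes (x, y, w, h) around the previous
    estimate, varying translation and scale (the real method also handles rotation)."""
    x, y, w, h = box
    return [(x + dx, y + dy, w * s, h * s)
            for s in scales for dx in shifts for dy in shifts]

def appearance_score(frame, box, template):
    """Normalised cross-correlation between a candidate patch and a template,
    standing in for the learned appearance model."""
    x, y, w, h = (int(round(v)) for v in box)
    patch = frame[max(y, 0):y + h, max(x, 0):x + w]
    if patch.size == 0:
        return -np.inf
    patch = np.resize(patch, template.shape).astype(float)
    t = template.astype(float)
    return float(((patch - patch.mean()) * (t - t.mean())).mean()
                 / (patch.std() * t.std() + 1e-8))

def select_proposal(frame, prev_box, template):
    """Selection stage: keep the highest-scoring candidate."""
    proposals = make_proposals(prev_box)
    scores = [appearance_score(frame, b, template) for b in proposals]
    return proposals[int(np.argmax(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((240, 320))
    template = frame[60:100, 80:140].copy()           # object appearance at t-1
    drifted = (88, 52, 60, 40)                        # drifted previous estimate
    print(select_proposal(frame, drifted, template))  # recovers the box at (80, 60)
```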

Advisors/Committee Members: Schmid, Cordelia (thesis director), Alahari, Karteek (thesis director).

Subjects/Keywords: Suivi d'objet visuel; Suivi par détection; Suivi par proposition-Sélection; Suivi à long terme; Traitement des occlusions; Analyse vidéo; Visual object tracking; Tracking-By-Detection; Proposal-Selection tracking; Long-Term tracking; Occlusion reasoning; Video analysis; 510; 004


APA (6th Edition):

Hua, Y. (2016). Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning. (Doctoral Dissertation). Grenoble Alpes. Retrieved from http://www.theses.fr/2016GREAM012

Chicago Manual of Style (16th Edition):

Hua, Yang. “Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning.” 2016. Doctoral Dissertation, Grenoble Alpes. Accessed August 11, 2020. http://www.theses.fr/2016GREAM012.

MLA Handbook (7th Edition):

Hua, Yang. “Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning.” 2016. Web. 11 Aug 2020.

Vancouver:

Hua Y. Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning. [Internet] [Doctoral dissertation]. Grenoble Alpes; 2016. [cited 2020 Aug 11]. Available from: http://www.theses.fr/2016GREAM012.

Council of Science Editors:

Hua Y. Vers un suivi robuste d'objets visuels : sélection de propositions et traitement des occlusions : Towards robust visual object tracking : proposal selection and occlusion reasoning. [Doctoral Dissertation]. Grenoble Alpes; 2016. Available from: http://www.theses.fr/2016GREAM012

2. Matuszewski, Damian Janusz. Computer vision for continuous plankton monitoring.

Degree: Mestrado, Ciência da Computação, 2014, University of São Paulo

Plankton microorganisms constitute the base of the marine food web and play a major role in the drawdown of atmospheric carbon dioxide. Moreover, because they are very sensitive to environmental changes, they allow such changes to be noticed (and potentially counteracted) faster than by any other means. As such, they not only influence the fishery industry but are also frequently used to analyze changes in exploited coastal areas and the influence of these disturbances on the local environment and climate. Consequently, there is a strong need for highly efficient systems that allow long-term, large-volume observation of plankton communities. This would provide a better understanding of the role of plankton in the global climate and help maintain the fragile environmental equilibrium. The sensors used typically produce huge amounts of data that must be processed efficiently, without intensive manual work by specialists. A new system for general-purpose particle analysis in large volumes is presented. It has been designed and optimized for the continuous plankton monitoring problem; however, it can easily be applied as a versatile moving-fluid analysis tool or in any other application in which the targets to be detected and identified move in a unidirectional flux. The proposed system is composed of three stages: data acquisition, target detection, and target identification. Dedicated optical hardware is used to record images of small particles immersed in the water flux. Target detection is performed using a Visual Rhythm-based method, which greatly accelerates processing and allows higher volume throughput. The proposed method detects, counts, and measures organisms present in the water flux passing in front of the camera. Moreover, the developed software saves cropped plankton images, which not only greatly reduces the required storage space but also provides the input for automatic identification. To ensure maximal performance (up to 720 MB/s), the algorithm was implemented in CUDA for GPGPU. The method was tested on a large dataset and compared with an alternative frame-by-frame approach. The obtained plankton images were used to build a classifier that automatically identifies organisms in plankton analysis experiments. For this purpose, dedicated feature-extraction software was developed. Various subsets of 55 shape characteristics were tested with different off-the-shelf learning models. The best accuracy, approximately 92%, was obtained with Support Vector Machines; this result is comparable to average expert manual identification performance. This work was developed under joint supervision with Professor Rubens Lopes (IO-USP).
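The visual rhythm representation behind the detection stage lends itself to a short sketch: one pixel column is sampled from every frame and the columns are stacked into a single 2D image, so organisms crossing the sampling line become blobs that can be segmented in one pass instead of processing every full frame. The OpenCV capture loop, the background-subtraction threshold, and the video file name below are illustrative assumptions only; the thesis implementation is CUDA-accelerated and additionally measures and crops each detected organism for the SVM-based identification step.

```python
# Minimal CPU sketch of visual-rhythm-based detection (illustrative only; the
# thesis implementation is GPU-accelerated and includes organism measurement
# and cropping).
import cv2
import numpy as np

def visual_rhythm(video_path, column=None):
    """Stack one pixel column per frame into a 2D 'rhythm' image
    (rows = image height, columns = time)."""
    cap = cv2.VideoCapture(video_path)
    cols = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        c = gray.shape[1] // 2 if column is None else column
        cols.append(gray[:, c])
    cap.release()
    return np.stack(cols, axis=1) if cols else np.empty((0, 0), np.uint8)

def detect_particles(rhythm, thresh=30):
    """Segment dark particles against the brighter background of the rhythm
    image; each connected component is one crossing event."""
    background = np.median(rhythm, axis=1, keepdims=True)
    mask = (background - rhythm.astype(np.int16)) > thresh
    n, labels = cv2.connectedComponents(mask.astype(np.uint8))
    return n - 1, labels   # number of detected particles, label image

if __name__ == "__main__":
    rhythm = visual_rhythm("plankton_flux.avi")   # hypothetical file name
    if rhythm.size:
        count, _ = detect_particles(rhythm)
        print(f"particles crossing the sampling line: {count}")
```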

Advisors/Committee Members: Cesar Junior, Roberto Marcondes.

Subjects/Keywords: análise de vídeos longos; Big Data; detecção de plâncton; e-Science; long video analysis; marine environment monitoring; monitoramento de ambiente marinho; plankton detection; Ritmo Visual; visual rhythm


APA (6th Edition):

Matuszewski, D. J. (2014). Computer vision for continuous plankton monitoring. (Masters Thesis). University of São Paulo. Retrieved from http://www.teses.usp.br/teses/disponiveis/45/45134/tde-24042014-150825/

Chicago Manual of Style (16th Edition):

Matuszewski, Damian Janusz. “Computer vision for continuous plankton monitoring.” 2014. Masters Thesis, University of São Paulo. Accessed August 11, 2020. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-24042014-150825/.

MLA Handbook (7th Edition):

Matuszewski, Damian Janusz. “Computer vision for continuous plankton monitoring.” 2014. Web. 11 Aug 2020.

Vancouver:

Matuszewski DJ. Computer vision for continuous plankton monitoring. [Internet] [Masters thesis]. University of São Paulo; 2014. [cited 2020 Aug 11]. Available from: http://www.teses.usp.br/teses/disponiveis/45/45134/tde-24042014-150825/.

Council of Science Editors:

Matuszewski DJ. Computer vision for continuous plankton monitoring. [Masters Thesis]. University of São Paulo; 2014. Available from: http://www.teses.usp.br/teses/disponiveis/45/45134/tde-24042014-150825/

3. Evers, Aaron S. Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging.

Degree: MSEgr, Electrical Engineering, 2014, Wright State University

Due to the many advantages of passive radar and the ubiquity of commercial broadcast transmitters, interest in passive bistatic radar (PBR) applications has continued to grow. More specifically, studies of commercial orthogonal frequency division multiplexing (OFDM) waveforms for passive bistatic synthetic aperture radar (SAR) imaging have become more common. This work evaluates and applies long term evolution (LTE), digital video broadcast (DVB), and digital audio broadcast (DAB) signals of opportunity for passive bistatic SAR imaging. First, the implications of each signal of opportunity's waveform structure and properties are characterized by examining the waveform's self- and cross-ambiguity functions (AFs). In addition to deriving waveform properties, link budget analysis is performed, using pessimistic values intrinsic to LTE, DVB, and DAB transmissions, to predict the performance of potential passive bistatic SAR imaging scenarios. Small-scale passive bistatic SAR imaging experiments are carried out using signals structured similarly to LTE, DVB, and DAB signals, demonstrating the merits of the considered processing schemes for passive bistatic SAR image generation.

Advisors/Committee Members: Jackson, Julie (Advisor), Rigling, Brian (Advisor).
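The self- and cross-ambiguity functions mentioned in this abstract are, in discrete time, a bank of Doppler-compensated correlations between a surveillance signal and a reference signal. The sketch below evaluates such a delay/Doppler grid with FFT-based circular correlation; the OFDM-like test waveform, sample rate, and grid spacings are made-up illustrations and are not the LTE, DVB, or DAB signal models analysed in the thesis. The self-ambiguity function is the special case where the surveillance input is the reference itself.

```python
# Minimal sketch of a discrete cross-ambiguity function,
# chi(tau, fd) = sum_n s[n] * conj(r[n - tau]) * exp(-j*2*pi*fd*n/fs).
# The OFDM-like test waveform and the delay/Doppler grids are illustrative
# assumptions, not the signal models analysed in the thesis.
import numpy as np

def cross_ambiguity(s, r, fs, doppler_bins):
    """Return |chi| on a (Doppler, delay) grid via FFT-based circular correlation."""
    n = np.arange(len(s))
    R = np.conj(np.fft.fft(r))
    out = np.empty((len(doppler_bins), len(s)))
    for i, fd in enumerate(doppler_bins):
        # remove the hypothesised Doppler shift, then correlate with the reference
        compensated = s * np.exp(-2j * np.pi * fd * n / fs)
        out[i] = np.abs(np.fft.ifft(np.fft.fft(compensated) * R))
    return out

if __name__ == "__main__":
    fs = 1.0e6                                   # assumed 1 MHz sample rate
    n = np.arange(4096)
    rng = np.random.default_rng(1)
    # crude OFDM-like reference: 64 subcarriers carrying random QPSK symbols
    symbols = np.exp(1j * np.pi / 2 * rng.integers(0, 4, 64))
    ref = sum(a * np.exp(2j * np.pi * k * n / 4096) for k, a in enumerate(symbols, start=1))
    # simulated surveillance return: 100-sample delay and +500 Hz Doppler shift
    surv = np.roll(ref, 100) * np.exp(2j * np.pi * 500 * n / fs)
    amb = cross_ambiguity(surv, ref, fs, doppler_bins=np.arange(-1000, 1001, 100))
    dopp_idx, delay = np.unravel_index(np.argmax(amb), amb.shape)
    print("peak at", -1000 + 100 * dopp_idx, "Hz Doppler and", delay, "samples delay")
```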

Subjects/Keywords: Electrical Engineering; Passive Bistatic Radar; Ambiguity Function; Link Budget Analysis; Orthogonal Frequency Division Multiplexing; Long Term Evolution; Digital Video Broadcast; Digital Audio Broadcast



APA (6th Edition):

Evers, A. S. (2014). Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging. (Masters Thesis). Wright State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=wright1398768956

Chicago Manual of Style (16th Edition):

Evers, Aaron S. “Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging.” 2014. Masters Thesis, Wright State University. Accessed August 11, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=wright1398768956.

MLA Handbook (7th Edition):

Evers, Aaron S. “Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging.” 2014. Web. 11 Aug 2020.

Vancouver:

Evers AS. Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging. [Internet] [Masters thesis]. Wright State University; 2014. [cited 2020 Aug 11]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=wright1398768956.

Council of Science Editors:

Evers AS. Evaluation and Application of LTE, DVB, and DAB Signals of Opportunity for Passive Bistatic SAR Imaging. [Masters Thesis]. Wright State University; 2014. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=wright1398768956
