You searched for subject:(hard end point data)
Showing records 1 – 30 of 50928 total matches.

University of Victoria
1.
Alam, Shahid.
A Framework for Metamorphic Malware Analysis and Real-Time Detection.
Degree: Department of Computer Science, 2014, University of Victoria
URL: http://hdl.handle.net/1828/5576
Metamorphism is a technique that mutates binary code using different obfuscations. It is difficult to write new metamorphic malware, and in general malware writers reuse old malware. To evade detection, the writers of such new malware change its obfuscations (syntax) more than its behavior (semantics). Based on this assumption and motivation, this thesis presents a new framework named MARD for Metamorphic Malware Analysis and Real-Time Detection. We also introduce a new intermediate language named MAIL (Malware Analysis Intermediate Language). Each MAIL statement is assigned a pattern that can be used to annotate a control flow graph for pattern matching to analyse and detect metamorphic malware. MARD uses MAIL to achieve platform independence, automation and optimizations for metamorphic malware analysis and detection. As part of the new framework, to build a behavioral signature and detect metamorphic malware in real time, we propose two novel techniques, named ACFG (Annotated Control Flow Graph) and SWOD-CFWeight (Sliding Window of Difference and Control Flow Weight). Unlike other techniques, ACFG provides faster matching of CFGs without compromising detection accuracy; it can handle malware with smaller CFGs, and it contains more information, and hence provides more accuracy, than a plain CFG. SWOD-CFWeight mitigates and addresses key issues in current techniques related to changes in opcode frequencies, such as the use of different compilers, compiler optimizations, operating systems and obfuscations. The size of the SWOD can change, which gives anti-malware tool developers the ability to select appropriate parameter values to further optimize malware detection. CFWeight captures the control flow semantics of a program to an extent that helps detect metamorphic malware in real time. Experimental evaluation of the two proposed techniques, using an existing dataset, achieved detection rates in the range 94% – 99.6% and false positive rates in the range 0.93% – 12.44%. Compared to ACFG, SWOD-CFWeight significantly improves detection time, and it is suitable where the time for malware detection matters most, as in real-time (practical) anti-malware applications.
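The sliding-window-of-difference idea can be illustrated with a toy sketch. Everything below (opcode names, window size, and the absolute-difference metric) is an illustrative assumption, not taken from the thesis:

```python
from collections import Counter

def opcode_frequencies(opcodes, window):
    """Opcode-frequency histogram for each sliding window."""
    return [Counter(opcodes[i:i + window])
            for i in range(len(opcodes) - window + 1)]

def window_difference(sample, reference, window=4):
    """Smallest total opcode-frequency difference between any window of
    `sample` and any window of `reference` (lower = more similar)."""
    best = float("inf")
    for ws in opcode_frequencies(sample, window):
        for wr in opcode_frequencies(reference, window):
            keys = set(ws) | set(wr)
            diff = sum(abs(ws[k] - wr[k]) for k in keys)
            best = min(best, diff)
    return best

sig = ["mov", "xor", "jmp", "mov", "call", "xor"]       # signature opcodes
variant = ["nop", "mov", "xor", "jmp", "mov", "ret"]    # obfuscated variant
print(window_difference(variant, sig, window=4))
```

A score of zero means some window of the variant's opcode-frequency profile exactly matches a window of the signature, so padding with junk instructions outside the window, or reordering inside it, does not hide the match.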
Advisors/Committee Members: Horspool, R. Nigel (supervisor), Traore, Issa (supervisor).
Subjects/Keywords: End point security; Malware detection; Metamorphic malware; Control flow analysis; Heuristics; Data mining; Window of difference

University of Otago
2.
Roberts, Dax.
Data Remanence In New Zealand.
Degree: 2013, University of Otago
URL: http://hdl.handle.net/10523/3768
International research has shown that individuals and companies in other countries do not always fully remove the data from their computer data storage devices before disposing of them. Typically this means that when people dispose of their computer hard drives, there is a wealth of personal or corporate information that can be exploited to commit crimes such as identity theft, fraud, stalking and blackmail. A further literature review showed that no such “data remanence” research for hard drives (or any other data storage devices such as mobile phones, USB thumb drives and the like) had been conducted in New Zealand. The methodologies of all relevant hard-drive data remanence experiments were compared and then used to design the most appropriate methodology for this research. 100 second-hand hard drives were then sourced nationally across New Zealand for the experiments of this research, to determine the baseline of data remanence for hard drives in New Zealand. The results of the experiments were then compared with international results to determine how New Zealand compares and what, if any, further actions (such as education) should be taken.
Advisors/Committee Members: Wolfe, H (advisor).
Subjects/Keywords: Data; Remanence; Hard Drives; Data Remanence

Texas A&M University
3.
Hafley, Brian Scott.
Development of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork.
Degree: PhD, Food Science and Technology, 2007, Texas A&M University
URL: http://hdl.handle.net/1969.1/4802
Four proteins exhibiting different rates of denaturation or precipitation with increasing cooking temperature, from 63 to 73°C for beef and 67 to 79°C for pork, were selected for developing a ratio model and incorporating the results into a mathematical expression. Monoclonal antibodies (Mabs) against lactate dehydrogenase isozyme 5 (LDH-5), bovine serum albumin (BSA), porcine enolase, and bovine myoglobin were developed for use in a sandwich enzyme-linked immunosorbent assay (ELISA) to simultaneously investigate changes in protein concentration with incremental increases in temperature.
Four groups of mice were immunized separately with commercially available or purified protein (LDH-5, BSA, enolase, or myoglobin). After ample blood serum titers were reported, spleen cells were harvested and fused with SP2 myeloma tumor cells using an electrofusion cell manipulator. Hybridoma-containing wells were screened against their respective protein to isolate hybridomas secreting protein-specific Mabs. Mabs produced in tissue culture flasks were used initially in sandwich ELISA testing. Mabs were tested against ground beef and pork cooked to instantaneous end-point temperatures (EPTs). A 6 g section removed from the geometric center of each sample was homogenized in phosphate buffer and centrifuged, and a 1 ml aliquot was collected for analysis.
Microtiter plates were coated with goat anti-mouse IgG antibody (2 mg/ml) to act as a capture antibody for the protein-specific monoclonal antibody concentrated from cell culture supernatant. Serially diluted muscle (beef or pork) extract (10 ml) from each EPT was applied to a microtiter plate. A protein A/G-purified polyclonal antibody (Pab) was applied, followed by a goat anti-rabbit IgG peroxidase-conjugated antibody. Concentration was determined by comparison to a standard curve.
After multiple cell fusions, 24, 29, 66, and 12 cell lines secreting protein-specific Mabs against LDH-5, BSA, enolase, and myoglobin, respectively, were produced. Six Mabs against LDH-5 reported R² values > 0.9, indicating high specificity and affinity for LDH-5. Development of sandwich ELISA assays with Mabs against BSA, enolase, and myoglobin was not as successful. Mabs against BSA, enolase, and myoglobin produced in mouse ascites were also unsuccessful when used in a sandwich ELISA. However, preliminary data suggested that a multiple-antigen ratio model still remained a viable option.
Advisors/Committee Members: Keeton, Jimmy T. (advisor), Berghman, Luc R. (committee member), Miller, Rhonda K. (committee member), Rooney, Lloyd W. (committee member).
Subjects/Keywords: End-point Temperature

Brno University of Technology
4.
Bečička, Martin.
Analýza dat získaných z genotypizačních esejí z real-time PCR: Analysis of data obtained from genotyping essays from the real-time PCR.
Degree: 2018, Brno University of Technology
URL: http://hdl.handle.net/11012/81823
This bachelor thesis is concerned with the visualization of real-time PCR data in MATLAB. The theoretical part of the thesis provides an introduction to PCR and real-time PCR, describes the tools used for the evaluation of data acquired from real-time PCR, and describes the international format RDML used for the storage of such data. The practical part describes the developed graphical interface.
Advisors/Committee Members: Sekora, Jiří (advisor), Čmiel, Vratislav (referee).
Subjects/Keywords: Real-time PCR; GUI; end-point analýza; Matlab; RDML; Real-time PCR; GUI; end-point analysis; Matlab; RDML

Delft University of Technology
5.
Hemmes, Tom (author); Li, Weiran (author); van der Maaden, Jippe (author); Olsen, Brenda (author).
A vario-scale approach that improves integration of point clouds with different point densities.
Degree: 2017, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:e89cabbd-a5b5-4cc3-8699-bd751aeb3d97
Point clouds are becoming one of the most common ways to represent geographical data. The scale of acquisition of point clouds is growing steadily. However, point clouds are often very large in storage size and require computationally intensive operations. The integration of point clouds still faces many challenges. This project focuses on one of them: integrating point clouds of different scales and granularity. Solving this challenge enables appealing visualisation, usability at both low and high computation power, and geometrical consistency for analysis. The following question is researched: 'To what extent can a vario-scale approach improve integration of point clouds with varying point densities?'. A data model is created that uses importance as an additional dimension. This dimension contains an importance value which is calculated using two methods: first, random assignment of values to the points, and second, exactly computed values, where each point's smallest distance to its nearest neighbour is assigned as its importance value. A web application shows the results. Both the random and exact methods show an exponential decay in the distribution of the importance value. Though the random methods run much faster, the exact methods preserve many more edges and other details.
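The exact importance computation described above can be sketched in a few lines. This is a naive O(n²) illustration under the assumption that sparse, isolated points should survive thinning; it does not reproduce the thesis's data model or web application:

```python
import math

def importance_values(points):
    """Assign each point the distance to its nearest neighbour.
    Points in sparse regions get large values (kept at coarse scales);
    points in dense regions get small values (dropped first)."""
    result = []
    for i, p in enumerate(points):
        nearest = min(math.dist(p, q)
                      for j, q in enumerate(points) if j != i)
        result.append(nearest)
    return result

cloud = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
imp = importance_values(cloud)
# index of the most important (most isolated) point
print(max(range(len(imp)), key=imp.__getitem__))
```

With importance as an extra dimension, a viewer can then select all points above an importance threshold to get a consistent coarser version of the integrated cloud.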
GRIND - Geomatics Synthesis Project on Point Clouds
Geomatics
Advisors/Committee Members: van der Spek, Stefan (mentor), van Oosterom, P.J.M. (mentor), Meijers, Martijn (mentor), Tijssen, Theo (mentor), Delft University of Technology (degree granting institution).
Subjects/Keywords: geomatics synthesis project; point clouds; point cloud data; point cloud integration

University of Saskatchewan
6.
Fransoo, Stephen 1982-.
Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/8298
In 2015, the Agriculture for Growth Act (C-18) came into effect in Canada. This Act modernized plant breeding by including amendments that aligned it with the 1991 International Convention for the Protection of New Plant Varieties (UPOV91) (CFIA, 2017). Regulations within the Act grant plant breeders the right to charge an end-point royalty (EPR) on harvested grain. This thesis assesses how provenance and framing influence pulse producers' seed choice decisions. The study created a prospect theory behavioral experiment to answer this question. It concluded that producers are not overly influenced by provenance and framing and instead make decisions based on the expected utility model, except when questions are manipulated by both EPR and negative framing. The study also concluded that most producers (56%) are willing to tolerate a level of risk. This provided a way to profile producers by risk tolerance, and the study found many similarities and a few minor differences between those that are always risk-seeking, always risk-averse, and occasionally risk-seeking.
Advisors/Committee Members: Phillips, Peter W.B.B, Agblor, Kofi, Smyth, Stuart, Feist, Gina.
Subjects/Keywords: Lentils; UPOV 91; End Point Royalty; Decision Making; Risk

Brno University of Technology
7.
Batelková, Andrea.
Řízení firemních dat a návrh jejich zálohování: Corporate Data Management and Backup Design.
Degree: 2019, Brno University of Technology
URL: http://hdl.handle.net/11012/7856
This bachelor thesis deals with the current process of corporate data management and a central data backup design at STAVOPROGRES BRNO spol. s r.o. The thesis offers options for changes to make the process more optimal, and points to a central data backup solution in which all created data will be backed up automatically. The thesis also describes theoretical resources and other options for managing data inside the company.
Advisors/Committee Members: Kříž, Jiří (advisor), Ing.Jiří Hradil (referee).
Subjects/Keywords: Zálohování; archivace; data; pevný disk; email.; Backup; archiving; data; hard drive; email.
8.
Tammana, Praveen Aravind Babu.
Software-defined datacenter network debugging.
Degree: PhD, 2018, University of Edinburgh
URL: http://hdl.handle.net/1842/31326
Software-defined Networking (SDN) enables flexible network management, but as networks evolve to a large number of end-points with diverse network policies, higher speed, and higher utilization, the abstraction of networks by SDN makes monitoring and debugging network problems increasingly hard and challenging. While some problems impact packet processing in the data plane (e.g., congestion), some cause policy deployment failures (e.g., hardware bugs); both create inconsistency between operator intent and actual network behavior. Existing debugging tools are not sufficient to accurately detect, localize, and understand the root cause of problems observed in large-scale networks; they either lack in-network resources (compute, memory, and/or network bandwidth) or take a long time to debug network problems. This thesis presents three debugging tools: PathDump, SwitchPointer, and Scout, and a technique for tracing packet trajectories called CherryPick. We call for a different approach to network monitoring and debugging: in contrast to implementing debugging functionality entirely in-network, we should carefully partition the debugging tasks between end-hosts and network elements. Towards this direction, we present CherryPick, PathDump, and SwitchPointer. The core of CherryPick is to cherry-pick the links that are key to representing an end-to-end path of a packet, and to embed the picked link IDs into the packet's header on its way to the destination. PathDump is an end-host-based network debugger based on tracing packet trajectories, and it exploits resources at the end-hosts to implement various monitoring and debugging functionalities. PathDump currently runs over a real network comprising only commodity hardware, and yet it can support a surprisingly large class of network debugging problems with minimal in-network functionality.
The key contribution of SwitchPointer is to efficiently provide network visibility to end-host-based network debuggers like PathDump by using switch memory as a "directory service": each switch, rather than storing the telemetry data necessary for debugging functionalities, stores pointers to the end hosts where the relevant telemetry data is stored. The design choice of treating switch memory as a directory service solves performance problems that were hard or infeasible with existing designs. Finally, we present and solve a network policy fault localization problem that arises when operating policy management frameworks for a production network. We develop Scout, a fully automated system that localizes faults in a large-scale policy deployment and further pinpoints the physical-level failures that are the most likely cause of the observed faults.
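The "directory service" idea can be sketched with plain dictionaries. The data structures and names below are hypothetical illustrations of the pointer-indirection pattern, not SwitchPointer's actual implementation:

```python
# Each "switch" stores only pointers (host IDs) per flow; the telemetry
# records themselves live at the end hosts that received the packets.
switch_directory = {          # switch_id -> {flow_id: set of host_ids}
    "s1": {"flowA": {"h1"}, "flowB": {"h2"}},
    "s2": {"flowA": {"h1", "h3"}},
}
host_telemetry = {            # host_id -> {flow_id: per-packet records}
    "h1": {"flowA": ["pkt1@s1", "pkt1@s2"]},
    "h2": {"flowB": ["pkt9@s1"]},
    "h3": {"flowA": ["pkt2@s2"]},
}

def collect_telemetry(flow_id):
    """Debug a flow by resolving switch pointers, then fetching the
    actual telemetry from the referenced end hosts."""
    hosts = set()
    for pointers in switch_directory.values():
        hosts |= pointers.get(flow_id, set())
    return sorted(record
                  for h in hosts
                  for record in host_telemetry[h].get(flow_id, []))

print(collect_telemetry("flowA"))
```

The switches only ever store small, fixed-size pointer sets, while the bulky per-packet telemetry scales out across the end hosts; a debugger pays the cost of one extra indirection per query.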
Subjects/Keywords: 004.6; automated debugging tools; data center networks; debugging; Software-defined Networking; SDN; PathDump; SwitchPointer; Scout; CherryPick; end-to-end; end-host
9.
Holmberg, Jonas.
OFFLINE SCHEDULING OF TASK SETS WITH COMPLEX END-TO-END DELAY CONSTRAINTS.
Degree: Design and Engineering, 2017, Mälardalen University
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35694
Software systems in the automotive domain are generally safety-critical and subject to strict timing requirements. Systems of this character are often constructed utilizing periodically executed tasks that have a hard deadline. In addition, these systems may have additional deadlines specified on cause-effect chains, or simply task chains. The chains are defined by existing tasks in the system; hence they are not stand-alone additions to the system. Each chain provides an end-to-end timing constraint targeting the propagation of data through the chain of tasks. These constraints specify the additional timing requirements that need to be fulfilled when searching for a valid schedule. In this thesis, an offline non-preemptive scheduling method designed for single-core systems is presented. The scheduling problem is defined and formulated utilizing Constraint Programming. In addition, to ensure that end-to-end timing requirements are met, job-level dependencies are considered during schedule generation. This approach can guarantee that individual task periods along with end-to-end timing requirements are always met, if a schedule exists. The results show a good increase in schedulability ratio when utilizing job-level dependencies compared to the case where job-level dependencies are not specified. When system utilization increases, this improvement is even greater. Depending on the system size and complexity the improvement can vary, but in many cases it is more than double. Schedule generation is also performed within a reasonable time frame, which is a benefit during the development process, since it allows fast verification when changes are made to the system. Further, the thesis provides an overview of the entire process, starting from a system model and ending at a fully functional schedule executing on a hardware platform.
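The end-to-end data-propagation delay along a task chain can be sketched as follows. This is a simplified single-core model with illustrative task names and a direct computation over a fixed schedule; the thesis instead encodes such constraints in a Constraint Programming formulation that searches for the schedule:

```python
def chain_latency(schedule, chain):
    """End-to-end data-propagation delay through a task chain in a
    static non-preemptive schedule. `schedule` maps each task to a
    sorted list of (start, finish) job windows; data written when a
    job finishes is read by the first job of the next task in the
    chain that starts no earlier than that finish time."""
    start, ready = schedule[chain[0]][0]   # first job of the first task
    for task in chain[1:]:
        # first consumer job that can see the freshly produced data
        job = next((s, f) for (s, f) in schedule[task] if s >= ready)
        ready = job[1]
    return ready - start

schedule = {
    "sense":   [(0, 2), (10, 12)],
    "control": [(3, 5), (13, 15)],
    "actuate": [(6, 7), (16, 17)],
}
print(chain_latency(schedule, ["sense", "control", "actuate"]))
```

A scheduler enforcing an end-to-end constraint would reject any candidate schedule whose chain latency exceeds the specified bound; fixing job-level dependencies (which producer job feeds which consumer job) prunes the search to schedules where this delay is met by construction.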
Subjects/Keywords: Embedded Systems; Offline Scheduling; End-to-End Delay; Constraint Programming; Job-level Dependencies; Automotive; Hard real-time systems; Embedded Systems; Inbäddad systemteknik
APA (6th Edition):
Holmberg, J. (2017). OFFLINE SCHEDULING OF TASK SETS WITH COMPLEX END-TO-END DELAY CONSTRAINTS. (Thesis). Mälardalen University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35694

University of Cambridge
10.
Lu, Ruodan.
Automated Generation of Geometric Digital Twins of Existing Reinforced Concrete Bridges.
Degree: PhD, 2019, University of Cambridge
URL: https://www.repository.cam.ac.uk/handle/1810/289430
The cost and effort of modelling existing bridges from point clouds currently outweighs the perceived benefits of the resulting model. The time required for generating a geometric Bridge Information Model, a holistic data model which has recently become known as a "Digital Twin", of an existing bridge from Point Cloud Data is roughly ten times greater than laser scanning it. There is a pressing need to automate this process. This is particularly true for the highway infrastructure sector because Bridge Digital Twin Generation is an efficient means for documenting bridge condition data. Based on a two-year inspection cycle, there is a need for at least 315,000 bridge inspections per annum across the United States and the United Kingdom. This explains why there is a huge market demand for less labour-intensive bridge documentation techniques that can efficiently boost bridge management productivity.
Previous research has achieved the automatic generation of surface primitives combined with rule-based classification to create labelled cuboids and cylinders from point clouds. While existing methods work well on synthetic datasets or simplified cases, they encounter huge challenges when dealing with real-world bridge point clouds, which are often unevenly distributed and suffer from occlusions. In addition, real bridge topology is much more complicated than idealized cases. Real bridge geometries are defined with curved horizontal alignments and varying vertical elevations and cross-sections. These characteristics increase the modelling difficulty, which is why none of the existing methods can handle them reliably.
The objective of this PhD research is to devise, implement, and benchmark a novel framework that can reasonably generate labelled geometric object models of constructed bridges comprising concrete elements in an established data format (i.e. Industry Foundation Classes). This objective is achieved by answering the following research questions: (1) how to effectively detect reinforced concrete bridge components in Point Cloud Data? And (2) how to effectively fit 3D solid models in the format of Industry Foundation Classes to the detected point clusters?
The proposed framework employs bridge engineering knowledge that mimics the intelligence of human modellers to detect and model reinforced concrete bridge objects in point clouds. This framework directly extracts structural bridge components and then models them without generating low-level shape primitives. Experimental results suggest that the proposed framework can perform quickly and reliably on complex and incomplete real-world bridge point clouds that contain occlusions and unevenly distributed points. The results of experiments on ten real-world bridge point clouds indicate that the framework achieves an overall micro-average detection F1-score of 98.4%, an average modelling accuracy (mean cloud-to-cloud distance, C2C_Auto) of 7.05 cm, and an average modelling time of merely 37.8 seconds. Compared to the laborious and time-consuming manual practice, the proposed framework can…
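For readers unfamiliar with the metric, a micro-averaged F1-score pools true positives, false positives and false negatives across all classes before computing precision and recall. A small sketch with hypothetical per-class detection counts (the class names and numbers below are illustrative, not the study's):

```python
# Hypothetical (TP, FP, FN) counts for four bridge-component classes.
counts = {"slab": (950, 10, 20), "pier": (400, 8, 5),
          "pier_cap": (300, 4, 6), "girder": (700, 12, 9)}

# Micro-averaging: sum the counts over classes first, then compute the score.
tp = sum(c[0] for c in counts.values())
fp = sum(c[1] for c in counts.values())
fn = sum(c[2] for c in counts.values())

precision = tp / (tp + fp)
recall = tp / (tp + fn)
micro_f1 = 2 * precision * recall / (precision + recall)
print(f"micro-average F1 = {micro_f1:.3f}")
```

This is equivalent to 2·TP / (2·TP + FP + FN) over the pooled counts, which is why frequent classes dominate the micro average (unlike the macro average, which weights classes equally).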
Subjects/Keywords: Digital Twin; Bridge; Point Cloud Data; IFC
APA (6th Edition):
Lu, R. (2019). Automated Generation of Geometric Digital Twins of Existing Reinforced Concrete Bridges. (Doctoral Dissertation). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/289430

Delft University of Technology
11.
van Dongen, Kirsten (author).
Random Forest Classification of three different species of trees in Delft, based on AHN point clouds: Additional Thesis.
Degree: 2019, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:70b0b406-b247-4212-8e66-02534935b815
Trees are an important aspect of the world around us and play a significant role in our daily lives, contributing to human health and well-being in various ways. Tree inventory and monitoring are of great interest for biomass estimation and for tracking changes in the purifying effect of trees on the air. Checking every tree in and around a city or town is very time consuming and cost inefficient, so further research into the use of AHN data is required. Together with the "tree information data set" from the municipality of Delft, the locations and corresponding point clouds of three different species of trees are selected. For the species of interest, Aesculus Hippocastanum, Acer Saccharinum and Platanus x Hispanica, different characteristics are determined. In this research the following characteristics are estimated: Height, Trunk Height, Normalized Trunk Height, Canopy Projected Area, Normalized Canopy Projected Area, Ratio of Diameters, Normalized Ratio of Diameters, Centre of Gravity and, lastly, Normalized Centre of Gravity. These characteristics are used as features for Random Forest Classification, and the Confusion Matrix is used as the performance measure. The results of a test of 30 point clouds per species of interest show that Random Forest Classification is able to classify individual trees. However, these three species cannot be sufficiently classified using clustering.
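The pipeline described above (per-tree features → Random Forest → confusion matrix) can be sketched with scikit-learn. The feature values here are synthetic stand-ins, not the AHN-derived characteristics; only the overall shape of the workflow matches the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 30 synthetic "trees" per species, 6 numeric features each (standing in for
# height, trunk height, canopy area, etc.); each species is drawn around a
# different mean so the classes are separable.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(30, 6))
               for m in (5.0, 10.0, 15.0)])
y = np.repeat(np.arange(3), 30)   # 0, 1, 2 = the three species

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
cm = confusion_matrix(y_te, clf.predict(X_te))
print(cm)   # rows: true species, columns: predicted species
```

With real point-cloud features the classes overlap far more than in this toy setup, which is where the off-diagonal entries of the confusion matrix become informative.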
Advisors/Committee Members: Lindenbergh, Roderik (mentor), Nan, Liangliang (graduation committee), Delft University of Technology (degree granting institution).
Subjects/Keywords: AHN; tree classification; laser point data
APA (6th Edition):
van Dongen, K. (2019). Random Forest Classification of three different species of trees in Delft, based on AHN point clouds: Additional Thesis. (Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:70b0b406-b247-4212-8e66-02534935b815

University of New South Wales
12.
Lee, Chung Tong.
Fixed point theorems in mathematical models for data aggregation.
Degree: Computer Science & Engineering, 2011, University of New South Wales
URL: http://handle.unsw.edu.au/1959.4/50302 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:9183/SOURCE02?view=true
Model construction requires selecting and identifying relevant aspects of a situation in the real world. Describing these aspects and the relations between them by a system of equations essentially constructs a mathematical model. Functional definitions are very useful in describing relations. They provide additional imperative information for computation, compared to their implicit counterparts. However, it is rarely the case that all entities have a one-way dependency on others. Rather, entities in a system interact with one another and display interdependencies. When translated to mathematical models with functional definitions, the equation systems may contain circular definitions. This thesis demonstrates how to apply fixed point theorems to mathematical models when the relations between entities involve circular definitions. Fixed point solutions are computed via iteration. As a simplified example, suppose that the relations between two variables x and y can be described by functions f and g such that x = f(y) and y = g(x). Then the set of fixed points of the composite function f ∘ g is the solution for x, i.e., x = f(g(x)). In this thesis, formulations of this type have been applied to different problem domains which are commonly found in the Internet environment. These include rating aggregation, voting, reputation and trust, and information retrieval. In the simulation for rating aggregation, the quality of an assessor depends on the discrepancy between the ratings he gives and the final ratings. On the other hand, the final rating is defined as a weighted average of ratings given by different assessors, using assessor qualities as the weights. This model shows robustness against random attacks and collusion. The voting study in this dissertation involved real-life data from the MSN Q&A service. Voter quality is defined to capture the agreement between voters, again a circular definition. Existence of a solution is asserted by Brouwer's Fixed Point Theorem. This new voting system shows advantages over simple majority vote counting, being more robust against random attacks and showing identification hints for ballot stuffing. Using intuitively self-evident axioms on the trust building process, the method of a weighted quasi-arithmetic average is proved to be adequate to serve as a mathematical model for trust. Further, reputation is defined as an aggregation of trust over a community. The transaction properties and the reputation of the rating agents are used as the weight factors for the aggregation. This is essentially a circular definition. Solution existence is guaranteed when a suitable weighting function is chosen. Topic difficulty and system retrieval performance exhibit a negative reinforcement relationship which is an excellent example of a circular definition. Using an estimation accuracy interpretation, the mathematical model with fixed point solution in this thesis gives a more natural result on TREC data than that from the HITS algorithm with eigenvector solutions. Finally, this dissertation…
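The circular rating-aggregation model described above can be approximated by plain fixed-point iteration: start from the unweighted means, then alternate between updating assessor qualities from discrepancies and updating the final ratings as a quality-weighted average. The specific discrepancy and quality formulas below are assumed for illustration; the thesis's exact definitions may differ.

```python
# Ratings on a 1-5 scale: two honest assessors and one outlier, two items.
R = [[4.0, 5.0],
     [4.0, 5.0],
     [1.0, 1.0]]

final = [sum(col) / len(col) for col in zip(*R)]   # start from plain means
for _ in range(100):
    # Assessor quality falls with mean absolute discrepancy from the final
    # ratings (assumed form: q = 1 / (1 + mean discrepancy)).
    quality = [1.0 / (1.0 + sum(abs(r - f) for r, f in zip(row, final))
                      / len(final))
               for row in R]
    # Final ratings are the quality-weighted average -- a circular definition,
    # so we iterate the composed map toward its fixed point.
    new = [sum(q * row[i] for q, row in zip(quality, R)) / sum(quality)
           for i in range(len(final))]
    if max(abs(a - b) for a, b in zip(new, final)) < 1e-10:
        final = new
        break
    final = new
print(final)
```

Because the honest majority agrees with itself, its quality weights grow across iterations and the fixed point lands closer to the honest ratings than the plain mean does, which is the robustness-to-outliers behaviour the abstract describes.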
Advisors/Committee Members: Ignjatovic, Aleksandar, Computer Science & Engineering, Faculty of Engineering, UNSW, Martin, Eric, Computer Science & Engineering, Faculty of Engineering, UNSW.
Subjects/Keywords: Data Aggregation; Fixed Point Theorem; Mathematical Model
APA (6th Edition):
Lee, C. T. (2011). Fixed point theorems in mathematical models for data aggregation. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/50302 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:9183/SOURCE02?view=true

University of Adelaide
13.
Arnold, Anne Jillian.
A game-theoretic approach to modelling crop royalties.
Degree: 2015, University of Adelaide
URL: http://hdl.handle.net/2440/95230
Plant variety rights assist crop breeders to appropriate returns from new varieties and incentivise varietal improvement. Royalties are one form of plant variety rights, and this dissertation asks which combination of the available royalty instruments is best from the perspective of consumers, farmers, crop breeders, and the overall economy. We use a game-theoretic approach to model strategic interactions between breeders and farmers. The model allows farmer privilege, whereby farmers save seed one year to plant in the future, and we show that a point-of-sale royalty with either or both of the remaining royalties is optimal, whether or not we allow the possibility of farmers under-paying royalties through under-declaring output or saved seed. We also develop a Principal–Agent model, in which risk-neutral breeders share risk with risk-averse farmers. In this model, the optimal royalty depends on various parameters, including the costs of compliance and enforcement.
Advisors/Committee Members: Bayer, Ralph-Christopher (advisor), Binenbaum, Eran (advisor), Anderson, Kym (advisor), Wong, Jacob (advisor), School of Economics (school).
Subjects/Keywords: game-theory; economic model; end-point royalty; point-of-sale royalty; saved seed; farmer privilege; principal–agent model
APA (6th Edition):
Arnold, A. J. (2015). A game-theoretic approach to modelling crop royalties. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/95230

Brno University of Technology
14.
Kugler, Petr.
Zálohování dat firmy: Data Backup Companies.
Degree: 2015, Brno University of Technology
URL: http://hdl.handle.net/11012/36996
This work introduces the process of data backup, explains the concept of data storage, and analyses and proposes improvements to data backup at the company Xella CZ, s.r.o.
Advisors/Committee Members: Kříž, Jiří (advisor), Doležal, Pavel (referee).
Subjects/Keywords: Zálohování; zálohování dat; datová úložiště; data; pevný disk; archivace; Backup; data backup; data storage; data; hard disk; archiving
APA (6th Edition):
Kugler, P. (2015). Zálohování dat firmy: Data Backup Companies. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/36996

Brno University of Technology
15.
Kugler, Petr.
Zálohování dat firmy: Data Backup Companies.
Degree: 2019, Brno University of Technology
URL: http://hdl.handle.net/11012/33767
This bachelor's thesis introduces the process of data backup, explains the concept of data storage, and analyses and proposes improvements to data backup at the company Xella CZ, s.r.o.
Advisors/Committee Members: Kříž, Jiří (advisor), Doležal, Pavel (referee).
Subjects/Keywords: Zálohování; zálohování dat; datová úložiště; data; pevný disk; archivace; Backup; data backup; data storage; data; hard disk; archiving
APA (6th Edition):
Kugler, P. (2019). Zálohování dat firmy: Data Backup Companies. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/33767

University of Edinburgh
16.
Stewart, Hannah J.
Objective measurement of imitation problems in autism.
Degree: 2011, University of Edinburgh
URL: http://hdl.handle.net/1842/6066
Imitation is a complex behaviour used to allow faster learning of skills, including pivotal social cognitive processes such as language and gesture. Difficulties in imitating others have been broadly found within Autistic Spectrum Disorder (ASD) populations. This paper discusses two possible theories explaining these deficits: self-other mapping theory, whereby imitation deficits in ASD are proposed to restrict the ability to map relationships between social representations of others and the self; and self-other comparison theory, whereby the individual must distinguish similarities and differences between themselves and the other, which are then related to emotional and contextual differences learnt through experience in order to provide emotional context.
Whilst imitation difficulties have been widely reported, such difficulties have so far been recorded subjectively. This paper, however, approaches this well-known phenomenon objectively through the use of a clinical-kinematics assessment tool (C-Kat). Furthermore, this paper discusses different ways to precisely describe the imitative act to be copied and the performance of the imitator.
This paper aimed to objectively investigate whether, compared to typically developing peers, an imitative deficit was present in ASD adolescents (ASD n = 16; TD n = 24). Secondly, it aimed to determine whether such a deficit existed only for bodily imitation. Results showed a clear group difference and suggested a developmental delay in imitation ability within ASD rather than a deficit. Furthermore, results suggested a specific ASD difficulty in bodily imitation. However, following comparison of imitation stimuli and measures, the possibility is discussed that these results may be due to focusing on elements other than the critical movement information within the action to be imitated.
Advisors/Committee Members: Williams, Justin J H G, McIntosh, Rob.
Subjects/Keywords: autism; imitation; kinematics; adolescents; motor control; movement end-point re-enactment; action imitation; bodily imitation
APA (6th Edition):
Stewart, H. J. (2011). Objective measurement of imitation problems in autism. (Thesis). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/6066

University of Saskatchewan
17.
Bolek, Katarzyna.
Public, Producer, Private Partnerships and EPR systems in Australian Wheat Breeding.
Degree: 2015, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2015-03-1991
Australia has a crop research system with higher research intensity than exists internationally. Motivated to improve R&D policy in Canada, this dissertation focuses on the Australian End Point Royalty (EPR) system for wheat and addresses four principal questions: (1) How was the Australian system created and how does it work? (2) How do public, producer and private ownership of breeding programs affect the pricing of varieties? (3) How do EPR rates affect wheat variety adoption? (4) Finally, how would uniform EPR rates, similar to those used in France, affect variety selection, total production and revenue if used in the Australian market? In addressing the first question I use existing literature and interviews with prominent personnel in the Australian wheat breeding system, including management of InterGrain, AGT, DAFWA, GRDC and others. Interviews were conducted during field study in Australia in 2011. In addressing the second question I employ a horizontal location model to analyze three game-theoretic scenarios of a two-firm oligopoly market with private, public and producer-owned breeding companies. The results show that public or producer ownership of one of the wheat breeding programs reduces the price level relative to private-only ownership. I derive a novel result showing that when competing with private firms who must price above marginal cost, the public firm should also price above marginal cost in order to maximize total industry surplus. In addressing the third question I develop and estimate an econometric wheat variety adoption model for Western Australia. I find EPR rates have a negative, inelastic, statistically significant impact on the adoption of varieties.
Finally, in addressing the last question, I use the econometric model to simulate the adoption of Australian wheat varieties under a counterfactual of revenue-neutral uniform EPR rates. Uniform EPR rates speed up both the adoption and dis-adoption of varieties, thereby increasing weighted average yield and total production. The increase in the value of production exceeds breeders' revenue under varying EPR rates, suggesting a uniform EPR system may be an attractive alternative to varying EPR rates.
Advisors/Committee Members: Gray, Richard S., Fulton, Murray, Gilchrist, Donald, Micheels, Eric.
Subjects/Keywords: Wheat Breeding; End Point Royalties; EPR; Partnerships; Australian system; Funding R&D
APA (6th Edition):
Bolek, K. (2015). Public, Producer, Private Partnerships and EPR systems in Australian Wheat Breeding. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2015-03-1991

University of Manchester
18.
Loftus, John Paul Matthew.
On The Development of Control Systems Technology for
Fermentation Processes.
Degree: 2016, University of Manchester
URL: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:306385
▼ Fermentation processes play an integral role in the manufacture of pharmaceutical products. The Quality by Design initiative, combined with Process Analytical Technologies, aims to facilitate the consistent production of high quality products in the most efficient and economical way. The ability to estimate and control product quality from these processes is essential in achieving this aim. Large historical datasets are commonplace in the pharmaceutical industry and multivariate methods based on PCA and PLS have been successfully used in a wide range of applications to extract useful information from such datasets. This thesis has focused on the development and application of novel multivariate methods to the estimation and control of product quality from a number of processes. The document is divided into four main categories. Firstly, the related literature and inherent mathematical techniques are summarised. Following this, the three main technical areas of work are presented. The first of these relates to the development of a novel method for estimating the quality of products from a proprietary process using PCA. The ability to estimate product quality is useful for identifying production steps that are potentially problematic and also increases process efficiency by ensuring that any defective products are detected before they undergo any further processing. The proposed method is simple and robust and has been applied to two separate case studies, the results of which demonstrate the efficacy of the technique. The second area of work concentrates on the development of a novel method of identifying the operational phases of batch fermentation processes and is based on PCA and associated statistics. Knowledge of the operational phases of a process can be beneficial from a monitoring and control perspective and allows a process to be divided into phases that can be approximated by a linear model. The devised methodology is applied to two separate fermentation processes and results show the capability of the proposed method. The third area of work focuses on undertaking a performance evaluation of two multivariate algorithms, PLS and EPLS, in controlling the end-point product yield of fermentation processes. Control of end-point product quality is of crucial importance in many manufacturing industries, such as the pharmaceutical industry. Developing a controller based on historical and identification process data is attractive due to the simplicity of modelling and the increasing availability of process data. The methodology is applied to two case studies and performance evaluated. From both a prediction and control perspective, it is seen that EPLS outperforms PLS, which is important if modelling data is limited.
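As background for the PCA-based methods the abstract describes, the core operation is extracting the directions of greatest variance from process data. The sketch below is a generic PCA illustration, not the thesis's actual models: a tiny hypothetical two-variable dataset, with the eigendecomposition of the 2×2 covariance matrix written out in closed form so no linear-algebra library is needed.

```python
import math

# Tiny hypothetical two-variable "process" dataset (e.g. two sensor readings).
data = [(-2.0, -4.0), (-1.0, -2.0), (0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n

# Population covariance matrix [[a, b], [b, c]] of the centred data.
a = sum((x - mx) ** 2 for x, _ in data) / n
c = sum((y - my) ** 2 for _, y in data) / n
b = sum((x - mx) * (y - my) for x, y in data) / n

# Closed-form eigenvalues of a symmetric 2x2 matrix.
d = math.sqrt((a - c) ** 2 + 4 * b * b)
lam1 = (a + c + d) / 2  # variance along the first principal component
lam2 = (a + c - d) / 2

# Eigenvector for lam1 (unnormalised), then scaled to unit length.
vx, vy = (b, lam1 - a) if b != 0 else (1.0, 0.0)
norm = math.hypot(vx, vy)
pc1 = (vx / norm, vy / norm)

explained = lam1 / (lam1 + lam2)  # share of variance captured by PC1
print(pc1, explained)
```

For this perfectly collinear toy data the first component captures all the variance; in practice, the number of components retained, and the residual statistics around them, are what drive the quality-estimation and phase-identification decisions described above.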
Advisors/Committee Members: CARRASCO GOMEZ, JOAQUIN J, Lennox, Barry, Carrasco Gomez, Joaquin.
Subjects/Keywords: Process Control; Multivariate Statistics; Product Quality; End-Point Control; Partial-Least Squares; Principal Component Analysis
APA (6th Edition):
Loftus, J. P. M. (2016). On The Development of Control Systems Technology for Fermentation Processes. (Doctoral Dissertation). University of Manchester. Retrieved from http://www.manchester.ac.uk/escholar/uk-ac-man-scw:306385

Virginia Tech
19.
Monti, Henry Matthew.
An Integrated End-User Data Service for HPC Centers.
Degree: PhD, Computer Science and Applications, 2013, Virginia Tech
URL: http://hdl.handle.net/10919/19259
▼ The advent of extreme-scale computing systems, e.g., Petaflop supercomputers, High Performance Computing (HPC) cyber-infrastructure, Enterprise databases, and experimental facilities such as large-scale particle colliders, is pushing the envelope on dataset sizes. Supercomputing centers routinely generate and consume ever increasing amounts of data while executing high-throughput computing jobs. These are often result-datasets or checkpoint snapshots from long-running simulations, but can also be input data from experimental facilities such as the Large Hadron Collider (LHC) or the Spallation Neutron Source (SNS). These growing datasets are often processed by a geographically dispersed user base across multiple different HPC installations. Moreover, end-user workflows are also increasingly distributed in nature with massive input, output, and even intermediate data often being transported to and from several HPC resources or end-users for further processing or visualization. The growing data demands of applications coupled with the distributed nature of HPC workflows have the potential to place significant strain on both the storage and network resources at HPC centers. Despite this potential impact, rather than stringently managing HPC center resources, a common practice is to leave application-associated data management to the end-user, as the user is intimately aware of the application's workflow and data needs. This means end-users must frequently interact with the local storage in HPC centers, the scratch space, which is used for job input, output, and intermediate data. Scratch is built using a parallel file system that supports very high aggregate I/O throughput, e.g., Lustre, PVFS, and GPFS. To ensure efficient I/O and faster job turnaround, use of scratch by applications is encouraged. Consequently, job input and output data are required to be moved in and out of the scratch space by end-users before and after the job runs, respectively. In practice, end-users arbitrarily stage and offload data as and when they deem fit, without any consideration of the center's performance, often leaving data on the scratch long after it is needed. HPC centers resort to "purge" mechanisms that sweep the scratch space to remove files found to be no longer in use, based on not having been accessed within a preselected time threshold called the purge window, which commonly ranges from a few days to a week. This ad-hoc data management ignores the interactions between different users' data storage and transmission demands, and their impact on center serviceability, leading to suboptimal use of precious center resources. To address the issues of exponentially increasing data sizes and ad-hoc data management, we present a fresh perspective to scratch storage management by fundamentally rethinking the manner in which scratch space is employed. Our approach is twofold. First, we re-design the scratch system as a "cache" and build "retention", "population", and "eviction" policies that are tightly…
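The purge mechanism described in the abstract (sweep scratch and remove files whose last access falls outside the purge window) can be sketched generically. The file paths and seven-day window below are illustrative, not taken from any particular center's policy, and a real purge would stat the parallel file system rather than consult an in-memory dict:

```python
from datetime import datetime, timedelta

PURGE_WINDOW = timedelta(days=7)  # illustrative purge window

def purge(files: dict, now: datetime) -> list:
    """Return the paths whose last access time falls outside the purge window."""
    return [path for path, atime in files.items() if now - atime > PURGE_WINDOW]

now = datetime(2021, 3, 7)
scratch = {
    "/scratch/jobA/out.dat": datetime(2021, 3, 6),   # accessed yesterday: kept
    "/scratch/jobB/ckpt.h5": datetime(2021, 2, 20),  # stale for 15 days: purged
}
print(purge(scratch, now))  # → ['/scratch/jobB/ckpt.h5']
```

The thesis's cache re-design effectively replaces this blunt time-threshold eviction with retention, population, and eviction policies informed by the job workflow itself.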
Advisors/Committee Members: Butt, Ali Raza Ashraf (committeechair), Ribbens, Calvin J. (committee member), Vazhkudai, Sudharshan Sankaran (committee member), Feng, Wu-Chun (committee member), Lin, Heshan (committee member).
Subjects/Keywords: End-User Data Services; Scratch as a Cache; Data Offloading; Data Staging
APA (6th Edition):
Monti, H. M. (2013). An Integrated End-User Data Service for HPC Centers. (Doctoral Dissertation). Virginia Tech. Retrieved from http://hdl.handle.net/10919/19259
20.
Oliveira, Bruno Miguel Almeida de.
Comparação dos métodos ENF e 4-ENF para determinação da tenacidade ao corte de juntas adesivas.
Degree: 2016, Instituto Politécnico do Porto
URL: http://www.rcaap.pt/detail.jsp?id=oai:recipp.ipp.pt:10400.22/8252
▼ Over recent years, adhesive bonding has seen a progressive increase in structural applications at the expense of conventional mechanical joints. This paradigm shift is due to the advantages that adhesive joints have over other joining methods. Fracture mechanics and Cohesive Zone Models (CZM) are common criteria for predicting the strength of adhesive joints, and they use energy release rates as fundamental parameters. Because the 4-Point End Notched Flexure (4-ENF) test, as applied to adhesive joints, is still little studied, an investigation of its viability for determining the critical shear strain energy release rate (GIIc) is highly relevant.
The main objective of this dissertation is to compare the End-Notched Flexure (ENF) and 4-ENF methods for determining GIIc in adhesive joints. Three adhesives were used: Araldite® AV138, Araldite® 2015 and SikaForce® 7752. The experimental work involved the design and manufacture of a tool for performing the 4-ENF test, followed by the fabrication and preparation of the specimens for testing. Because the 4-ENF test is still little used with adhesive joints and is not standardised, an important part of the work consisted of surveying and analysing research works and scientific papers. The results were analysed by direct comparison of the GIIc values with those obtained in the ENF test, carried out for each adhesive series by comparing the P-δ curves and R-curves.
The results showed that the 4-ENF test on adhesive joints is not the most versatile for determining GIIc, and that only one method of obtaining GIIc is viable, namely the method based on measuring the crack length (a). It was shown that the ENF test, because it is standardised, has a simpler setup and offers a wider range of methods for determining GIIc, is the most recommended. It is therefore concluded that the 4-ENF test, although an alternative to the ENF test, has more limited applicability.
Over the last few years, adhesively-bonded joints have been increasingly used in structural applications instead of conventional mechanical joints. This paradigm change is due to advantages of the adhesively-bonded joints when compared with the other joining methods. Fracture mechanics and the Cohesive Zone Models (CZM) are the usual techniques to predict joints strength and they use the energy release rate as fundamental parameters. Since the 4-End Notched Flexure (4-ENF) test, applied to adhesively-bonded joints, is understudied, the studies about the viability of this method to estimate the critical shear strain energy release rate (GIIc) have great relevance.
The main objective of this thesis is the comparison between the End-Notched Flexure (ENF) and 4- ENF test methods in the determination of GIIc of adhesively-bonded joints. Three types of adhesives…
Advisors/Committee Members: Campilho, Raul Duarte Salgueiral Gomes.
Subjects/Keywords: End-Notched Flexure (ENF); Four-Point End-Notched Flexure (4-ENF); Ligações adesivas; Adesivos estruturais; End-Notched Flexure (ENF); Four-Point End-Notched Flexure (4-ENF); Adhesively-bonded joints; Structural adhesives; Materiais e Tecnologias de Fabrico
APA (6th Edition):
Oliveira, B. M. A. d. (2016). Comparação dos métodos ENF e 4-ENF para determinação da tenacidade ao corte de juntas adesivas. (Thesis). Instituto Politécnico do Porto. Retrieved from http://www.rcaap.pt/detail.jsp?id=oai:recipp.ipp.pt:10400.22/8252
21.
Charyyev, Batyr.
Protecting File Transfers Against Silent Data Corruption with Robust End-to-End Integrity Verification.
Degree: 2019, University of Nevada – Reno
URL: http://hdl.handle.net/11714/6023
▼ Scientific applications generate large volumes of data that often need to be moved between geographically distributed sites, which has led to a significant increase in data transfer rates. As an increasing number of scientific applications are becoming sensitive to silent data corruption, end-to-end integrity verification has been proposed. End-to-end integrity verification minimizes the likelihood of silent data corruption by comparing checksums of files at the source and the destination using secure hash algorithms such as MD5 and SHA1. However, existing implementations of end-to-end data integrity verification for file transfers compute the checksum of a file from the memory copy (i.e. cache) of the file, and thus fall short of detecting silent disk errors that take place while writing cached data to disk. In this thesis, we inspect the robustness of existing end-to-end integrity verification approaches against silent data corruption and propose a Robust Integrity Verification Algorithm (RIVA) to enhance data integrity. Extensive experiments show that unlike existing solutions, RIVA is able to detect silent disk corruptions by invalidating file contents in the page cache and reading them directly from disk. Since RIVA clears the page cache and reads file contents directly from the disk, it incurs a delay in execution time. However, by running transfer, cache invalidation, and checksum operations concurrently, RIVA is able to keep its overhead below 15% in most cases compared to the state-of-the-art solutions, in exchange for more secure file transfers. We also introduce a novel fault injection mechanism to assess the robustness of RIVA against undetected disk errors by altering file content on the disk. Finally, we present dynamic parallelism to adjust the number of transfer and checksum threads to overcome performance bottlenecks. The results show that dynamic parallelism leads to more than a 5x increase in RIVA's speed.
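The baseline scheme the thesis builds on (hash the file at the source and at the destination, compare digests) can be sketched as follows. This is the conventional end-to-end check, not RIVA itself, whose refinement is to invalidate the page cache first so the destination hash reflects what actually reached disk:

```python
import hashlib

def file_checksum(path: str, algo: str = "sha1") -> str:
    """Hash a file in fixed-size chunks so large transfers fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(src: str, dst: str) -> bool:
    """End-to-end integrity check: source and destination digests must match."""
    return file_checksum(src) == file_checksum(dst)
```

A corruption anywhere along the path shows up as a digest mismatch; the failure mode the thesis targets is the case where `dst` is read back from the page cache, so a disk-level error after the write goes unnoticed by this baseline check.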
Advisors/Committee Members: Gunes, Mehmet H. (advisor), Arslan, Engin (advisor), Gunes, Mehmet Hadi (committee member), Arslan, Engin (committee member), Ben-Idris, Mohammed (committee member).
Subjects/Keywords: Data transfer; End to end Integrity verification; High Performance Computing; Silent data corruptions; Undetected disk errors
APA (6th Edition):
Charyyev, B. (2019). Protecting File Transfers Against Silent Data Corruption with Robust End-to-End Integrity Verification. (Thesis). University of Nevada – Reno. Retrieved from http://hdl.handle.net/11714/6023

Brno University of Technology
22.
Cabalka, Ondřej.
Zálohování dat a datová úložiště: Data Backup and Storage.
Degree: 2019, Brno University of Technology
URL: http://hdl.handle.net/11012/5235
▼ This work highlights the frequently overlooked importance of corporate data and its protection. It introduces the backup process as an essential element of the safety and efficiency of an information system. The work presents both basic and advanced backup methods suitable for small and medium enterprises, depending on their needs and budget.
Advisors/Committee Members: Kříž, Jiří (advisor), Veškrna, Josef (referee).
Subjects/Keywords: data; ztráta dat; zálohování; obnova; archivace; pevný disk; data; data loss; backup; restore; archiving; hard disk
APA (6th Edition):
Cabalka, O. (2019). Zálohování dat a datová úložiště: Data Backup and Storage. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/5235

Edith Cowan University
23.
James, Peter.
Secure portable execution and storage environments: A capability to improve security for remote working.
Degree: 2015, Edith Cowan University
URL: https://ro.ecu.edu.au/theses/1707
▼ Remote working is a practice that provides economic benefits to both the employing organisation and the individual. However, evidence suggests that organisations implementing remote working have limited appreciation of the security risks, particularly those impacting upon the confidentiality and integrity of information and also on the integrity and availability of the remote worker’s computing environment. Other research suggests that an organisation that does appreciate these risks may veto remote working, resulting in a loss of economic benefits. With the implementation of high speed broadband, remote working is forecast to grow and therefore it is appropriate that improved approaches to managing security risks are researched. This research explores the use of secure portable execution and storage environments (secure PESEs) to improve information security for the remote work categories of telework, and mobile and deployed working.
This thesis with publication makes an original contribution to improving remote work information security through the development of a body of knowledge (consisting of design models and design instantiations) and the assertion of a nascent design theory. The research was conducted using design science research (DSR), a paradigm where the research philosophies are grounded in design and construction.
Following an assessment of both the remote work information security issues and threats, and preparation of a set of functional requirements, a secure PESE concept was defined. The concept is represented by a set of attributes that encompass the security properties of preserving the confidentiality, integrity and availability of the computing environment and data. A computing environment that conforms to the concept is considered to be a secure PESE, the implementation of which consists of a highly portable device utilising secure storage and an up-loadable (on to a PC) secure execution environment. The secure storage and execution environment combine to address the information security risks in the remote work location.
A research gap was identified as no existing ‘secure PESE like’ device fully conformed to the concept, enabling a research problem and objectives to be defined. Novel secure storage and execution environments were developed and used to construct a secure PESE suitable for commercial remote work and a high assurance secure PESE suitable for security critical remote work. The commercial secure PESE was trialled with an existing telework team looking to improve security and the high assurance secure PESE was trialled within an organisation that had previously vetoed remote working due to the sensitivity of the data it processed.
An evaluation of the research findings found that the objectives had been satisfied. Using DSR evaluation frameworks it was determined that the body of knowledge had improved an area of study with sufficient evidence generated to assert a nascent design theory for secure PESEs.
The thesis highlights the limitations of the research while opportunities…
Subjects/Keywords: Information Security; Cyber Security; Secure Data at Rest; Secure Portable Storage; Secure Portable Execution Environment; Secure Remote Working; Secure Teleworking; Secure Mobile Working; Secure Deployed Working; Hardened Browser; Hardened Operating System; Anti Digital Forensics; Design Science Research; End Point Security; Human Resources Management; Information Security; Technology and Innovation
APA (6th Edition):
James, P. (2015). Secure portable execution and storage environments: A capability to improve security for remote working. (Thesis). Edith Cowan University. Retrieved from https://ro.ecu.edu.au/theses/1707

UCLA
24.
Uppala, Medha.
Separable Temporal Modeling of Point Processes on Linear Networks & Balancing Data Sufficiency and Privacy.
Degree: Statistics, 2018, UCLA
URL: http://www.escholarship.org/uc/item/6g33n64x
▼ The first part of the dissertation focuses on spatial and temporal modeling of point processes on linear networks. Point processes on/near linear networks can simply be defined as point events occurring on or near line-segment network structures embedded in a certain space. A separable modeling framework is presented that fits a formation and a dissolution model of point processes on linear networks over time. Two major applications of the separable temporal model are spider web-building activity in brick mortar lines and wildfire ignition origins near road networks. The second part of the dissertation focuses on analyses of large energy databases, specifically the Energy Atlas database. The main motivation of this part is to explore and understand the issues of balancing necessary data resolution while maintaining consumer privacy. The issue of data resolution and its importance are explored by first tackling a specific policy objective. This is achieved by applying a longitudinal quantile regression model to parcel-level monthly energy consumption in the Westwood neighborhood; the model results aid in fulfilling efficiency goals outlined in California Senate Bill 350. Then the issue of record privacy is explored through a review of current privacy methods, implementation, and data ownership, concluding with avenues of future research.
Subjects/Keywords: Statistics; Data Privacy; Data resolution; Linear Networks; Point Processes; Wildfires
APA (6th Edition):
Uppala, M. (2018). Separable Temporal Modeling of Point Processes on Linear Networks & Balancing Data Sufficiency and Privacy. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/6g33n64x
25.
Deibe Seoane, David.
Geospatial Processing and Visualization of Point Clouds: from GPUs to Big Data Technologies.
Degree: 2019, Universidad da Coruña
URL: http://hdl.handle.net/2183/24513
▼ [Abstract] LiDAR (Light Detection And Ranging) technology is currently one of the most valuable sources of geographic information, since laser-scanning devices make it possible to obtain high-resolution three-dimensional models of large land areas. LiDAR data, normally stored as point clouds, are used as a fundamental working element in a large number of scientific and professional fields. Because of the enormous amount of information this technology can generate, LiDAR datasets have always been considered a great challenge when developing software applications capable of handling such volumes of information quickly and efficiently. All the research carried out during this thesis focused on the development of new techniques, algorithms and systems that improve the performance, efficiency and quality of the many and diverse critical elements of LiDAR environments. Thus, client-server web systems were developed to visualize and process large full-resolution point clouds in real time, allowing access from any type of device, from tablets to desktop computers, and adapting their functionality and features to the requirements and needs of specific fields of scientific knowledge. The high storage demands typically associated with LiDAR data, as well as the intense network traffic that web applications can generate, led to the development of lossless data compression methods together with new data structures based on non-redundancy of information. These new elements were used to provide highly efficient support for out-of-core multi-resolution techniques for real-time visualization of massive point clouds, significantly reducing storage requirements, main and video memory consumption, and network traffic congestion. Finally, the goal of the final phase of the thesis was to overcome the limitations of running software on single-machine computers for the storage of, and computation over, massive LiDAR datasets. Starting from a preliminary study analyzing the suitability of different big data solutions for storing, distributing and serving data to multiple LiDAR clients, a highly scalable system for distributed computation over such data volumes was developed. As a starting point, several proposals were implemented using the creation of digital terrain models (DTMs) as a case study, serving as the technological basis for a future service capable of offering a library of multiple geospatial processes.
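One common way to realise the out-of-core multi-resolution organisation described in this abstract is to assign each point a random level of detail, so that any prefix of levels yields a roughly uniform subsample that can be streamed on demand. A minimal sketch of that general idea (hypothetical point data, stdlib only; the thesis's actual non-redundant structures are more elaborate):

```python
import random

def assign_levels(points, num_levels=4, fraction=0.25, seed=42):
    """Assign each point to a level of detail.

    Level 0 holds roughly `fraction` of the points, level 1 roughly
    `fraction` of the remainder, and so on; rendering levels 0..k gives
    a progressively denser, roughly uniform subsample of the cloud.
    """
    rng = random.Random(seed)
    levels = {lvl: [] for lvl in range(num_levels)}
    for p in points:
        lvl = 0
        # Geometric assignment: promote with probability (1 - fraction).
        while lvl < num_levels - 1 and rng.random() > fraction:
            lvl += 1
        levels[lvl].append(p)
    return levels

# Hypothetical 2D points standing in for a LiDAR tile.
cloud = [(x * 0.1, (x * 37 % 100) * 0.1) for x in range(10_000)]
lod = assign_levels(cloud)

# A client at low zoom fetches only level 0; zooming in streams levels 1, 2, ...
print(len(lod[0]), sum(len(v) for v in lod.values()))
```

The design choice here is that levels are disjoint: a viewer never re-downloads a point it already has, which is exactly the non-redundancy property the abstract emphasises for reducing network traffic.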
Advisors/Committee Members: Amor, Margarita (advisor), Doallo, Ramón (advisor).
Subjects/Keywords: Data point sets; Lidar technology; Geospatial measurement; Big Data
APA (6th Edition):
Deibe Seoane, D. (2019). Geospatial Processing and Visualization of Point Clouds: from GPUs to Big Data Technologies. (Doctoral Dissertation). Universidad da Coruña. Retrieved from http://hdl.handle.net/2183/24513

Delft University of Technology
26.
Psomadaki, S. (author).
Using a Space Filling Curve for the Management of Dynamic Point Cloud Data in a Relational DBMS.
Degree: 2016, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:c1e625b0-0a74-48b5-b748-6968e7f83e2b
▼ The rapid developments in the field of point cloud acquisition technologies have allowed point clouds to become an important source of information for many applications. One of the newest applications of point clouds concerns the monitoring of the coast. Many countries, among them the Netherlands, use this source of data to determine changes in coastal elevations. This means that point clouds are collected every hour, day, month, or year; these are, ultimately, dynamic point clouds. To be able to efficiently use this plethora of data, the management of those point clouds, dynamic or not, is crucial. Point clouds, like the majority of geodata, have traditionally been managed using file-based solutions. In recent years, however, database solutions have emerged. Typical examples are the point cloud extensions for PostgreSQL and the Oracle Database. Both options use a similar block-based organisation. In addition to the block-based organisations, point clouds can also be managed using a flat table where each point is stored in a separate row. While the first approach is very scalable and efficient, the second is easier to implement and to update. To make the flat model scalable, a Space Filling Curve (SFC) can be used to cluster the data. Nonetheless, both approaches in their current forms are not suited for the management of dynamic points. The reason for this is that they do not consider the time dimension as part of the organisation, and further insertions for the block-based approaches are not straightforward. Within this thesis an SFC approach for managing dynamic point clouds is investigated. For this, the flat model approach using an Index Organised Table (IOT) within a Relational Database Management System (RDBMS) is used. Two variants coming from the two extremes of the space-time continuum are then taken into account.
In the first approach, space and time are both used within the SFC (integrated approach), while in the second, time dominates over space (non-integrated approach). Alongside these two approaches, two treatments of the z dimension are also studied: as an attribute, or as part of the SFC. In addition, building on the coastal monitoring applications, the most important queries are identified: space-time, only time, and only space. The efficiency of the implemented methodology is tested through the execution of a benchmark. Using two use cases coming from coastal applications, the benchmark is executed once for daily and once for yearly data. The results show that the SFC approach is an appropriate method for managing dynamic point clouds, and that the integrated approach is the most suitable way to proceed. Scalability, time efficiency, and dynamic insertions can all be achieved for various use cases.
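The "integrated" approach above places both space and time inside one Space Filling Curve key, so that points close in (x, y, t) receive nearby keys and cluster together in the Index Organised Table. A minimal Morton (Z-order) sketch of that bit-interleaving idea (scaled integer coordinates are assumed; the thesis's actual key layout may differ):

```python
def interleave3(x, y, t, bits=21):
    """Build a 3D Morton key by interleaving the bits of x, y and t.

    Points nearby in (x, y, t) share long key prefixes, so an index
    clustered on this key keeps them physically close on disk.
    """
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)       # x occupies bit positions 0, 3, 6, ...
        key |= ((y >> i) & 1) << (3 * i + 1)   # y occupies 1, 4, 7, ...
        key |= ((t >> i) & 1) << (3 * i + 2)   # t occupies 2, 5, 8, ...
    return key

# Hypothetical points: integer grid coordinates plus a day number.
points = [(5, 9, 3), (5, 8, 3), (500, 900, 300)]
keys = sorted(interleave3(x, y, t) for (x, y, t) in points)

# The two nearby points sort adjacently; the distant one lands far away in key space.
print(keys)
```

A space-time range query then becomes a set of key ranges on the curve, which is what makes the flat-table model scalable without a block-based organisation.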
Architecture and The Built Environment
OTB
Geomatics for the Built Environment
Advisors/Committee Members: Tijssen, T.P.M. (mentor), van Oosterom, P.J.M. (mentor).
Subjects/Keywords: Point cloud data; Space filling curve; Spatio-temporal data; Benchmark; DBMS
APA (6th Edition):
Psomadaki, S. (2016). Using a Space Filling Curve for the Management of Dynamic Point Cloud Data in a Relational DBMS. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:c1e625b0-0a74-48b5-b748-6968e7f83e2b

Delft University of Technology
27.
Panagiotou, V. (author).
Blind segmentation of time-series: A two-level approach.
Degree: 2015, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:832c8b73-fbc2-412e-9b8d-9063a48e6d57
▼ Change-point detection is an indispensable tool for a wide variety of applications and has been extensively studied in the literature over the years. However, the development of wireless devices and miniature sensors that allow continuous recording of data poses new challenges that cannot be adequately addressed by the vast majority of existing methods. In this work, we aim to balance statistical accuracy with computational efficiency by developing a hierarchical two-level algorithm that can significantly reduce the computational burden at the expense of a negligible loss of detection accuracy. Our choice is motivated by the idea that if a simple test is used to quickly select some potential change-points at the first level, then the second level, which consists of a computationally more expensive algorithm, needs to be applied only to a subset of the data, leading to a significant run-time improvement. In addition, to alleviate the difficulties arising in high-dimensional data, we use a data selection technique that gives more importance to data that are more useful for detecting changes than to others. Using these ideas, we compute a detection measure given as the weighted sum of individual dissimilarity measures, and we present techniques that can speed up some standard change-point detection methods. Experimental results on both artificial and real-world data demonstrate the effectiveness of the developed approaches and provide useful insight into the suitability of some state-of-the-art methods for detecting changes in many different scenarios.
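The two-level idea in this abstract can be sketched with a cheap first pass (a sliding-window mean-shift score) that nominates candidate indices, and a more expensive second pass (here, a variance-normalised two-sample statistic) evaluated only at those candidates. Everything below is a hypothetical simplification of that scheme, not the thesis's actual algorithm:

```python
import statistics

def first_level(series, window=10, threshold=2.0):
    """Cheap pass: flag indices where adjacent window means differ a lot."""
    candidates = []
    for i in range(window, len(series) - window):
        left = series[i - window:i]
        right = series[i:i + window]
        if abs(statistics.fmean(right) - statistics.fmean(left)) > threshold:
            candidates.append(i)
    return candidates

def second_level(series, candidates, window=10):
    """Expensive pass: keep the best candidate per cluster via a t-like statistic."""
    def t_stat(i):
        left = series[i - window:i]
        right = series[i:i + window]
        pooled = statistics.stdev(left + right) or 1e-9
        return abs(statistics.fmean(right) - statistics.fmean(left)) / pooled
    accepted = []
    for i in candidates:
        if accepted and i - accepted[-1] < window:
            # Same cluster of nearby candidates: keep the higher-scoring index.
            if t_stat(i) > t_stat(accepted[-1]):
                accepted[-1] = i
        else:
            accepted.append(i)
    return accepted

# Hypothetical signal: mean jumps from 0 to 5 at index 50.
signal = [0.0] * 50 + [5.0] * 50
change_points = second_level(signal, first_level(signal))
print(change_points)
```

The run-time gain comes from the second pass touching only the handful of indices the first pass nominated, rather than every sample, which is precisely the trade-off the abstract describes.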
Signals & Systems
Electrical Engineering
Electrical Engineering, Mathematics and Computer Science
Advisors/Committee Members: Heusdens, R. (mentor), Härmä, A. (mentor).
Subjects/Keywords: change-point detection; segmentation; time-series data; data selection techniques; speedup
APA (6th Edition):
Panagiotou, V. (2015). Blind segmentation of time-series: A two-level approach. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:832c8b73-fbc2-412e-9b8d-9063a48e6d57

Delft University of Technology
28.
Deng, Mutian (author).
Using Foreign Data Wrapper in PostgreSQL to Expose Point Clouds on File System.
Degree: 2020, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:0182ce77-96e3-493b-a933-e91910b6302b
▼ This research aims to answer the main research question: to what extent can LiDAR point clouds be used directly in PostgreSQL by means of a Foreign Data Wrapper (FDW)? To this end, an FDW supporting point cloud data management is implemented, and the range and performance of its functionality are evaluated. The results show that this FDW solution is feasible, with querying time dependent on the number of returned points. The benefits and problems of this FDW are analyzed.
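A Foreign Data Wrapper lets the database treat an external file as a table: the wrapper iterates over rows on demand and, ideally, applies pushed-down predicates during the file scan so only matching points are materialised. The thesis targets PostgreSQL's real FDW API; the sketch below merely imitates that contract in plain Python to show the shape of the interaction (all names hypothetical, stdlib only):

```python
# A toy imitation of the Foreign Data Wrapper contract: the "database"
# asks the wrapper to scan an external source, handing it predicates
# so the wrapper can skip non-matching points early.

class PointCloudWrapper:
    """Expose an in-memory stand-in for a LAS file as iterable rows."""

    def __init__(self, records):
        # Each record is (x, y, z); a real wrapper would parse a file here.
        self.records = records

    def scan(self, quals=None):
        """Yield rows as dicts, applying pushed-down predicates.

        `quals` is a list of (column, op, value) triples, mirroring the
        restriction clauses a query planner would hand to an FDW.
        """
        ops = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}
        for x, y, z in self.records:
            row = {"x": x, "y": y, "z": z}
            if all(ops[op](row[col], val) for col, op, val in (quals or [])):
                yield row

# Hypothetical points; query: SELECT * WHERE x >= 2 AND z <= 10.
fdw = PointCloudWrapper([(1, 5, 9), (2, 6, 8), (3, 7, 12), (4, 8, 3)])
result = list(fdw.scan(quals=[("x", ">=", 2), ("z", "<=", 10)]))
print(result)
```

The observation in the abstract that querying time scales with the number of returned points follows naturally from this design: every surviving row must cross the wrapper boundary into the database.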
Geomatics
Advisors/Committee Members: Meijers, B.M. (mentor), van Oosterom, P.J.M. (mentor), Delft University of Technology (degree granting institution).
Subjects/Keywords: Foreign Data Wrapper; Point clouds data management; PostgreSQL
APA (6th Edition):
Deng, M. (2020). Using Foreign Data Wrapper in PostgreSQL to Expose Point Clouds on File System. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:0182ce77-96e3-493b-a933-e91910b6302b

University of Miami
29.
Wickramarathne, Thanuka L.
An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach.
Degree: PhD, Electrical and Computer Engineering (Engineering), 2012, University of Miami
URL: https://scholarlyrepository.miami.edu/oa_dissertations/851
▼ The recent experiences of asymmetric urban military operations have highlighted the pressing need for incorporation of soft data, such as informant statements, into the fusion process. Soft data are fundamentally different from hard data (generated by physics-based sensors), in the sense that the information they provide tends to be qualitative and subject to interpretation. These characteristics pose a major obstacle to using existing multi-sensor data fusion frameworks, which are quite well established for hard data. Given the critical and sensitive nature of intended applications, soft/hard data fusion requires a framework that allows for convenient representation of various data uncertainties common in soft/hard data, and provides fusion techniques that are robust, mathematically justifiable, and yet effective. This would allow an analyst to make decisions with a better understanding of the associated uncertainties as well as the fusion mechanism itself. We present here a detailed account of an analytical solution to the task of soft/hard data fusion. The developed analytical framework consists of several main components: (i) a Dempster-Shafer (DS) belief theory based fusion strategy; (ii) a complete characterization of the Fagin-Halpern DS theoretic (DST) conditional notion, which forms the basis of the data fusion framework; (iii) an evidence updating strategy for the purpose of consensus generation; (iv) a credibility estimation technique for validation of evidence; and (v) techniques for reducing the computational burden associated with the proposed fusion framework. The proposed fusion strategy possesses several intuitively appealing features, and satisfies certain algebraic and fusion properties making it particularly useful in a soft/hard fusion environment. This strategy is based on DS belief theory, which allows for convenient representation of uncertainties that are typical of soft/hard domains. The Fagin-Halpern (FH) notion is perhaps the most appropriate DST conditional notion for soft/hard data fusion scenarios, and it forms the basis for our fusion framework. We provide a complete characterization of the FH conditional notion. This constitutes a strong result that sets the foundation for understanding the FH conditional notions and also establishes the theoretical grounds for development of algorithms for efficient computation of FH conditionals. We also address the converse problem of determining the evidence that may have generated a given change of belief. This converse result can be of significant practical value in certain applications. A consensus control strategy developed based on our fusion technique allows consensus analysis to be carried out in a multitude of applications that call for extended flexibility in uncertainty modeling. We provide a complete theoretical development of the proposed consensus strategy with rigorous proofs. We make use of these consensus notions to establish a data validation technique to assess credibility of evidence in the absence of ground truth. Credibility…
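Dempster-Shafer belief theory, on which the framework above builds, represents each source as a mass function over subsets of the frame of discernment; the classical way to fuse two sources is Dempster's rule of combination. The dissertation develops its own Fagin-Halpern conditional-based strategy, so the following is background only, a minimal stdlib sketch with invented numbers:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: fuse two mass functions over frozenset focal elements."""
    unnormalized = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            unnormalized[inter] = unnormalized.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # Renormalise by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in unnormalized.items()}

# Frame of discernment: is a report {T}rue or {F}alse?
T, F, TF = frozenset("T"), frozenset("F"), frozenset("TF")

# A hard sensor strongly supports T; a soft (informant) source is vaguer.
sensor = {T: 0.8, TF: 0.2}
informant = {T: 0.4, F: 0.3, TF: 0.3}

fused = combine(sensor, informant)
print({tuple(sorted(s)): round(w, 3) for s, w in fused.items()})
```

Note how the vague informant mass on the whole frame `TF` lets the sensor's evidence dominate gracefully; this ability to express ignorance explicitly is why DS theory suits qualitative soft data.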
Advisors/Committee Members: Kamal Premaratne, Manohar N. Murthi, Miroslav Kubat, James W. Modestino, Marco Pravia.
Subjects/Keywords: Data fusion; soft/hard fusion; Dempster-Shafer theory; consensus; conditional core theorem; credibility estimation
APA (6th Edition):
Wickramarathne, T. L. (2012). An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach. (Doctoral Dissertation). University of Miami. Retrieved from https://scholarlyrepository.miami.edu/oa_dissertations/851

UCLA
30.
Kim, Yan Shi.
Using Administrative Data to Characterize Patterns of End-of-Life Care in Diverse Settings.
Degree: Health Policy and Management 007I, 2015, UCLA
URL: http://www.escholarship.org/uc/item/1p42v62r
▼ End-of-life (EOL) care in the United States is costly, highly fragmented, and uncoordinated. Despite almost one third of all Medicare expenditures being devoted to caring for patients during their last year of life, studies have repeatedly shown that the overall quality of care and quality of life at the EOL is poor. In order to provide better quality care at the EOL, it is crucial that we have a solid understanding of how EOL care is provided under our current healthcare system. This dissertation uses two large administrative datasets to explore and explain variations and patterns in the care patients receive at the EOL, and it contains three papers. Paper one explores variations in the use of life-sustaining treatments before death. Paper two examines the impact the organization of our healthcare system has on the use and underuse of hospice services among patients in long-term care hospitals with chronic critical illness. Paper three explores the power of financial incentives, in the form of Medicare reimbursements, in influencing providers' decisions to discharge patients from long-term care hospitals. In this work, we showed that patterns of care near the EOL are highly variable across subgroups of patients, provider institutions, and geographic regions, and are heavily influenced by financial incentives as well as the supply of healthcare providers. More importantly, this dissertation illustrates the urgent need to develop new data sources, and expand on existing ones, with the relevant information required to gain a more in-depth understanding of what is driving the differences in care, in order to deliver truly patient-centered and family-oriented care near the EOL.
Subjects/Keywords: Health care management; Administrative Data; End-of-Life Care; Medicare
APA (6th Edition):
Kim, Y. S. (2015). Using Administrative Data to Characterize Patterns of End-of-Life Care in Diverse Settings. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/1p42v62r