You searched for subject:(hard end point data)
Showing records 1 – 30 of 44469 total matches.

University of Victoria
1.
Alam, Shahid.
A Framework for Metamorphic Malware Analysis and Real-Time Detection.
Degree: Department of Computer Science, 2014, University of Victoria
URL: http://hdl.handle.net/1828/5576
▼ Metamorphism is a technique that mutates binary code using different obfuscations. Writing new metamorphic malware is difficult, and in general malware writers reuse old malware. To evade detection, the writers of such new malware change its obfuscations (syntax) more than its behavior (semantics). Based on this assumption and motivation, this thesis presents a new framework named MARD for Metamorphic Malware Analysis and Real-Time Detection. We also introduce a new intermediate language named MAIL (Malware Analysis Intermediate Language). Each MAIL statement is assigned a pattern that can be used to annotate a control flow graph for pattern matching to analyse and detect metamorphic malware. MARD uses MAIL to achieve platform independence, automation and optimization of metamorphic malware analysis and detection. As part of the new framework, to build a behavioral signature and detect metamorphic malware in real time, we propose two novel techniques, named ACFG (Annotated Control Flow Graph) and SWOD-CFWeight (Sliding Window of Difference and Control Flow Weight). Unlike other techniques, ACFG provides faster matching of CFGs without compromising detection accuracy; it can handle malware with smaller CFGs, and it carries more information and hence provides more accuracy than a plain CFG. SWOD-CFWeight addresses key issues in current techniques related to changes in opcode frequencies, such as those caused by different compilers, compiler optimizations, operating systems and obfuscations. The size of the SWOD can change, which gives anti-malware tool developers the ability to select parameter values that further optimize malware detection. CFWeight captures the control flow semantics of a program to an extent that helps detect metamorphic malware in real time. Experimental evaluation of the two proposed techniques on an existing dataset achieved detection rates in the range 94% – 99.6% and false positive rates in the range 0.93% – 12.44%. Compared to ACFG, SWOD-CFWeight significantly improves detection time, and it is suitable where detection time is paramount, as in real-time (practical) anti-malware applications.
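The annotated-CFG idea above can be illustrated with a toy sketch. This is an illustrative assumption on my part, not the thesis's actual MAIL patterns or matching algorithm: annotate each basic block with a pattern label and compare two control flow graphs by their pattern-labeled edges, so that renamed registers or relocated code (different node ids) do not change the match.

```python
# Toy annotated-CFG (ACFG) comparison. The pattern labels ("ASSIGN",
# "CONTROL", "CALL", "RET") and the Jaccard similarity measure are
# hypothetical simplifications, not MARD's actual MAIL pattern set.

def acfg_edges(cfg):
    """Return the set of pattern-labeled edges of an annotated CFG.

    cfg maps node id -> (pattern_label, [successor node ids]).
    Each edge is represented by its (source pattern, target pattern)
    pair, so matching works on annotations, not raw addresses."""
    edges = set()
    for node, (pattern, succs) in cfg.items():
        for s in succs:
            edges.add((pattern, cfg[s][0]))
    return edges

def acfg_similarity(cfg_a, cfg_b):
    """Jaccard similarity of the two graphs' pattern-labeled edge sets."""
    ea, eb = acfg_edges(cfg_a), acfg_edges(cfg_b)
    if not ea and not eb:
        return 1.0
    return len(ea & eb) / len(ea | eb)

# A metamorphic variant keeps the same annotations under different node ids.
original = {0: ("ASSIGN", [1]), 1: ("CONTROL", [2, 3]), 2: ("CALL", [3]), 3: ("RET", [])}
mutated  = {7: ("ASSIGN", [8]), 8: ("CONTROL", [9, 5]), 9: ("CALL", [5]), 5: ("RET", [])}
benign   = {0: ("ASSIGN", [1]), 1: ("RET", [])}
```

On these toy graphs the mutated sample matches the original perfectly while the benign one shares no edges, which is the intuition behind matching annotations rather than syntax.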
Advisors/Committee Members: Horspool, R. Nigel (supervisor), Traore, Issa (supervisor).
Subjects/Keywords: End point security; Malware detection; Metamorphic malware; Control flow analysis; Heuristics; Data mining; Window of difference
APA (6th Edition):
Alam, S. (2014). A Framework for Metamorphic Malware Analysis and Real-Time Detection. (Thesis). University of Victoria. Retrieved from http://hdl.handle.net/1828/5576
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Alam, Shahid. “A Framework for Metamorphic Malware Analysis and Real-Time Detection.” 2014. Thesis, University of Victoria. Accessed December 16, 2019.
http://hdl.handle.net/1828/5576.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Alam, Shahid. “A Framework for Metamorphic Malware Analysis and Real-Time Detection.” 2014. Web. 16 Dec 2019.
Vancouver:
Alam S. A Framework for Metamorphic Malware Analysis and Real-Time Detection. [Internet] [Thesis]. University of Victoria; 2014. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/1828/5576.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Alam S. A Framework for Metamorphic Malware Analysis and Real-Time Detection. [Thesis]. University of Victoria; 2014. Available from: http://hdl.handle.net/1828/5576
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Otago
2.
Roberts, Dax.
Data Remanence In New Zealand.
Degree: 2013, University of Otago
URL: http://hdl.handle.net/10523/3768
▼ International research has shown that individuals and companies in other countries do not always fully remove the data from their computer data storage devices before disposing of them. Typically this means that when people dispose of their computer hard drives, there is a wealth of personal or corporate information that can be exploited to commit such crimes as identity theft, fraud, stalking and blackmail.
A further literature review showed that no such “data remanence” research for hard drives (or any other data storage devices such as mobile phones, USB thumb drives and the like) had been conducted in New Zealand.
The methodologies of all relevant hard drive data remanence experiments were compared and then used to design the most appropriate methodology for this research.
100 second-hand hard drives were then sourced nationally across New Zealand for the experiments of this research, to determine the baseline of data remanence for hard drives in New Zealand. The results of the experiments were then compared with international results to determine how New Zealand compares and what, if any, further actions (such as education) should be taken.
Advisors/Committee Members: Wolfe, H (advisor).
Subjects/Keywords: Data;
Remanence;
Hard Drives;
Data Remanence
APA (6th Edition):
Roberts, D. (2013). Data Remanence In New Zealand. (Doctoral Dissertation). University of Otago. Retrieved from http://hdl.handle.net/10523/3768
Chicago Manual of Style (16th Edition):
Roberts, Dax. “Data Remanence In New Zealand.” 2013. Doctoral Dissertation, University of Otago. Accessed December 16, 2019.
http://hdl.handle.net/10523/3768.
MLA Handbook (7th Edition):
Roberts, Dax. “Data Remanence In New Zealand.” 2013. Web. 16 Dec 2019.
Vancouver:
Roberts D. Data Remanence In New Zealand. [Internet] [Doctoral dissertation]. University of Otago; 2013. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/10523/3768.
Council of Science Editors:
Roberts D. Data Remanence In New Zealand. [Doctoral Dissertation]. University of Otago; 2013. Available from: http://hdl.handle.net/10523/3768

Texas A&M University
3.
Hafley, Brian Scott.
Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork.
Degree: 2007, Texas A&M University
URL: http://hdl.handle.net/1969.1/4802
▼ Four proteins exhibiting different rates of denaturation or precipitation with increasing cooking temperature from 63 to 73°C for beef and 67 to 79°C for pork were selected for developing a ratio model and incorporating the results into a mathematical expression. Monoclonal antibodies (Mabs) against lactate dehydrogenase isozyme 5 (LDH-5), bovine serum albumin (BSA), porcine enolase, and bovine myoglobin were developed for use in a sandwich enzyme-linked immunosorbent assay (ELISA) to simultaneously investigate changes in protein concentration with incremental increases in temperature.
Four groups of mice were immunized separately with commercially available or purified protein (LDH-5, BSA, enolase, or myoglobin). After ample blood serum titers were reported, spleen cells were harvested and fused with SP2 myeloma tumor cells using an electrofusion cell manipulator. Hybridoma-containing wells were screened against their respective protein to isolate hybridomas secreting protein-specific Mabs. Mabs produced in tissue culture flasks were used initially in sandwich ELISA testing. Mabs were tested against ground beef and pork cooked to instantaneous end-point temperatures (EPTs). A 6 g section removed from the geometric center of each sample was homogenized in phosphate buffer, centrifuged, and a 1 ml aliquot collected for analysis.
Microtiter plates were coated with goat anti-mouse IgG antibody (2 mg/ml) to act as a capture antibody for the protein-specific monoclonal antibody concentrated from cell culture supernatant. Serially diluted muscle (beef or pork) extract (10 ml) from each EPT was applied to a microtiter plate. A protein A/G-purified polyclonal antibody (Pab) was applied, followed by a goat anti-rabbit IgG peroxidase-conjugated antibody. Concentration was determined by comparison to a standard curve.
After multiple cell fusions, 24, 29, 66, and 12 cell lines secreting protein-specific Mabs against LDH-5, BSA, enolase, and myoglobin, respectively, were produced. Six Mabs against LDH-5 reported R² values > 0.9, indicating high specificity and affinity for LDH-5. Development of sandwich ELISA assays with Mabs against BSA, enolase, and myoglobin was not as successful. Mabs produced in mouse ascites against BSA, enolase, and myoglobin were also unsuccessful when used in a sandwich ELISA. However, preliminary data suggested a multiple antigen ratio model still remained a viable option.
Advisors/Committee Members: Keeton, Jimmy T. (advisor), Berghman, Luc R. (committee member), Miller, Rhonda K. (committee member), Rooney, Lloyd W. (committee member).
Subjects/Keywords: End-point Temperature
APA (6th Edition):
Hafley, B. S. (2007). Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork. (Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/4802
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Hafley, Brian Scott. “Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork.” 2007. Thesis, Texas A&M University. Accessed December 16, 2019.
http://hdl.handle.net/1969.1/4802.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Hafley, Brian Scott. “Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork.” 2007. Web. 16 Dec 2019.
Vancouver:
Hafley BS. Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork. [Internet] [Thesis]. Texas A&M University; 2007. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/1969.1/4802.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Hafley BS. Developement of monoclonal antibodies for a multiple antigen ELISA to verify safe cooking end-point temperature in beef and pork. [Thesis]. Texas A&M University; 2007. Available from: http://hdl.handle.net/1969.1/4802
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Hong Kong
4.
李桂君; Li, Guijun.
Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy.
Degree: PhD, 2013, University of Hong Kong
URL: http://dx.doi.org/10.5353/th_b5089977 ; http://hdl.handle.net/10722/192832
▼ With the highly demanding requirements of current emerging cloud storage and personal computers, hard disk drive recording with high stability and high volume has attracted much attention in industry and academia. Recording media and recording heads feasible for future high-density recording are both crucial to realizing magnetic recording at a density of 1 Tbit/in². Recording media with FePt for high density and high stability were investigated in this thesis, using FePt polymers with imprinting methods and FePt thin films with ion-beam bombardment technologies. The FePt polymers can be patterned using imprinting at micro- and nano-scales. The micro- and nano-patterns could be retained on substrates after sintering at high temperatures. High magnetic coercivity was demonstrated with line and dot patterns at different scales.
Recording heads with Al2O3-based magnetic tunneling junction sensors were also studied in this thesis. The magnetic tunneling junction sensors were shown to work stably at temperatures varying from -30 °C to 100 °C. Long-running tests of up to 100 hours also demonstrated the stability of the magnetic tunneling junction sensors in extreme temperatures. With state-of-the-art patterning and deposition technologies, new ideas about using FePt polymer as magnetic recording media and using ion beam bombardment to tune the FePt magnetic properties were verified. The feasibility of using Al2O3-based magnetic tunneling junction sensors as recording heads was also discussed.
Electrical and Electronic Engineering; Doctor of Philosophy
Advisors/Committee Members: Lai, PT, Pong, PWT.
Subjects/Keywords: Hard disks (Computer science); Data disk drives.
APA (6th Edition):
李桂君; Li, G. (2013). Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy. (Doctoral Dissertation). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b5089977 ; http://hdl.handle.net/10722/192832
Chicago Manual of Style (16th Edition):
李桂君; Li, Guijun. “Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy.” 2013. Doctoral Dissertation, University of Hong Kong. Accessed December 16, 2019. http://dx.doi.org/10.5353/th_b5089977 ; http://hdl.handle.net/10722/192832.
MLA Handbook (7th Edition):
李桂君; Li, Guijun. “Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy.” 2013. Web. 16 Dec 2019.
Vancouver:
李桂君; Li G. Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy. [Internet] [Doctoral dissertation]. University of Hong Kong; 2013. [cited 2019 Dec 16]. Available from: http://dx.doi.org/10.5353/th_b5089977 ; http://hdl.handle.net/10722/192832.
Council of Science Editors:
李桂君; Li G. Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy. [Doctoral Dissertation]. University of Hong Kong; 2013. Available from: http://dx.doi.org/10.5353/th_b5089977 ; http://hdl.handle.net/10722/192832

University of New Mexico
5.
Medina, Una E.
MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL.
Degree: Department of Communication and Journalism, 2010, University of New Mexico
URL: http://hdl.handle.net/1928/12395
▼ One out of three Americans is involved in a drunk-driving crash; 23% result in death. To deter DWIs (Driving While under the Influence), MADD (Mothers Against Drunk Drivers) created VIPs (Victim Impact Panels), where victims impact offenders with gory stories, photos, and threats of punishment and loss of freedom, hoping this message will deter DWIs. It is remarkable that, although the VIP message is considered a primary DWI intervention, no studies have investigated VIP message effects.
VIP message effects, their persistence and decay, are chronicled here over the course of 12 years. This study extends an empirical investigation of VIPs, conducted by Woodall, Delaney, Rogers, and Wheeler (2007) (n = 833) during 1994-1996. At 2 years, these researchers found MADD VIP participants' recidivism rates were 30% higher than their DWI School comparison group, trending toward significance at p = .0583. This study supports those results as significant at 12 years. As an extension, it investigates whether reactance theory explains VIP message effects failure. Reactance theory research, a subset of message effects research, explains how emotional, confrontational, and threatening messages induce psychological reactance in the mind of the message receiver, who then seeks to preserve his or her sense of freedom by behaving contrarily (Brehm, 1966). Hierarchically intensifying effects of these theoretical reactance antecedents are studied here in an unusual manner, as they occur in vivo, in real life.
The same intervention was observed to have different effects depending on prior conditions and demographics. The emotional high-threat, high-confrontation MADD VIP message coincided with significantly shorter time to recidivism (p = .009, d = 1.64) and significantly higher number of subsequent arrests (p < .0001, d = 1.64) among recent prior offenders, and those with no priors under age 30 (p = .01, d = 0.35). Younger offenders may be associated with more iconoclastic behavior than older offenders (Beirness & Simpson, 1997; Greenberg, 2005; NHTSA, 2008), partially explaining the under-30 age effect.
This study furthers persuasive message design as a science and suggests a message-based approach to intervention analysis. There was no effect when MADD VIP was analyzed simply as an intervention. However, there were highly significant effect sizes when the same MADD VIP intervention was analyzed as a message. This study concludes by offering MADD VIP best practice recommendations.
Advisors/Committee Members: Woodall, W. Gill, Schuetz, Janice, Rivera, Mario A., McDermott, Virginia, Delaney, Harold.
Subjects/Keywords: Victim Impact panels; MADD; message effects; randomized trial; effect size; drunk driving; DWI; efficacy trial; method problems; methodological problems; communication theory; theory building; rhetorical analysis; triangulation; drunk driving; interventions; covariates; ANOVA; ANCOVA; survival analysis; message context; message content; message function; message intensity; message frequency; message metrics; message pathos; pathos; message decay; decay rate; message decay rate; intent to persuade; persuasion; confrontation; shame; shaming; public shaming; public censure; forewarning; perceived threat; reactance theory; assumptions; sampling error; recruitment error; non-adherence to condition; random assignment error; factorial design; operationalization; theory construct operationalization; methods informed by literature; methodological symbiosis; questionnaire reliability and validity; secondary data sources; public arrest record; public data; covariate operationalization; reactance constructs; content analysis; theme analysis; prior arrest; censored cases; QSR N6; SPSS; Excel; limitations; under-identification; attrition; population attrition; bimodal distribution; dichotomous variables; data splitting; discretizing data; time to recidivism; subsequent arrests; emotional change; emotion score; outliers; reactance antecedent; message dose; message dosage; treatment fidelity; assess treatment fidelity; predictor variables; controlling variables; demographic covariate; demographic predictor; confirmation bias; data bias; interaction effect; treatment effect; message design; fear appeal; message strength; anger; survival analysis; time dependence; mixed methods; study design; message standardization; internal validity; hard data; hard end-point data; marginal sample size; observed variables; intervening factors; intervening variables; sample size; in vivo; hierarchy of effects; emotional threat; older offenders; young offenders; intervention analysis; message-based approach; best practices; DWI intervention; DWI treatment; prior conditions; iconoclast; Drunks Against MADD Mothers; resistance; message design science
APA (6th Edition):
Medina, U. E. (2010). MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL. (Doctoral Dissertation). University of New Mexico. Retrieved from http://hdl.handle.net/1928/12395
Chicago Manual of Style (16th Edition):
Medina, Una E. “MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL.” 2010. Doctoral Dissertation, University of New Mexico. Accessed December 16, 2019.
http://hdl.handle.net/1928/12395.
MLA Handbook (7th Edition):
Medina, Una E. “MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL.” 2010. Web. 16 Dec 2019.
Vancouver:
Medina UE. MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL. [Internet] [Doctoral dissertation]. University of New Mexico; 2010. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/1928/12395.
Council of Science Editors:
Medina UE. MADD MESSAGE EFFECTS: A TWELVE-YEAR RANDOMIZED TRIAL. [Doctoral Dissertation]. University of New Mexico; 2010. Available from: http://hdl.handle.net/1928/12395

Brno University of Technology
6.
Bečička, Martin.
Analýza dat získaných z genotypizačních esejí z real-time PCR.
Degree: 2018, Brno University of Technology
URL: http://hdl.handle.net/11012/81823
▼ This bachelor's thesis is concerned with the visualization of real-time PCR data in MATLAB. The theoretical part provides an introduction to PCR and real-time PCR, describes the tools used to evaluate data acquired from real-time PCR, and covers the international RDML standard used for storing such data. The practical part describes the developed graphical interface.
Advisors/Committee Members: Sekora, Jiří (advisor).
Subjects/Keywords: Real-time PCR; GUI; end-point analysis; Matlab; RDML
APA (6th Edition):
Bečička, M. (2018). Analýza dat získaných z genotypizačních esejí z real-time PCR. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/81823
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Bečička, Martin. “Analýza dat získaných z genotypizačních esejí z real-time PCR.” 2018. Thesis, Brno University of Technology. Accessed December 16, 2019.
http://hdl.handle.net/11012/81823.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Bečička, Martin. “Analýza dat získaných z genotypizačních esejí z real-time PCR.” 2018. Web. 16 Dec 2019.
Vancouver:
Bečička M. Analýza dat získaných z genotypizačních esejí z real-time PCR. [Internet] [Thesis]. Brno University of Technology; 2018. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/11012/81823.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Bečička M. Analýza dat získaných z genotypizačních esejí z real-time PCR. [Thesis]. Brno University of Technology; 2018. Available from: http://hdl.handle.net/11012/81823
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Saskatchewan
7.
Fransoo, Stephen 1982-.
Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/8298
▼ In 2015, the Agriculture for Growth Act (C-18) came into effect in Canada. This Act modernized plant breeding by including amendments that aligned it with the 1991 International Convention for the Protection of New Plant Varieties (UPOV91) (CFIA, 2017). Regulations within the Act grant plant breeders the right to charge an end-point royalty (EPR) on harvested grain. This thesis assesses how provenance and framing influence pulse producers' seed choice decisions. The study created a prospect theory behavioral experiment to answer this question. It concluded that producers are not overly influenced by provenance and framing and instead make decisions based on the expected utility model, except when questions are manipulated by both an EPR and negative framing. The study also concluded that most producers (56%) are willing to tolerate a level of risk. This provided a way to profile producers by risk tolerance, and found many similarities and a few minor differences between those that are always risk-seeking, always risk-averse, and occasionally risk-seeking.
Advisors/Committee Members: Phillips, Peter W.B.B, Agblor, Kofi, Smyth, Stuart, Feist, Gina.
Subjects/Keywords: Lentils; UPOV 91; End Point Royalty; Decision Making; Risk
APA (6th Edition):
Fransoo, S. (2017). Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/8298
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Fransoo, Stephen. “Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?.” 2017. Thesis, University of Saskatchewan. Accessed December 16, 2019.
http://hdl.handle.net/10388/8298.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Fransoo, Stephen. “Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?.” 2017. Web. 16 Dec 2019.
Vancouver:
Fransoo S. Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?. [Internet] [Thesis]. University of Saskatchewan; 2017. [cited 2019 Dec 16].
Available from: http://hdl.handle.net/10388/8298.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Fransoo S. Pulse Producer Decision Making Under Risky Conditions: Will End-Point Royalties Change Preferences?. [Thesis]. University of Saskatchewan; 2017. Available from: http://hdl.handle.net/10388/8298
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of California – Berkeley
8.
Bagherieh, Omid.
Estimation, Identification and Data-Driven Control Design for Hard Disk Drives.
Degree: Mechanical Engineering, 2017, University of California – Berkeley
URL: http://www.escholarship.org/uc/item/0998n097
▼ The demand for online storage has increased significantly during the last few years. Hard disk drives are the primary storage devices used in data centers for storing this online content. The servo assembly of a dual-stage Hard Disk Drive (HDD) is composed of the Voice Coil Motor (VCM) and the Milli-Actuator (MA), where the VCM is responsible for coarse positioning in the low-frequency region and the MA is responsible for fine positioning in the high-frequency region. Controlling these two actuators is critical for precision positioning of the read/write head, which is mounted at the edge of the servo assembly. In this dissertation, precision positioning of the head during the self-servo writing process, as well as feed-forward and feedback control in the track-following mode, are considered. This dissertation discusses three control design methodologies for hard disk drive servo systems, in order to improve their performance as well as their reliability. The first is a state estimator for non-uniformly sampled systems with irregularities in the measurement sampling time, which estimates the states at a uniform sampling time. The second is an online uncertainty identification algorithm, which parameterizes and identifies the uncertain part of transfer functions in a dual-stage HDD. The third is a frequency-based data-driven control design methodology, which considers mixed H_2/H_infinity control objectives and is able to synthesize track-following servo systems for dual-stage actuators using only frequency response measurement data, without the need to identify models of the actuators. The state estimator design for non-uniformly sampled systems with irregularity in the measurement sampling time is considered first, where an observer is proposed to estimate the states at a uniform sampling time. This observer is designed using a time-varying Kalman filter as well as a gain-scheduling observer.
The Kalman filter has optimal performance, while the gain-scheduling observer requires relatively less computational power. Simulations are conducted involving the self-servo writing process in hard disk drives, where the performance as well as the computational complexity of these two observers are compared under different noise scenarios. Uncertainties in system dynamics can change the closed-loop transfer functions and affect the performance, or even the stability, of the control algorithm. These uncertainties are parameterized as stable terms using coprime factorizations, and are identified in an online fashion. The uncertainty identification, in comparison to complete transfer function identification, requires less computational power as well as a lower order for the identified transfer function. The proposed online uncertainty identification algorithm is used to factorize and identify the uncertain part of transfer functions in a dual-stage Hard Disk Drive (HDD). The dual-stage actuators' gains and resonance modes are affected by temperature variations, which in turn affect all closed loop…
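The estimator idea in the abstract above, prediction on a uniform grid with updates at irregular measurement instants, can be illustrated with a minimal scalar sketch. This is a hypothetical random-walk plant with made-up noise values, not the dissertation's dual-stage HDD model:

```python
def kalman_nonuniform(y_times, y_vals, dt, t_end, q=1e-4, r=1e-2):
    """Scalar time-varying Kalman filter for non-uniformly sampled measurements.

    Predicts on a uniform grid of step dt and applies a measurement update
    whenever an irregularly timed sample (y_times[j], y_vals[j]) falls inside
    the current step, so estimates come out at uniform sampling times.
    Toy random-walk plant: x[k+1] = x[k] + w, y = x + v."""
    x, p = 0.0, 1.0                      # state estimate and its variance
    estimates, j = [], 0
    for k in range(round(t_end / dt)):
        p += q                           # predict: variance grows, x unchanged
        t = (k + 1) * dt
        while j < len(y_times) and y_times[j] <= t:
            gain = p / (p + r)           # time-varying Kalman gain
            x += gain * (y_vals[j] - x)  # measurement update
            p *= 1.0 - gain
            j += 1
        estimates.append(x)              # estimate at the uniform time (k+1)*dt
    return estimates
```

Fed a handful of irregularly spaced samples, the filter still returns one estimate per uniform step, which is the property the dissertation exploits during self-servo writing.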
Subjects/Keywords: Mechanical engineering; Controls; Data-driven control; Estimation; Hard disk drive; Identification
APA (6th Edition):
Bagherieh, O. (2017). Estimation, Identification and Data-Driven Control Design for Hard Disk Drives. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/0998n097

University of Miami
9.
Nunez Sanchez, Rafael Camilo.
Novel Methods for Reasoning with Uncertain Hard and Soft Data using Probabilistic and Belief Theoretic Methods.
Degree: PhD, Electrical and Computer Engineering (Engineering), 2018, University of Miami
URL: https://scholarlyrepository.miami.edu/oa_dissertations/2204
▼ Effectively combining multiple and complementary sources of information is becoming one of the most promising paths for increased accuracy and more detailed analysis in numerous applications. Neuroscience, business analytics, military intelligence, and sociology are among the areas that could significantly benefit from properly processing diverse
data sources. However, traditional methods for combining multiple sources of information are based on slow or impractical methods that rely either on vast amounts of manual processing or on suboptimal representations of
data. Moreover, most of the existing methods are not well suited for dealing with the increasing amount of human-generated
data. We introduce an analytical framework that allows automatic and efficient processing of both
hard (e.g., physics-based sensors) and soft (e.g., human-generated) information, leading to enhanced decision-making in multisource environments. This framework is based on the Dempster-Shafer (DS) Theory of Evidence as the common language for
data representation and inference. To model and track uncertainties in soft
data, our framework introduces Uncertain Logic, a classically consistent first order logic environment. In addition, our framework defines a filtering and tracking environment for incorporating both
hard and soft
data, where the probability posterior can be decomposed into a product of combining functions over subsets of the state and measurement variables. This combining function approach offers a framework for the development and incorporation of more sophisticated uncertainty modeling and tracking/estimation models, and at the same time allows incorporating and enhancing existing Bayesian methods. Future work is aimed at increasing the computational efficiency of the overall
hard and soft
data fusion framework.
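The framework's common language is Dempster-Shafer theory, whose basic fusion step is Dempster's rule of combination. A generic textbook sketch of that rule (not the dissertation's Uncertain Logic machinery):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    m1, m2: dicts mapping frozenset-valued hypotheses to masses summing to 1.
    Intersecting focal elements reinforce each other; mass falling on the
    empty set is conflict and is normalized away."""
    raw, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in raw.items()}
```

For example, a hard sensor giving mass 0.6 to {a} and 0.4 to {a, b}, combined with a soft source giving 0.5/0.3/0.2 to {a}/{b}/{a, b}, yields a normalized belief concentrated on {a}.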
Advisors/Committee Members: Manohar N. Murthi, Kamal Premaratne, Xiaodong Cai, Otavio Bueno, Miroslav Kubat.
Subjects/Keywords: Uncertain Logic; Uncertain Logic Processing; Data Fusion; Hard and Soft Data Fusion
APA (6th Edition):
Nunez Sanchez, R. C. (2018). Novel Methods for Reasoning with Uncertain Hard and Soft Data using Probabilistic and Belief Theoretic Methods. (Doctoral Dissertation). University of Miami. Retrieved from https://scholarlyrepository.miami.edu/oa_dissertations/2204

Brno University of Technology
10.
Batelková, Andrea.
Řízení firemních dat a návrh jejich zálohování [Management of company data and a proposal for its backup].
Degree: 2012, Brno University of Technology
URL: http://hdl.handle.net/11012/7856
▼ This bachelor thesis analyzes the current data backup and data handling process at the company STAVOPROGRES BRNO, spol. s r.o. and proposes how to change it so that it is as efficient as possible. It proposes a central data backup in which all created data are backed up automatically. The thesis also describes the theoretical background and further options for storing backed-up data within the company.
Advisors/Committee Members: Kříž, Jiří (advisor).
Subjects/Keywords: Backup; archiving; data; hard drive; email
APA (6th Edition):
Batelková, A. (2012). Řízení firemních dat a návrh jejich zálohování. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/7856

University of Cambridge
11.
Lu, Ruodan.
Automated Generation of Geometric Digital Twins of Existing Reinforced Concrete Bridges.
Degree: 2019, University of Cambridge
URL: https://www.repository.cam.ac.uk/handle/1810/289430
▼ The cost and effort of modelling existing bridges from point clouds currently outweighs the perceived benefits of the resulting model. The time required for generating a geometric Bridge Information Model, a holistic data model which has recently become known as a "Digital Twin", of an existing bridge from Point Cloud Data is roughly ten times greater than laser scanning it. There is a pressing need to automate this process. This is particularly true for the highway infrastructure sector because Bridge Digital Twin Generation is an efficient means for documenting bridge condition data. Based on a two-year inspection cycle, there is a need for at least 315,000 bridge inspections per annum across the United States and the United Kingdom. This explains why there is a huge market demand for less labour-intensive bridge documentation techniques that can efficiently boost bridge management productivity.
Previous research has achieved the automatic generation of surface primitives combined with rule-based classification to create labelled cuboids and cylinders from point clouds. While existing methods work well on synthetic datasets or simplified cases, they encounter huge challenges when dealing with real-world bridge point clouds, which are often unevenly distributed and suffer from occlusions. In addition, real bridge topology is much more complicated than idealized cases. Real bridge geometries are defined by curved horizontal alignments and varying vertical elevations and cross-sections. These characteristics increase the modelling difficulty, which is why none of the existing methods can handle them reliably.
The objective of this PhD research is to devise, implement, and benchmark a novel framework that can reasonably generate labelled geometric object models of constructed bridges comprising concrete elements in an established data format (i.e. Industry Foundation Classes). This objective is achieved by answering the following research questions: (1) how to effectively detect reinforced concrete bridge components in Point Cloud Data? And (2) how to effectively fit 3D solid models in the format of Industry Foundation Classes to the detected point clusters?
The proposed framework employs bridge engineering knowledge that mimics the intelligence of human modellers to detect and model reinforced concrete bridge objects in point clouds. This framework directly extracts structural bridge components and then models them without generating low-level shape primitives. Experimental results suggest that the proposed framework performs quickly and reliably on complex and incomplete real-world bridge point clouds that contain occlusions and unevenly distributed points. The results of experiments on ten real-world bridge point clouds indicate that the framework achieves an overall micro-average detection F1-score of 98.4%, an average modelling accuracy (mean cloud-to-cloud distance, (C2C)_Auto) of 7.05 cm, and an average modelling time of merely 37.8 seconds. Compared to the laborious and time-consuming manual practice, the proposed framework can…
Subjects/Keywords: Digital Twin; Bridge; Point Cloud Data; IFC
APA (6th Edition):
Lu, R. (2019). Automated Generation of Geometric Digital Twins of Existing Reinforced Concrete Bridges. (Thesis). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/289430

University of New South Wales
12.
Lee, Chung Tong.
Fixed point theorems in mathematical models for data aggregation.
Degree: Computer Science & Engineering, 2011, University of New South Wales
URL: http://handle.unsw.edu.au/1959.4/50302 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:9183/SOURCE02?view=true
▼ Model construction requires selecting and identifying relevant aspects of a situation in the real world. Describing these aspects and the relations between them by a system of equations essentially constructs a mathematical model. Functional definitions are very useful in describing relations. They provide additional imperative information for computation, compared to their implicit counterparts. However, it is rarely the case that all entities have a one-way dependency on others. Rather, entities in a system interact with one another and display interdependencies. When translated to mathematical models with functional definitions, the equation systems may contain circular definitions. This thesis demonstrates how to apply fixed point theorems to mathematical models when the relations between entities involve circular definitions. Fixed point solutions are computed via iteration. As a simplified example, suppose that the relations between two variables x and y can be described by functions f and g such that x = f(y) and y = g(x). Then the set of fixed points of the composite function f ∘ g is the solution for x, i.e., x = f(g(x)). In this thesis, formulations of this type have been applied to different problem domains commonly found in the Internet environment. These include rating aggregation, voting, reputation and trust, and information retrieval. In the simulation for rating aggregation, the quality of an assessor depends on the discrepancy between the ratings he gives and the final ratings. On the other hand, the final rating is defined as a weighted average of the ratings given by different assessors, using assessor qualities as the weights. This model shows robustness against random attacks and collusion. The voting study in this dissertation involved real-life data from the MSN Q&A service. Voter quality is defined to capture the agreement between voters, again a circular definition. Existence of a solution is asserted by Brouwer's Fixed Point Theorem. This new voting system shows advantages over simple majority vote counting, being more robust against random attacks and providing identification hints for ballot stuffing. Using intuitively self-evident axioms on the trust-building process, the method of a weighted quasi-arithmetic average is proved to be adequate to serve as a mathematical model for trust. Further, reputation is defined as an aggregation of trust over a community. The transaction properties and the reputation of the rating agents are used as the weight factors for the aggregation. This is essentially a circular definition. Solution existence is guaranteed when a suitable weighting function is chosen. Topic difficulty and system retrieval performance exhibit a negative reinforcement relationship, which is an excellent example of a circular definition. Using an estimation accuracy interpretation, the mathematical model with a fixed point solution in this thesis gives a more natural result on TREC data than that from the HITS algorithm with eigenvector solutions. Finally, this dissertation…
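The circular rating-aggregation scheme described above can be sketched as a simple fixed-point iteration. The quality function 1/(discrepancy + ε) below is a hypothetical choice for illustration; the dissertation's exact weighting may differ:

```python
def aggregate_ratings(ratings, iters=100):
    """Fixed-point rating aggregation with a circular definition.

    ratings[j][i] is the rating assessor j gives to item i.
    Final ratings are a quality-weighted average of assessor ratings, while
    assessor quality is the inverse of the mean squared discrepancy between
    that assessor's ratings and the final ratings; iterate to a fixed point."""
    n_assessors, n_items = len(ratings), len(ratings[0])
    quality = [1.0] * n_assessors
    final = []
    for _ in range(iters):
        total = sum(quality)
        final = [sum(quality[j] * ratings[j][i] for j in range(n_assessors)) / total
                 for i in range(n_items)]
        quality = [1.0 / (sum((ratings[j][i] - final[i]) ** 2
                              for i in range(n_items)) / n_items + 1e-6)
                   for j in range(n_assessors)]   # epsilon avoids division by zero
    return final, quality
```

With two assessors in agreement and one outlier, the iteration converges to the consensus ratings and assigns the outlier a low quality, illustrating the robustness against random attacks mentioned above.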
Advisors/Committee Members: Ignjatovic, Aleksandar, Computer Science & Engineering, Faculty of Engineering, UNSW, Martin, Eric, Computer Science & Engineering, Faculty of Engineering, UNSW.
Subjects/Keywords: Data Aggregation; Fixed Point Theorem; Mathematical Model
APA (6th Edition):
Lee, C. T. (2011). Fixed point theorems in mathematical models for data aggregation. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/50302 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:9183/SOURCE02?view=true

University of Edinburgh
13.
Tammana, Praveen Aravind Babu.
Software-defined datacenter network debugging.
Degree: PhD, 2018, University of Edinburgh
URL: http://hdl.handle.net/1842/31326
▼ Software-defined Networking (SDN) enables flexible network management, but as networks evolve to a large number of end-points with diverse network policies, higher speed, and higher utilization, the abstraction of networks by SDN makes monitoring and debugging network problems increasingly hard. While some problems impact packet processing in the data plane (e.g., congestion), some cause policy deployment failures (e.g., hardware bugs); both create inconsistency between operator intent and actual network behavior. Existing debugging tools are not sufficient to accurately detect, localize, and understand the root cause of problems observed in large-scale networks; either they lack in-network resources (compute, memory, and/or network bandwidth) or they take a long time to debug network problems. This thesis presents three debugging tools: PathDump, SwitchPointer, and Scout, and a technique for tracing packet trajectories called CherryPick. We call for a different approach to network monitoring and debugging: in contrast to implementing debugging functionality entirely in-network, we should carefully partition the debugging tasks between end-hosts and network elements. Towards this direction, we present CherryPick, PathDump, and SwitchPointer. The core of CherryPick is to cherry-pick the links that are key to representing an end-to-end path of a packet, and to embed the picked link IDs into its header on its way to the destination. PathDump is an end-host based network debugger based on tracing packet trajectories, and it exploits resources at the end-hosts to implement various monitoring and debugging functionalities. PathDump currently runs over a real network comprising only commodity hardware, and yet it can support a surprisingly large class of network debugging problems with minimal in-network functionality.
The key contribution of SwitchPointer is to efficiently provide network visibility to end-host based network debuggers like PathDump by using switch memory as a "directory service": each switch, rather than storing the telemetry data necessary for debugging functionalities, stores pointers to the end hosts where the relevant telemetry data is stored. The key design choice of treating switch memory as a directory service makes it possible to solve performance problems that were hard or infeasible with existing designs. Finally, we present and solve a network policy fault localization problem that arises in operating policy management frameworks for a production network. We develop Scout, a fully automated system that localizes faults in a large-scale policy deployment and further pin-points the physical-level failures that are the most likely cause of the observed faults.
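In the simplest reading, the "switch memory as a directory service" idea reduces to switches storing per-flow pointers to end hosts rather than the telemetry itself. A toy sketch with hypothetical class and method names, not SwitchPointer's actual data plane:

```python
class SwitchDirectory:
    """Toy sketch of switch memory used as a directory service: per flow,
    the switch keeps only pointers to the end hosts that hold the telemetry,
    not the telemetry itself."""

    def __init__(self):
        self.pointers = {}                        # flow_id -> set of host addresses

    def record(self, flow_id, host):
        """Note that telemetry for flow_id now lives at this end host."""
        self.pointers.setdefault(flow_id, set()).add(host)

    def lookup(self, flow_id):
        """Return the end hosts a debugger should query for this flow."""
        return self.pointers.get(flow_id, set())
```

A debugger such as PathDump would then fetch the actual telemetry from the returned hosts, keeping the switch's memory footprint small.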
Subjects/Keywords: automated debugging tools; data center networks; debugging; Software-defined Networking; SDN; PathDump; SwitchPointer; Scout; CherryPick; end-to-end; end-host
APA (6th Edition):
Tammana, P. A. B. (2018). Software-defined datacenter network debugging. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/31326

Brno University of Technology
14.
Kugler, Petr.
Zálohování dat firmy [Company data backup].
Degree: 2015, Brno University of Technology
URL: http://hdl.handle.net/11012/36996
▼ This bachelor thesis introduces the process of data backup, explains the concept of data storage, and then analyzes and proposes improvements to data backup at the company Xella CZ, s.r.o.
Advisors/Committee Members: Kříž, Jiří (advisor).
Subjects/Keywords: Backup; data backup; data storage; data; hard disk; archiving
APA (6th Edition):
Kugler, P. (2015). Zálohování dat firmy. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/36996

Brno University of Technology
15.
Kugler, Petr.
Zálohování dat firmy [Company data backup].
Degree: 2015, Brno University of Technology
URL: http://hdl.handle.net/11012/33767
▼ This bachelor thesis introduces the process of data backup, explains the concept of data storage, and then analyzes and proposes improvements to data backup at the company Xella CZ, s.r.o.
Advisors/Committee Members: Kříž, Jiří (advisor).
Subjects/Keywords: Backup; data backup; data storage; data; hard disk; archiving
APA (6th Edition):
Kugler, P. (2015). Zálohování dat firmy. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/33767
16.
Holmberg, Jonas.
OFFLINE SCHEDULING OF TASK SETS WITH COMPLEX END-TO-END DELAY CONSTRAINTS.
Degree: Design and Engineering, 2017, Mälardalen University
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35694
▼ Software systems in the automotive domain are generally safety-critical and subject to strict timing requirements. Systems of this character are often constructed from periodically executed tasks that have a hard deadline. In addition, these systems may have further deadlines that can be specified on cause-effect chains, or simply task chains. They are defined by existing tasks in the system; hence the chains are not stand-alone additions to the system. Each chain provides an end-to-end timing constraint targeting the propagation of data through the chain of tasks. These constraints specify the additional timing requirements that need to be fulfilled when searching for a valid schedule. In this thesis, an offline non-preemptive scheduling method is presented, designed for single-core systems. The scheduling problem is defined and formulated using Constraint Programming. In addition, to ensure that end-to-end timing requirements are met, job-level dependencies are considered during schedule generation. This approach can guarantee that individual task periods along with end-to-end timing requirements are always met, if a schedule exists. The results show a good increase in schedulability ratio when utilizing job-level dependencies compared to the case where job-level dependencies are not specified. When the system utilization increases, this improvement is even greater. Depending on the system size and complexity the improvement can vary, but in many cases it is more than double. Schedule generation is also performed within a reasonable time frame, which is a benefit during the development process, since it allows fast verification when changes are made to the system. Further, the thesis provides an overview of the entire process, starting from a system model and ending at a fully functional schedule executing on a hardware platform.
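The combination of constraints described above, release times and deadlines per job, single-core non-overlap, and job-level dependencies, can be sketched with a small backtracking search standing in for the thesis's Constraint Programming model. The task format is hypothetical and deadlines are assumed equal to periods:

```python
from functools import reduce
from math import gcd

def schedule(tasks, dependencies):
    """Backtracking sketch of offline non-preemptive single-core scheduling
    with job-level dependencies (a stand-in for a real CP solver).

    tasks: {name: (period, wcet)}, with deadline assumed equal to period.
    dependencies: pairs (job_a, job_b), where a job (name, k) is the k-th
    instance of a task and job_a must finish before job_b starts.
    Returns {job: start_time} over one hyperperiod, or None if infeasible."""
    hyper = reduce(lambda a, b: a * b // gcd(a, b), (p for p, _ in tasks.values()))
    jobs = [(name, k) for name, (p, _) in tasks.items() for k in range(hyper // p)]

    def feasible(assign, job, start):
        name, k = job
        p, c = tasks[name]
        if start < k * p or start + c > (k + 1) * p:       # release / deadline
            return False
        for (oname, _), s in assign.items():               # single core: no overlap
            if start < s + tasks[oname][1] and s < start + c:
                return False
        for a, b in dependencies:                          # job-level dependencies
            if b == job and a in assign and assign[a] + tasks[a[0]][1] > start:
                return False
            if a == job and b in assign and start + c > assign[b]:
                return False
        return True

    def solve(assign, remaining):
        if not remaining:
            return dict(assign)
        job = remaining[0]
        for start in range(hyper):
            if feasible(assign, job, start):
                assign[job] = start
                result = solve(assign, remaining[1:])
                if result is not None:
                    return result
                del assign[job]
        return None

    return solve({}, jobs)
```

A real CP solver scales far better than this exhaustive search, but the constraint set, release, deadline, mutual exclusion, and precedence, is the same one the thesis encodes.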
Subjects/Keywords: Embedded Systems; Offline Scheduling; End-to-End Delay; Constraint Programming; Job-level Dependencies; Automotive; Hard real-time systems; Embedded Systems; Inbäddad systemteknik
APA (6th Edition):
Holmberg, J. (2017). OFFLINE SCHEDULING OF TASK SETS WITH COMPLEX END-TO-END DELAY CONSTRAINTS. (Thesis). Mälardalen University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35694

University of Adelaide
17.
Arnold, Anne Jillian.
A game-theoretic approach to modelling crop royalties.
Degree: 2015, University of Adelaide
URL: http://hdl.handle.net/2440/95230
Plant variety rights help crop breeders appropriate returns from new varieties and incentivise varietal improvement. Royalties are one form of plant variety rights, and this dissertation asks which combination of the available royalty instruments is best from the perspective of consumers, farmers, crop breeders, and the overall economy. We use a game-theoretic approach to model strategic interactions between breeders and farmers. The model allows farmer privilege, whereby farmers save seed one year to plant in the future, and we show that a point-of-sale royalty with either or both of the remaining royalties is optimal, whether or not we allow the possibility of farmers under-paying royalties through under-declaring output or saved seed. We also develop a principal-agent model, in which risk-neutral breeders share the risk with risk-averse farmers. In this model, the optimum royalty depends on various parameters, including the costs of compliance and enforcement.
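The role of farmer privilege in the choice of royalty instrument can be made concrete with a one-farmer arithmetic sketch. All numbers and the two revenue functions below are invented for illustration and are not the dissertation's model:

```python
# Invented toy numbers: a farmer buys certified seed once, then exercises
# farmer privilege (saving seed) for the remaining seasons; output per
# season is fixed.
T, seed_bags, output_t = 5, 100, 300.0   # seasons; bags bought in year 1; t/season

def pos_revenue(rate_per_bag):
    # Point-of-sale royalty: collected only when seed is purchased.
    return rate_per_bag * seed_bags

def epr_revenue(rate_per_tonne, declared=1.0):
    # End-point royalty: collected on declared output every season, so
    # saved seed still yields royalty income; declared < 1 models
    # under-declaration by the farmer.
    return rate_per_tonne * output_t * declared * T
```

With these numbers, a 2 $/t end-point royalty collects more than a 20 $/bag point-of-sale royalty once seed saving begins, while under-declaration erodes the end-point stream — the trade-off the game-theoretic model formalizes.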
Advisors/Committee Members: Bayer, Ralph-Christopher (advisor), Binenbaum, Eran (advisor), Anderson, Kym (advisor), Wong, Jacob (advisor), School of Economics (school).
Subjects/Keywords: game-theory; economic model; end-point royalty; point-of-sale royalty; saved seed; farmer privilege; principal–agent model

Virginia Tech
18.
Monti, Henry Matthew.
An Integrated End-User Data Service for HPC Centers.
Degree: PhD, Computer Science, 2013, Virginia Tech
URL: http://hdl.handle.net/10919/19259
The advent of extreme-scale computing systems, e.g., Petaflop supercomputers, High Performance Computing (HPC) cyber-infrastructure, Enterprise databases, and experimental facilities such as large-scale particle colliders, is pushing the envelope on dataset sizes. Supercomputing centers routinely generate and consume ever-increasing amounts of data while executing high-throughput computing jobs. These are often result-datasets or checkpoint snapshots from long-running simulations, but can also be input data from experimental facilities such as the Large Hadron Collider (LHC) or the Spallation Neutron Source (SNS). These growing datasets are often processed by a geographically dispersed user base across multiple HPC installations. Moreover, end-user workflows are increasingly distributed in nature, with massive input, output, and even intermediate data often being transported to and from several HPC resources or end-users for further processing or visualization. The growing data demands of applications, coupled with the distributed nature of HPC workflows, have the potential to place significant strain on both the storage and network resources at HPC centers. Despite this potential impact, rather than stringently managing HPC center resources, a common practice is to leave application-associated data management to the end-user, as the user is intimately aware of the application's workflow and data needs. This means end-users must frequently interact with the local storage in HPC centers, the scratch space, which is used for job input, output, and intermediate data. Scratch is built using a parallel file system that supports very high aggregate I/O throughput, e.g., Lustre, PVFS, and GPFS. To ensure efficient I/O and faster job turnaround, use of scratch by applications is encouraged. Consequently, job input and output data must be moved in and out of the scratch space by end-users before and after the job runs, respectively. In practice, end-users arbitrarily stage and offload data as and when they deem fit, without any consideration of the center's performance, often leaving data on the scratch long after it is needed. HPC centers resort to "purge" mechanisms that sweep the scratch space to remove files found to be no longer in use, based on their not having been accessed within a preselected time threshold called the purge window, which commonly ranges from a few days to a week. This ad-hoc data management ignores the interactions between different users' data storage and transmission demands, and their impact on center serviceability, leading to suboptimal use of precious center resources. To address the issues of exponentially increasing data sizes and ad-hoc data management, we present a fresh perspective on scratch storage management by fundamentally rethinking the manner in which scratch space is employed. Our approach is twofold. First, we re-design the scratch system as a "cache" and build "retention", "population", and "eviction" policies that are tightly…
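The contrast between a conventional access-time purge and the cache view sketched above can be illustrated as follows. The metadata fields, the pinning flag, and both policies are illustrative assumptions, not the dissertation's actual design:

```python
PURGE_WINDOW = 7 * 24 * 3600   # one week, a typical purge window

def purge(files, now):
    """Conventional purge: drop anything not accessed within the window."""
    return {f: m for f, m in files.items()
            if now - m["atime"] <= PURGE_WINDOW}

def evict_for(files, need, capacity):
    """Cache-style sketch: free `need` bytes by evicting least-recently-used
    files first, while retaining files still registered as needed
    ("pinned"), a crude stand-in for a retention policy."""
    used = sum(m["size"] for m in files.values())
    victims = sorted((f for f, m in files.items() if not m["pinned"]),
                     key=lambda f: files[f]["atime"])
    evicted = []
    for f in victims:
        if used + need <= capacity:
            break          # enough space freed for the incoming job data
        used -= files[f]["size"]
        evicted.append(f)
    return evicted
```

Unlike the purge, eviction is demand-driven (triggered by incoming job data, not a timer) and can honor retention hints, which is the essence of treating scratch as a cache.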
Advisors/Committee Members: Butt, Ali Raza Ashraf (committeechair), Ribbens, Calvin J. (committee member), Vazhkudai, Sudharshan Sankaran (committee member), Feng, Wu-Chun (committee member), Lin, Heshan (committee member).
Subjects/Keywords: End-User Data Services; Scratch as a Cache; Data Offloading; Data Staging

University of Edinburgh
19.
Stewart, Hannah J.
Objective measurement of imitation problems in autism.
Degree: 2011, University of Edinburgh
URL: http://hdl.handle.net/1842/6066
Imitation is a complex behaviour that allows faster learning of skills, including pivotal social-cognitive processes such as language and gesture. Difficulties in imitating others have been broadly found within Autistic Spectrum Disorder (ASD) populations. This paper discusses two possible theories explaining these deficits: self-other mapping theory, whereby imitation deficits in ASD are proposed to restrict the ability to map relationships between social representations of others and oneself; and self-other comparison theory, whereby the individual must distinguish similarities and differences between themself and the other, which are then related to emotional and contextual differences learnt through experience in order to provide emotional context.
Whilst imitation difficulties have been widely reported, such difficulties have been recorded subjectively. This paper, however, approaches this well-known phenomenon objectively through the use of a clinical-kinematics assessment tool (C-Kat). Furthermore, it discusses different ways to precisely describe the imitative act to be copied and the performance of the imitator.
This paper aimed to objectively investigate whether, compared to typically developing peers, an imitative deficit was present in ASD adolescents (ASD n = 16; TD n = 24), and secondly, whether such a deficit existed only for bodily imitation. Results showed a clear group difference and suggested a developmental delay in imitation ability within ASD rather than a deficit. Furthermore, results suggested a specific ASD difficulty in bodily imitation. However, following comparison of imitation stimuli and measures, the possibility is discussed that these results may be due to focusing on elements other than the critical movement information within the action to be imitated.
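Objective kinematic comparison of a copied movement against the demonstrated one can be illustrated with two simple measures, end-point error and path length. The trajectories and the measures below are invented for illustration and are not the actual C-Kat metrics:

```python
from math import dist

# Invented 2-D demonstrator and imitator trajectories ((x, y) samples).
demo = [(0.0, 0.0), (1.0, 1.0), (2.0, 1.5), (3.0, 2.0)]
copy = [(0.0, 0.0), (1.2, 0.8), (2.1, 1.4), (3.2, 1.7)]

def end_point_error(a, b):
    # Distance between where the two movements finish.
    return dist(a[-1], b[-1])

def path_length(traj):
    # Total distance travelled along the trajectory.
    return sum(dist(traj[i], traj[i + 1]) for i in range(len(traj) - 1))
```

Measures of this kind replace a rater's subjective judgement with numbers that can be compared across groups.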
Advisors/Committee Members: Williams, Justin J H G, McIntosh, Rob.
Subjects/Keywords: autism; imitation; kinematics; adolescents; motor control; movement end-point re-enactment; action imitation; bodily imitation

University of Saskatchewan
20.
Bolek, Katarzyna.
Public, Producer, Private Partnerships and EPR systems in Australian Wheat Breeding.
Degree: 2015, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2015-03-1991
Australia has a crop research system with higher research intensity than exists internationally. Motivated to improve R&D policy in Canada, this dissertation focuses on the Australian End Point Royalty (EPR) system for wheat and addresses four principal questions: (1) How was the Australian system created and how does it work? (2) How do public, producer, and private ownership of breeding programs affect the pricing of varieties? (3) How do EPR rates affect wheat variety adoption? (4) Finally, how would uniform EPR rates, similar to those used in France, affect variety selection, total production, and revenue if used in the Australian market? To address the first question I draw on the existing literature and interviews with prominent personnel in the Australian wheat breeding system, including management of InterGrain, AGT, DAFWA, GRDC and others, conducted during a field study in Australia in 2011. To address the second question I employ a horizontal location model to analyze three game-theoretic scenarios of a two-firm oligopoly market with private, public, and producer-owned breeding companies. The results show that public or producer ownership of one of the wheat breeding programs reduces the price level relative to private-only ownership. I derive a novel result showing that when competing with private firms who must price above marginal cost, the public firm should also price above marginal cost in order to maximize total industry surplus. To address the third question I develop and estimate an econometric wheat variety adoption model for Western Australia, finding that EPR rates have a negative, inelastic, and statistically significant impact on the adoption of varieties.
Finally, to address the last question, I use the econometric model to simulate the adoption of Australian wheat varieties under a counterfactual of revenue-neutral uniform EPR rates. Uniform EPR rates speed up both the adoption and dis-adoption of varieties, thereby increasing weighted-average yield and total production. The value of the production increase exceeds breeders' revenue under varying EPR rates, suggesting a uniform EPR system may be an attractive alternative to varying EPR rates.
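A "negative inelastic" effect of EPR rates on adoption can be made concrete with a constant-elasticity sketch. The functional form and coefficients are invented, not the estimated model from the dissertation:

```python
from math import log

A, beta = 0.30, -0.4   # invented; beta < 0 and |beta| < 1 means negative inelastic

def adoption_share(epr_rate):
    # Constant-elasticity adoption: share = A * rate**beta.
    return A * epr_rate ** beta

# The log-log slope between two rates recovers the elasticity: a 1%
# increase in the EPR rate lowers adoption by about 0.4%.
s1, s2 = adoption_share(2.0), adoption_share(2.2)
elasticity = (log(s2) - log(s1)) / (log(2.2) - log(2.0))
```

Because |beta| < 1, raising the rate lowers adoption less than proportionally, so royalty revenue per variety can still rise with the rate.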
Advisors/Committee Members: Gray, Richard S., Fulton, Murray, Gilchrist, Donald, Micheels, Eric.
Subjects/Keywords: Wheat Breeding; End Point Royalties; EPR; Partnerships; Australian system; Funding R&D

University of Manchester
21.
Loftus, John Paul Matthew.
On The Development of Control Systems Technology for
Fermentation Processes.
Degree: 2016, University of Manchester
URL: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:306385
Fermentation processes play an integral role in the manufacture of pharmaceutical products. The Quality by Design initiative, combined with Process Analytical Technologies, aims to facilitate the consistent production of high quality products in the most efficient and economical way. The ability to estimate and control product quality from these processes is essential in achieving this aim. Large historical datasets are commonplace in the pharmaceutical industry, and multivariate methods based on PCA and PLS have been successfully used in a wide range of applications to extract useful information from such datasets. This thesis has focused on the development and application of novel multivariate methods to the estimation and control of product quality from a number of processes. The document is divided into four main categories. Firstly, the related literature and inherent mathematical techniques are summarised. Following this, the three main technical areas of work are presented. The first of these relates to the development of a novel method for estimating the quality of products from a proprietary process using PCA. The ability to estimate product quality is useful for identifying production steps that are potentially problematic and also increases process efficiency by ensuring that any defective products are detected before they undergo any further processing. The proposed method is simple and robust and has been applied to two separate case studies, the results of which demonstrate the efficacy of the technique. The second area of work concentrates on the development of a novel method of identifying the operational phases of batch fermentation processes, based on PCA and associated statistics. Knowledge of the operational phases of a process can be beneficial from a monitoring and control perspective and allows a process to be divided into phases that can be approximated by a linear model. The devised methodology is applied to two separate fermentation processes and results show the capability of the proposed method. The third area of work focuses on undertaking a performance evaluation of two multivariate algorithms, PLS and EPLS, in controlling the end-point product yield of fermentation processes. Control of end-point product quality is of crucial importance in many manufacturing industries, such as the pharmaceutical industry. Developing a controller based on historical and identification process data is attractive due to the simplicity of modelling and the increasing availability of process data. The methodology is applied to two case studies and performance evaluated. From both a prediction and control perspective, it is seen that EPLS outperforms PLS, which is important if modelling data is limited.
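The phase-identification idea can be sketched on a toy two-variable batch record: fit a covariance model on an assumed phase-1 window, then flag the first sample whose Hotelling T² exceeds a limit. The closed-form 2x2 model keeps the sketch dependency-free and retains both principal components; the data and the threshold are invented:

```python
def fit_pca2(X):
    """Mean and covariance of 2-variable data (closed form, no libraries)."""
    n = len(X)
    mx = sum(x for x, _ in X) / n
    my = sum(y for _, y in X) / n
    sxx = sum((x - mx) ** 2 for x, _ in X) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in X) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in X) / (n - 1)
    return mx, my, sxx, sxy, syy

def hotelling_t2(p, model):
    """Hotelling T^2 (squared Mahalanobis distance) of one sample."""
    mx, my, sxx, sxy, syy = model
    dx, dy = p[0] - mx, p[1] - my
    det = sxx * syy - sxy ** 2
    return (dx * dx * syy - 2 * dx * dy * sxy + dy * dy * sxx) / det

# Invented batch trajectory: a phase-1 window, then a shift in behaviour.
batch = [(0.0, 0.1), (0.2, 0.2), (0.4, 0.5), (0.6, 0.55), (0.8, 0.9),
         (1.0, 1.0), (1.2, 1.15), (1.4, 1.5), (3.0, 0.5), (3.2, 0.4)]
model = fit_pca2(batch[:6])   # fit on the assumed phase-1 window
LIMIT = 20.0                  # ad-hoc threshold for this sketch
boundary = next(i for i, p in enumerate(batch[6:], start=6)
                if hotelling_t2(p, model) > LIMIT)
```

Samples consistent with the phase-1 correlation structure score low even when both variables keep growing; the boundary is flagged only when the relationship between the variables changes, which is what makes the statistic suitable for phase detection.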
Advisors/Committee Members: CARRASCO GOMEZ, JOAQUIN J, Lennox, Barry, Carrasco Gomez, Joaquin.
Subjects/Keywords: Process Control; Multivariate Statistics; Product Quality; End-Point Control; Partial-Least Squares; Principal Component Analysis

University of Arizona
22.
Dhane, Kedar.
IN-SITU ELECTRO-CHEMICAL RESIDUE SENSOR AND PROCESS MODEL APPLICATION IN RINSING AND DRYING OF NANO-STRUCTURES.
Degree: 2010, University of Arizona
URL: http://hdl.handle.net/10150/195656
Typical surface preparation consists of exposure to a cleaning chemical to remove contaminants, followed by rinsing with ultra-pure water and then drying. Large quantities of water, various chemicals, and energy are used during the rinsing and drying processes. Currently there is no in-situ metrology available to determine the cleanliness of micro- and nano-structures as these processes are taking place. This is a major technology gap that leads to overuse of resources and adversely affects throughput. Surface preparation of patterned wafers by batch processing becomes a major challenge as semiconductor fabrication moves deeper into submicron technology nodes. Many fabs have already employed single-wafer tools, whose main roadblock is their lower throughput, an obstacle eased by the introduction of multi-chamber tools. To reduce cycle time and resource utilization during rinse and dry processes without sacrificing surface cleanliness and throughput, in-situ metrology is developed and used to compare typical single-wafer spinning tools with immersion tools for rinsing of patterned wafers. This novel metrology technology includes both hardware for in-situ measurement and software for process data analysis. Successful incorporation of this metrology will eliminate dependency on external analysis techniques such as Inductively Coupled Plasma Mass Spectrometry (ICPMS), the Scanning Electron Microscope (SEM), and the Tunneling Electron Microscope (TEM), and will lead to fast response times. In this study the electro-chemical residue sensor (ECRS) was incorporated in a lab-scale single-wafer spinning and single-wafer immersion tool. The ECRS was used to monitor the dynamics of rinsing after various cleans such as the ammonium peroxide mixture (APM), hydrochloric peroxide mixture (HPM), and sulfuric peroxide mixture (SPM). It was observed that different cleaning chemicals impact the subsequent rinse not only through adsorption and desorption but also through surface charge. The results are analyzed using a comprehensive process model that takes into account various transport mechanisms such as adsorption, desorption, diffusion, convection, and surface charge. This novel metrology can be used at very low concentrations with very high accuracy, and is used to study the effect of key process parameters such as flow rate, spin rate, temperature, and chemical concentration.
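The kind of transport model described — contaminant desorbing from the surface while fresh water dilutes the bath — can be sketched as a two-state Euler integration. This is a deliberately minimal model with invented parameters; it omits adsorption, diffusion, and surface-charge effects that the thesis's comprehensive model includes:

```python
# Invented parameters: desorption rate (1/s), bath volume (L), rinse flow (L/s).
kd, V, Q = 0.05, 1.0, 0.2
q, C = 1.0, 0.5          # surface contaminant load and bath concentration
dt, THRESH = 0.01, 1e-3  # Euler step (s) and cleanliness threshold

t = 0.0
while C >= THRESH:
    released = kd * q * dt                 # mass desorbed this step
    q -= released
    C += released / V - (Q / V) * C * dt   # release minus convective dilution
    t += dt
# The rinse "end point" t is reached when the bath concentration falls
# below the threshold; slow desorption, not dilution, sets the time scale.
```

Such a model shows why an in-situ end-point signal matters: with these numbers the bath is clean only after roughly two minutes, far longer than the dilution time constant alone would suggest, because desorption keeps feeding contaminant into the bath.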
Advisors/Committee Members: Shadman, Farhang (advisor), Blowers, Paul (committeemember), Sierra, Reyes (committeemember).
Subjects/Keywords: drying; nano-structures; rinsing; sensor; single wafer rinsing; surface preparation end point
23.
Oliveira, Bruno Miguel Almeida de.
Comparação dos métodos ENF e 4-ENF para determinação da tenacidade ao corte de juntas adesivas.
Degree: 2016, Instituto Politécnico do Porto
URL: http://www.rcaap.pt/detail.jsp?id=oai:recipp.ipp.pt:10400.22/8252
Over recent years, adhesive bonding has seen progressive growth in structural applications at the expense of conventional mechanical joints. This paradigm shift is due to the advantages that adhesively-bonded joints have over other joining methods. Fracture mechanics and Cohesive Zone Models (CZM) are common criteria for predicting the strength of adhesive joints and use energy release rates as fundamental parameters. Because the 4-Point End Notched Flexure (4-ENF) test, applied to adhesive joints, is still understudied, a study of its viability for determining the critical shear strain energy release rate (GIIc) is highly relevant.
The main objective of this thesis is to compare the End-Notched Flexure (ENF) and 4-ENF methods for determining GIIc in adhesive joints. Three adhesives were used: Araldite® AV138, Araldite® 2015, and SikaForce® 7752. The experimental work involved designing and manufacturing a fixture for the 4-ENF test, followed by fabricating and preparing the specimens. Because the 4-ENF test is still little used for adhesive joints and is not standardized, an important part of the work consisted of reviewing research works and scientific articles. The results were analysed by direct comparison of the GIIc values with those obtained in the ENF test, performed per adhesive series, through comparison of the P-δ curves and R-curves.
The results showed that the 4-ENF test is not the most versatile for determining GIIc in adhesive joints, and that only one data-reduction method, based on measuring the crack length (a), is viable. It was shown that the ENF test, being standardized, having a simpler setup, and offering a wider choice of methods for determining GIIc, is the recommended one. It is concluded that the 4-ENF test, although an alternative to the ENF test, has more limited applicability.
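For reference, the simplest ENF data-reduction scheme, direct beam theory, gives GII from the measured load and crack length. The expression below is the standard textbook formula for the ENF specimen; the specimen values are invented:

```python
def g2_enf(P, a, B, h, E):
    """Direct beam theory for the ENF specimen:
    G_II = 9 P^2 a^2 / (16 B^2 E h^3),
    with load P (N), crack length a (mm), width B (mm), adherend
    thickness h (mm), and flexural modulus E (MPa); result in N/mm
    (numerically equal to kJ/m^2)."""
    return 9 * P ** 2 * a ** 2 / (16 * B ** 2 * E * h ** 3)

# Invented specimen: P = 500 N, a = 40 mm, B = 25 mm, h = 1.5 mm,
# E = 70 GPa (aluminium adherends).
GII = g2_enf(500.0, 40.0, 25.0, 1.5, 70_000.0)
```

Schemes of this family require measuring the crack length a during the test, which is precisely the practical difficulty that crack-length-free ENF reduction methods avoid and one reason the thesis finds the standardized ENF test more versatile.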
Advisors/Committee Members: Campilho, Raul Duarte Salgueiral Gomes.
Subjects/Keywords: End-Notched Flexure (ENF); Four-Point End-Notched Flexure (4-ENF); Ligações adesivas; Adesivos estruturais; End-Notched Flexure (ENF); Four-Point End-Notched Flexure (4-ENF); Adhesively-bonded joints; Structural adhesives; Materiais e Tecnologias de Fabrico
APA (6th Edition):
Oliveira, B. M. A. d. (2016). Comparação dos métodos ENF e 4-ENF para determinação da tenacidade ao corte de juntas adesivas. (Thesis). Instituto Politécnico do Porto. Retrieved from http://www.rcaap.pt/detail.jsp?id=oai:recipp.ipp.pt:10400.22/8252

Brno University of Technology
24.
Cabalka, Ondřej.
Zálohování dat a datová úložiště.
Degree: 2011, Brno University of Technology
URL: http://hdl.handle.net/11012/5235
► This bachelor's thesis highlights the frequently overlooked importance of corporate data and their protection. It introduces the backup process as a basic element of the security and efficiency of an information system.…
(more)
▼ This bachelor's thesis highlights the frequently overlooked importance of corporate data and their protection. It introduces the backup process as a basic element of the security and efficiency of an information system. The thesis covers both basic and advanced backup methods suitable for small and medium-sized enterprises, depending on their needs and budget.
Advisors/Committee Members: Kříž, Jiří (advisor).
Subjects/Keywords: data; data loss; backup; restore; archiving; hard disk
APA (6th Edition):
Cabalka, O. (2011). Zálohování dat a datová úložiště. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/5235

UCLA
25.
Uppala, Medha.
Separable Temporal Modeling of Point Processes on Linear Networks & Balancing Data Sufficiency and Privacy.
Degree: Statistics, 2018, UCLA
URL: http://www.escholarship.org/uc/item/6g33n64x
► The first part of the dissertation focuses on spatial and temporal modeling of point processes on linear networks. Point processes on/near linear networks can simply…
(more)
▼ The first part of the dissertation focuses on spatial and temporal modeling of point processes on linear networks. Point processes on/near linear networks can simply be defined as point events occurring on or near line segment network structures embedded in a certain space. A separable modeling framework is presented that fits a formation and a dissolution model of point processes on linear networks over time. Two major applications of the separable temporal model are spider web building activity in brick mortar lines and wildfire ignition origins near road networks. The second part of the dissertation focuses on analyses of large energy databases, specifically the Energy Atlas database. The main motivation of this part is to explore and understand the issues of balancing necessary data resolution while maintaining consumer privacy. The issue of data resolution and its importance are explored by first tackling a specific policy objective. This is achieved by applying a longitudinal quantile regression model to parcel-level monthly energy consumption in the Westwood neighborhood; the model results aid in fulfilling efficiency goals outlined in California Senate Bill 350. Then the issue of record privacy is explored through a review of current privacy methods, implementation, and data ownership, concluding with avenues for future research.
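The quantile-regression component can be illustrated with the pinball (quantile) loss that such models minimise. This toy uses made-up consumption values and a constant predictor, not the thesis's longitudinal model: minimising the pinball loss over a constant recovers the empirical tau-quantile (the median for tau = 0.5).

```python
def pinball_loss(tau, y_true, y_pred):
    # Quantile ("pinball") loss: over-predictions and under-predictions
    # are penalised asymmetrically according to tau.
    errs = [yt - yp for yt, yp in zip(y_true, y_pred)]
    return sum(max(tau * e, (tau - 1) * e) for e in errs) / len(errs)

# Hypothetical parcel-level monthly consumption values (illustrative only).
consumption = [3.0, 5.0, 4.0, 12.0, 6.0, 5.5, 4.5]
tau = 0.5

# Grid-search the best constant predictor; the minimiser is the
# empirical tau-quantile of the data (here, the median 5.0).
candidates = [c / 10 for c in range(0, 400)]
best = min(candidates,
           key=lambda q: pinball_loss(tau, consumption, [q] * len(consumption)))
print(best)  # → 5.0
```

A full quantile regression replaces the constant with a function of covariates, but the loss being minimised is exactly this one.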
Subjects/Keywords: Statistics; Data Privacy; Data resolution; Linear Networks; Point Processes; Wildfires
APA (6th Edition):
Uppala, M. (2018). Separable Temporal Modeling of Point Processes on Linear Networks & Balancing Data Sufficiency and Privacy. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/6g33n64x

Delft University of Technology
26.
Panagiotou, V.
Blind segmentation of time-series: A two-level approach:.
Degree: 2015, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:832c8b73-fbc2-412e-9b8d-9063a48e6d57
► Change-point detection is an indispensable tool for a wide variety of applications which has been extensively studied in the literature over the years. However, the development…
(more)
▼ Change-point detection is an indispensable tool for a wide variety of applications which has been extensively studied in the literature over the years. However, the development of wireless devices and miniature sensors that allow continuous recording of data poses new challenges that cannot be adequately addressed by the vast majority of existing methods.
In this work, we aim to balance statistical accuracy with computational efficiency by developing a hierarchical two-level algorithm that can significantly reduce the computational burden at the expense of a negligible loss of detection accuracy. Our choice is motivated by the idea that if a simple test is used to quickly select some potential change-points in the first level, then the second level, which consists of a computationally more expensive algorithm, need only be applied to a subset of the data, leading to a significant run-time improvement. In addition, in order to alleviate the difficulties arising in high-dimensional data, we use a data selection technique which gives more importance to data that are more useful for detecting changes than to others. Using these ideas, we compute a detection measure given as the weighted sum of individual dissimilarity measures, and we present techniques that can speed up some standard change-point detection methods.
Experimental results on both artificial and real-world data demonstrate the effectiveness of the developed approaches and provide useful insight into the suitability of some state-of-the-art methods for detecting changes in many different scenarios.
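The two-level scheme described above can be sketched as follows. The specific statistics are illustrative choices of ours (the thesis's actual first- and second-level tests are not specified here): a cheap windowed mean-difference scan nominates candidate change-points, and a costlier Welch-style t-statistic is evaluated only on those candidates.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def cheap_score(series, i, w):
    # First level: absolute difference of window means around index i.
    left, right = series[i - w:i], series[i:i + w]
    return abs(mean(left) - mean(right))

def expensive_score(series, i, w):
    # Second level: Welch-style t-statistic (costlier: needs variances).
    left, right = series[i - w:i], series[i:i + w]
    ml, mr = mean(left), mean(right)
    vl = sum((x - ml) ** 2 for x in left) / (len(left) - 1)
    vr = sum((x - mr) ** 2 for x in right) / (len(right) - 1)
    return abs(ml - mr) / math.sqrt(vl / len(left) + vr / len(right) + 1e-12)

def detect(series, w=20, coarse_thresh=0.5, fine_thresh=4.0):
    # Level 1: scan every index with the cheap test.
    candidates = [i for i in range(w, len(series) - w)
                  if cheap_score(series, i, w) > coarse_thresh]
    # Level 2: run the expensive test only on the surviving candidates.
    return [i for i in candidates if expensive_score(series, i, w) > fine_thresh]

# Synthetic series with a mean shift at index 100.
data = [0.0] * 100 + [2.0] * 100
print(detect(data))  # reported indices cluster around 100
```

The run-time saving comes from the second-level statistic being computed only on the (typically small) candidate set rather than on every index.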
Advisors/Committee Members: Heusdens, R., Härmä, A.
Subjects/Keywords: change-point detection; segmentation; time-series data; data selection techniques; speedup
APA (6th Edition):
Panagiotou, V. (2015). Blind segmentation of time-series: A two-level approach:. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:832c8b73-fbc2-412e-9b8d-9063a48e6d57

Edith Cowan University
27.
James, Peter.
Secure portable execution and storage environments: A capability to improve security for remote working.
Degree: 2015, Edith Cowan University
URL: http://ro.ecu.edu.au/theses/1707
► Remote working is a practice that provides economic benefits to both the employing organisation and the individual. However, evidence suggests that organisations implementing remote working…
(more)
▼ Remote working is a practice that provides economic benefits to both the employing organisation and the individual. However, evidence suggests that organisations implementing remote working have limited appreciation of the security risks, particularly those impacting upon the confidentiality and integrity of information and also on the integrity and availability of the remote worker’s computing environment. Other research suggests that an organisation that does appreciate these risks may veto remote working, resulting in a loss of economic benefits. With the implementation of high speed broadband, remote working is forecast to grow and therefore it is appropriate that improved approaches to managing security risks are researched. This research explores the use of secure portable execution and storage environments (secure PESEs) to improve information security for the remote work categories of telework, and mobile and deployed working.
This thesis with publication makes an original contribution to improving remote work information security through the development of a body of knowledge (consisting of design models and design instantiations) and the assertion of a nascent design theory. The research was conducted using design science research (DSR), a paradigm where the research philosophies are grounded in design and construction.
Following an assessment of both the remote work information security issues and threats, and preparation of a set of functional requirements, a secure PESE concept was defined. The concept is represented by a set of attributes that encompass the security properties of preserving the confidentiality, integrity and availability of the computing environment and data. A computing environment that conforms to the concept is considered to be a secure PESE, the implementation of which consists of a highly portable device utilising secure storage and an up-loadable (on to a PC) secure execution environment. The secure storage and execution environment combine to address the information security risks in the remote work location.
A research gap was identified as no existing ‘secure PESE like’ device fully conformed to the concept, enabling a research problem and objectives to be defined. Novel secure storage and execution environments were developed and used to construct a secure PESE suitable for commercial remote work and a high assurance secure PESE suitable for security critical remote work. The commercial secure PESE was trialled with an existing telework team looking to improve security and the high assurance secure PESE was trialled within an organisation that had previously vetoed remote working due to the sensitivity of the data it processed.
An evaluation of the research findings found that the objectives had been satisfied. Using DSR evaluation frameworks it was determined that the body of knowledge had improved an area of study with sufficient evidence generated to assert a nascent design theory for secure PESEs.
The thesis highlights the limitations of the research while opportunities…
Subjects/Keywords: Information Security; Cyber Security; Secure Data at Rest; Secure Portable Storage; Secure Portable Execution Environment; Secure Remote Working; Secure Teleworking; Secure Mobile Working; Secure Deployed Working; Hardened Browser; Hardened Operating System; Anti Digital Forensics; Design Science Research; End Point Security; Human Resources Management; Information Security; Technology and Innovation
APA (6th Edition):
James, P. (2015). Secure portable execution and storage environments: A capability to improve security for remote working. (Thesis). Edith Cowan University. Retrieved from http://ro.ecu.edu.au/theses/1707

Linköping University
28.
Johansson, Samuel.
Machine learning algorithms in a distributed context.
Degree: Computer and Information Science, 2018, Linköping University
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920
► Interest in distributed approaches to machine learning has increased significantly in recent years due to continuously increasing data sizes for training machine learning models.…
(more)
▼ Interest in distributed approaches to machine learning has increased significantly in recent years due to continuously increasing data sizes for training machine learning models. In this thesis we describe three popular machine learning algorithms: decision trees, Naive Bayes and support vector machines (SVM), and present existing ways of distributing them. We also perform experiments with decision trees distributed with bagging, boosting and hard data partitioning, and evaluate them in terms of performance measures such as accuracy, F1 score and execution time. Our experiments show that the execution times of bagging and boosting increase linearly with the number of workers, and that boosting performs significantly better than bagging and hard data partitioning in terms of F1 score. The hard data partitioning algorithm works well for large datasets, where the execution time decreases as the number of workers increases without any significant loss in accuracy or F1 score, while the algorithm performs poorly on small datasets, with an increase in execution time and a loss in accuracy and F1 score as the number of workers increases.
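The hard-data-partitioning scheme compared above can be sketched as follows. This is a toy, single-process simulation (depth-1 stumps stand in for decision trees, and a loop stands in for parallel workers; all names are ours, not from the thesis): the training data is split into disjoint shards, one model is trained per shard, and predictions are combined by majority vote.

```python
from collections import Counter

def train_stump(rows):
    # A depth-1 "decision tree": pick the (feature, threshold) split that
    # best separates the labels in this worker's shard.
    best = None
    for f in range(len(rows[0][0])):
        for x, _ in rows:
            t = x[f]
            left = [y for xi, y in rows if xi[f] <= t]
            right = [y for xi, y in rows if xi[f] > t]
            if not left or not right:
                continue
            # Misclassifications if each side predicts its majority label.
            err = (len(left) - Counter(left).most_common(1)[0][1]) + \
                  (len(right) - Counter(right).most_common(1)[0][1])
            if best is None or err < best[0]:
                best = (err, f, t,
                        Counter(left).most_common(1)[0][0],
                        Counter(right).most_common(1)[0][0])
    _, f, t, lbl_left, lbl_right = best
    return lambda x: lbl_left if x[f] <= t else lbl_right

def hard_partition_ensemble(dataset, n_workers):
    # Hard data partitioning: each worker trains on a disjoint shard.
    shards = [dataset[i::n_workers] for i in range(n_workers)]
    models = [train_stump(s) for s in shards]
    # Prediction: majority vote over the per-worker models.
    return lambda x: Counter(m(x) for m in models).most_common(1)[0][0]

# Toy data: label is 1 when the first feature exceeds 0.5.
data = [([i / 10.0, 0.0], int(i / 10.0 > 0.5)) for i in range(10)]
predict = hard_partition_ensemble(data, n_workers=3)
print(predict([0.9, 0.0]), predict([0.1, 0.0]))  # → 1 0
```

Because the shards are disjoint, each worker sees only 1/n of the data, which is what drives the run-time improvement on large datasets and the accuracy loss on small ones.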
Subjects/Keywords: Machine learning; ensemble algorithms; hard data partitioning; decision trees; Computer Sciences; Datavetenskap (datalogi)
APA (6th Edition):
Johansson, S. (2018). Machine learning algorithms in a distributed context. (Thesis). Linköping University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920

University of Miami
29.
Wickramarathne, Thanuka L.
An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach.
Degree: PhD, Electrical and Computer Engineering (Engineering), 2012, University of Miami
URL: https://scholarlyrepository.miami.edu/oa_dissertations/851
► The recent experiences of asymmetric urban military operations have highlighted the pressing need for incorporation of soft data, such as informant statements, into the…
(more)
▼ The recent experiences of asymmetric urban military operations have highlighted the pressing need for incorporation of soft data, such as informant statements, into the fusion process. Soft data are fundamentally different from hard data (generated by physics-based sensors), in the sense that the information they provide tends to be qualitative and subject to interpretation. These characteristics pose a major obstacle to using existing multi-sensor data fusion frameworks, which are quite well established for hard data. Given the critical and sensitive nature of intended applications, soft/hard data fusion requires a framework that allows for convenient representation of the various data uncertainties common in soft/hard data, and provides fusion techniques that are robust, mathematically justifiable, and yet effective. This would allow an analyst to make decisions with a better understanding of the associated uncertainties as well as the fusion mechanism itself. We present here a detailed account of an analytical solution to the task of soft/hard data fusion. The developed analytical framework consists of several main components: (i) a Dempster-Shafer (DS) belief theory based fusion strategy; (ii) a complete characterization of the Fagin-Halpern DS theoretic (DST) conditional notion, which forms the basis of the data fusion framework; (iii) an evidence updating strategy for the purpose of consensus generation; (iv) a credibility estimation technique for validation of evidence; and (v) techniques for reducing the computational burden associated with the proposed fusion framework. The proposed fusion strategy possesses several intuitively appealing features, and satisfies certain algebraic and fusion properties making it particularly useful in a soft/hard fusion environment. This strategy is based on DS belief theory, which allows for convenient representation of uncertainties that are typical of soft/hard domains. The Fagin-Halpern (FH) notion is perhaps the most appropriate DST conditional notion for soft/hard data fusion scenarios. It also forms the basis for our fusion framework. We provide a complete characterization of the FH conditional notion. This constitutes a strong result that sets the foundation for understanding the FH conditional notions and also establishes the theoretical grounds for development of algorithms for efficient computation of FH conditionals. We also address the converse problem of determining the evidence that may have generated a given change of belief. This converse result can be of significant practical value in certain applications. A consensus control strategy developed based on our fusion technique allows consensus analysis to be carried out in a multitude of applications that call for extended flexibility in uncertainty modeling. We provide a complete theoretical development of the proposed consensus strategy with rigorous proofs. We make use of these consensus notions to establish a data validation technique to assess credibility of evidence in the absence of ground truth. Credibility…
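For orientation, the classical Dempster's rule of combination, a standard operation in DS belief theory, can be sketched as follows. Note this is background only: the thesis develops its own Fagin-Halpern conditional-based fusion strategy, which is not reproduced here. Each source assigns "mass" to subsets of the frame of discernment; combination multiplies masses, discards conflicting (empty-intersection) products, and renormalises.

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule: combine two mass functions whose focal elements
    # are frozensets over a common frame of discernment.
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # Renormalise so the surviving masses sum to one.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sources over the frame {rain, dry}: a hard sensor and a soft
# informant statement (mass on {rain, dry} expresses ignorance).
sensor = {frozenset({"rain"}): 0.7, frozenset({"rain", "dry"}): 0.3}
informant = {frozenset({"rain"}): 0.4, frozenset({"dry"}): 0.3,
             frozenset({"rain", "dry"}): 0.3}
fused = dempster_combine(sensor, informant)
print(fused[frozenset({"rain"})])
```

The ability to put mass on non-singleton sets is what makes DS theory convenient for the qualitative, ambiguous evidence typical of soft data.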
Advisors/Committee Members: Kamal Premaratne, Manohar N. Murthi, Miroslav Kubat, James W. Modestino, Marco Pravia.
Subjects/Keywords: Data fusion; soft/hard fusion; Dempster-Shafer theory; consensus; conditional core theorem; credibility estimation
APA (6th Edition):
Wickramarathne, T. L. (2012). An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach. (Doctoral Dissertation). University of Miami. Retrieved from https://scholarlyrepository.miami.edu/oa_dissertations/851

Lincoln University
30.
Deng Yanbo.
Designing a framework for end user applications.
Degree: 2013, Lincoln University
URL: http://hdl.handle.net/10182/5269
► End user developers (i.e. non-professional developers) often create database applications to meet their immediate needs. However, these applications can often be difficult to generalise or…
(more)
▼ End user developers (i.e. non-professional developers) often create database applications to meet their immediate needs. However, these applications can often be difficult to generalise or adapt when requirements inevitably change. As part of this thesis, we visited several research institutions to investigate the issues of end user developed databases. We found that different user groups in the same organisation might require similar, but different, data management applications. However, the very specific designs used in most of these systems meant it was difficult to adapt them for other similar uses.
In this thesis we propose a set of guidelines for supporting end user developers to create more flexible and adaptable data management applications. Our approach involves professional and end user developers working together to find a “middle way” between very specific and very generic designs. We propose a framework solution that allows the data model to have several co-existing variations which can satisfy the requirements of different user groups in a common domain. A “framework provider” (IT professional) will create the initial framework and data model. Configuration tools are then provided for a “framework manager” to easily customise the model to the specific needs of various user groups. The system also provides client toolkits and application generators to help end user developers (EUDs) to quickly create and customise applications based on the framework.
The framework approach was applied to a case study involving a Laboratory Information Management System (LIMS) for data on research experiments. We demonstrated that the framework developed could be successfully applied to several groups working in the same domain and could be extended to include new or changed requirements.
We also evaluated the framework through software trials at several research organisations. All participants successfully used the configuration tools to extend the LIMS framework within an average of 40 minutes. EUDs were also able to easily create basic applications within an average of 25 minutes. The overall feedback was that the framework approach was a useful and efficient way to create adaptable data management applications. More importantly, participants were able to immediately see how the framework could be applied to their own laboratory data.
Subjects/Keywords: software flexibility; data management; framework approaches; database evolution; end user development
APA (6th Edition):
Yanbo, D. (2013). Designing a framework for end user applications. (Thesis). Lincoln University. Retrieved from http://hdl.handle.net/10182/5269