You searched for subject:(Privacy). Showing records 1 – 30 of 2785 total matches (page 1 of 93).

Mississippi State University
1.
Bott, Gregory J.
A privacy calculus model for personal mobile devices.
Degree: PhD, College of Business, 2017, Mississippi State University
URL: http://sun.library.msstate.edu/ETD-db/theses/available/etd-02212017-151043/
Personal mobile devices (PMDs) initiated a multi-dimensional paradigmatic shift in personal computing and personal information collection, fueled by the indispensability of the Internet and the increasing functionality of the devices. From 2005 to 2016, the perceived necessity of conducting transactions on the Internet moved from optional to indispensable, and the context of these transactions expanded from traditional desktop and laptop computers to include smartphones and tablets (PMDs). However, the traditional privacy calculus published by Dinev and Hart (2006) was conceived before this technological and contextual change, and several core assumptions of that model must be re-examined and possibly adapted or changed to account for this shift.
This paradigm shift affects the decision process individuals use when disclosing personal information with PMDs. By nature of their size, portability, and constant proximity to the user, PMDs collect, contain, and distribute unprecedented amounts of personal information. Even though the context within which people share information has changed significantly, privacy calculus research applied to PMDs has not moved far from the seminal work of Dinev and Hart (2006). The traditional privacy calculus risk-benefit model is limited in the PMD context because users are unaware of how much personal information is being shared, how often it is shared, or with whom it is shared. Furthermore, the traditional model explains and predicts intent to disclose rather than actual disclosure, and disclosure intentions are a poor predictor of actual information disclosure. Because of the perceived indispensability of the information and the inability to assess potential risk, the deliberate comparison of risks to benefits prior to disclosure (a core assumption of the traditional privacy calculus) may not be the most effective basis for a model to predict and explain disclosure. The present research develops a Personal Mobile Device Privacy Calculus model designed to predict and explain disclosure behavior within the specific context of actual disclosure of personal information using PMDs.
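The risk-benefit comparison at the heart of the traditional privacy calculus can be stated very compactly. The sketch below (Python, with purely hypothetical scores and a hypothetical decision rule, not taken from the dissertation) shows the deliberate trade-off the traditional model assumes and that, the author argues, PMD users may not actually perform before disclosing.

```python
# Toy illustration of the traditional privacy calculus: disclose only when
# perceived benefits outweigh perceived risks. Scores and the decision rule
# are hypothetical; the dissertation argues this deliberate comparison may
# not describe actual disclosure behaviour on personal mobile devices.
def disclose(perceived_benefit: float, perceived_risk: float) -> bool:
    """Classic risk-benefit trade-off: share only if benefit exceeds risk."""
    return perceived_benefit > perceived_risk

# Example: convenience of a location-based service rated 0.8 against a
# perceived risk of 0.6 -> the traditional model predicts disclosure.
print(disclose(0.8, 0.6))  # True
```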
Advisors/Committee Members: Dr. Merrill Warkentin (chair), Dr. Robert Crossler (committee member), Dr. Joel Collier (committee member), Dr. Robert Otondo (committee member), Dr. Lawrence Marett (committee member).
Subjects/Keywords: privacy; privacy calculus; Information privacy

Leiden University
2.
Huskes, Tim.
Influences on privacy protection behaviour of Japanese people.
Degree: 2018, Leiden University
URL: http://hdl.handle.net/1887/63788
Privacy has become the subject of heated international debate in recent years. In particular, the privacy of personal data has emerged in the centre of discussions about privacy. Some argue that privacy is not a universal concept, and that people from different countries differ in their attitude towards privacy. There is a contested notion that Japanese people have little to no sense of privacy. Existing research has focussed on how Japanese people conceive privacy and their attitude towards privacy, but privacy protection behaviour is not examined, nor is the translation from the conception of privacy and attitude towards privacy to protection behaviour discussed. This thesis examines influencing factors on the privacy protection behaviour of Japanese people through interviews. The analysis of the interview data uses a model of privacy-related behaviour developed by Beldad, De Jong et al. (2011) as a comprehensive replacement for existing, less complete models. Participants in this study mainly conceived privacy as their personal information. The chief method of protection was withholding information. The influences on privacy protection behaviour concentrated in participants' concerns about their information privacy and their ability to perceive risks to their information privacy.
Advisors/Committee Members: Herber, Erik (advisor).
Subjects/Keywords: information privacy; privacy protection; privacy-related behaviour; online privacy; privacy awareness
3.
Högberg, Johan.
The effect of effort, control and value frames on online users privacy decision.
Degree: Social and Psychological Studies, 2013, Karlstad University
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-30647
A frame refers to a decision maker’s perception of a decision problem. Frames affect the outcomes of decisions and are partially controlled by how decision problems are formulated. This study investigated the effect of framing the alternatives in a privacy decision as gaining or losing value, needing to make an effort, or gaining control in an online environment. It also sought a structure among the many effects found in earlier research concerning privacy in the context of Internet-based services. For these purposes, two experiments and one survey were conducted at a university in Sweden. The study included 238 individuals, 197 of them in the age range of 19-30. The participants were approached in public areas at the university and asked to register on a fictive online cloud service. During registration they were given a choice between registering automatically, with little control, and manually, with control over what information would be published. The most salient effect found was the impact of framing the low-control alternative as time saving, meaning that the participants were willing to give up privacy to save time. The practical implication of these results is that developers of new online services should focus on making it easy and time-efficient to take control over private information. For value and control frames no significant effects were found. Exploring the results of the survey, a structure with the two components online concern and willingness to take risk online was found.
Subjects/Keywords: privacy; control; framing; online privacy; privacy decision

Dalhousie University
4.
Ali, Sohail.
UML for Inclusion of Privacy in Software Modeling.
Degree: Master of Computer Science, Faculty of Computer Science, 2013, Dalhousie University
URL: http://hdl.handle.net/10222/21777
Online commerce and services obtain much private data from users. Collection, storage, management, and use of private data are subject to various privacy laws, regulations, and standards. To adhere to legal requirements, many privacy services, such as security, notice, and consent, are required. Inclusion of the required privacy services early in the life cycle of software development is preferred and advocated. We extend UML use case diagrams with privacy components to represent example privacy services. These components are used to visually model privacy requirements in the analysis phase of the SDLC. We create a prototype by extending Microsoft Visio, a popular UML modeling tool, with our proposed privacy components. In summary, we show how privacy services may be specified in UML use case diagrams rather than adding privacy as an afterthought to software systems and services. The tool is demonstrated with real-world scenarios from the health sector.
Advisors/Committee Members: Dr. Abdel Farrag (graduate-coordinator), Dr. S. Sampalli (thesis-reader), Dr. V. Keselj (thesis-reader), Dr. Dawn Jutla, Dr. Peter Bodorik (thesis-supervisor).
Subjects/Keywords: UML privacy modeling; privacy services

University of Waikato
5.
Karunaratne, K D Thilini Prathiba.
The reasonable expectation of privacy.
Degree: 2020, University of Waikato
URL: http://hdl.handle.net/10289/13698
It is increasingly recognised that privacy is an inherent human value, interest and a right that is worthy of legal protection. Privacy is no longer simply about being let alone ‘in one’s home’; rather, the concept of privacy has evolved in a broader sense and is recognised to be an entitlement of an individual even in public spaces. With increased surveillance by both state and non-state actors, privacy is more important than ever before. Various countries across the globe have declared privacy a constitutional right that must be protected, and in countries where a constitutional right does not exist, the judiciary has stepped in to declare privacy a right. Unfortunately, in New Zealand there is no general right to privacy. Rather, there are specific legal protections that safeguard certain aspects of an individual’s privacy. For example, certain offences created in the Crimes Act 1961 protect bodily privacy, and informational privacy is protected under the Privacy Act 1993. The New Zealand judiciary has also taken firm stances by creating two privacy torts, namely the tort of publication of private facts and intrusion into seclusion. However, the question remains whether there is a general right to privacy in New Zealand outside of these privacy protections. This is precisely what this thesis seeks to answer by looking closer at the concept of reasonable expectation of privacy. The judiciary has repeatedly used this concept when addressing privacy breaches. The continued use of this concept is important. But why? Rather than just a mere test, the reasonable expectation of privacy has been attributed by the courts as an ‘entitlement’ of the individual. The courts have interpreted and applied the reasonable expectation of privacy in a range of case law such as the privacy torts, search and surveillance, and other areas of the law including broadcasting matters. The courts have found that an individual’s reasonable expectation of privacy can be breached, and subsequent remedies are available. This thesis argues that the frequent and sustained use of reasonable expectation of privacy as an entitlement that deserves protection by the common law is, without a doubt, sufficient to be an expression or an embodiment of a general right to privacy in New Zealand.
Advisors/Committee Members: Dizon, Michael (advisor).
Subjects/Keywords: Privacy; Reasonable expectation of privacy

McMaster University
6.
Calero, Vanessa.
A Framework for Measuring Privacy Risks of YouTube.
Degree: MSc, 2020, McMaster University
URL: http://hdl.handle.net/11375/25730
While the privacy risks associated with well-known social networks such as Facebook and Instagram are well studied, there is limited investigation of the privacy risks of YouTube videos, which are mainly uploaded by teenagers and young adults, called YouTubers. This research aims at quantifying the privacy risks of videos when sensitive information about the private life of a YouTuber is shared publicly. We developed a privacy metric for YouTube videos called the Privacy Exposure Index (PEI), extending existing social networking privacy frameworks. To understand the factors moderating the privacy behaviour of YouTubers, we conducted an extensive survey of about 100 YouTubers. We also investigated how YouTube subscribers and viewers may desire to influence the privacy exposure of YouTubers through interactive commenting on videos or through YouTubers' other parallel social networking channels. For this purpose, we conducted a second survey of about 2000 viewers. The results of these surveys demonstrate that YouTubers are concerned about their privacy; nevertheless, inconsistently with this concern, they exhibit privacy-exposing behaviour in their videos. In addition, we found YouTubers are encouraged by their audience to continue disclosing more personal information in new content. Finally, we empirically evaluated the soundness, consistency, and applicability of the PEI by analyzing 100 videos uploaded by 10 YouTubers over a period of two years.
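The abstract does not spell out how the PEI is computed, but a weighted-exposure score is one plausible shape for such a metric. The sketch below is a hypothetical illustration only: the categories, weights, and scoring rule are invented for this example and are not taken from the thesis.

```python
# Hypothetical sketch of a Privacy Exposure Index (PEI)-style score for a
# video: a weighted fraction of sensitive attribute categories exposed.
# Categories and weights are invented for illustration.
WEIGHTS = {
    "full_name": 0.25,
    "home_location": 0.30,
    "daily_routine": 0.15,
    "family_members": 0.20,
    "financial_details": 0.10,
}

def pei(exposed: set[str]) -> float:
    """Return a score in [0, 1]: 0 = nothing exposed, 1 = every category exposed."""
    return sum(w for cat, w in WEIGHTS.items() if cat in exposed)

# Example: a vlog revealing the YouTuber's name and neighbourhood.
print(pei({"full_name", "home_location"}))  # 0.55
```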
Advisors/Committee Members: Samavi, Reza, Computing and Software.
Subjects/Keywords: privacy framework; YouTube privacy Framework

University of Notre Dame
7.
Evercita Cuevas Eugenio.
Some Methods for Differentially Private Data Synthesis.
Degree: Applied and Computational Mathematics and Statistics, 2019, University of Notre Dame
URL: https://curate.nd.edu/show/pk02c824r7h
Balancing between protecting the privacy of individuals who contribute to data sets and releasing data sets of good utility is of extreme importance. Even with data sets anonymized, there is still a possibility that an intruder may identify a subject in a released data set. Many of the existing methods for data privacy and confidentiality do not quantify the amount of privacy that the data set may leak. Differential privacy provides a conceptual approach to bring strong mathematical guarantees for privacy protection and quantifies the amount of privacy the data set leaks when it is released for public use. My dissertation explores the recently developed differentially private data synthesis (DIPS) methods for incorporating differential privacy when generating synthetic data to be publicly released. I first developed a DIPS algorithm called CIPHER to construct differentially private microdata from low-dimensional histograms by solving linear equations with Tikhonov regularization. CIPHER decomposes joint probabilities via basic probability rules to construct the equation set and subsequently solves the linear equations. Simulations and a qualitative banking data case study were conducted to compare CIPHER to the existing methods MWEM (multiplicative weighting via exponential mechanism) and full-dimensional histogram (FDH) sanitization. Next, my dissertation focuses on an exponential random graph model that incorporates differential privacy for social network data. An additional level of complexity is present in social network data, as the possibly many relationships between nodes and edges must be considered. The algorithm developed in my work was applied to several real-life data sets to understand how well the differentially private synthetic social network data released by our algorithm compare to the original network. Lastly, my dissertation focuses on multiplicative weights and a single-observation influence measure. This work explores more in depth the multiplicative weighting via exponential mechanism and incorporates a single-observation influence measure to allow the algorithm to be applied to any type of data, as long as the model and sufficient statistics are known.
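As a concrete reference point for the differentially private data synthesis setting described above, here is a minimal generic sketch (not the CIPHER algorithm itself): a histogram of a sensitive attribute is sanitized with the Laplace mechanism and synthetic records are resampled from the noisy, normalized counts. Function names and parameters are illustrative assumptions.

```python
import numpy as np

# Generic sketch of differentially private data synthesis from a histogram
# (the setting CIPHER and FDH sanitization operate in; NOT CIPHER itself).
# Counts are perturbed with the Laplace mechanism, clipped and normalized,
# then synthetic cell indices are resampled from the noisy distribution.
rng = np.random.default_rng(0)

def dp_synthesize(counts: np.ndarray, epsilon: float, n_synth: int) -> np.ndarray:
    """Release n_synth synthetic cell indices under epsilon-differential privacy."""
    # Adding or removing one record changes one cell by 1 => L1 sensitivity 1.
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    probs = np.clip(noisy, 0, None)
    probs = probs / probs.sum()
    return rng.choice(len(counts), size=n_synth, p=probs)

# Example: a 4-cell histogram of a sensitive categorical attribute.
true_counts = np.array([50.0, 30.0, 15.0, 5.0])
print(dp_synthesize(true_counts, epsilon=1.0, n_synth=10))
```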
Advisors/Committee Members: Ick-Hoon Jin, Committee Member, Fang Liu, Research Director, Lizhen Lin, Committee Member.
Subjects/Keywords: Statistics; Data Privacy; Differential Privacy

University of Illinois – Urbana-Champaign
8.
Bindschadler, Vincent.
Privacy-preserving seedbased data synthesis.
Degree: PhD, Computer Science, 2018, University of Illinois – Urbana-Champaign
URL: http://hdl.handle.net/2142/101661
How can we share sensitive datasets in such a way as to maximize utility while simultaneously safeguarding privacy? This thesis proposes an answer to this question by re-framing the problem of sharing sensitive datasets as a data synthesis task. Specifically, we propose a framework to synthesize full data records in a privacy-preserving way so that they can be shared instead of the original sensitive data. The core of the framework is a technique called seedbased data synthesis. Seedbased data synthesis produces data records by conditioning the output of a generative model on some input data record called the seed. This technique produces synthetic records that are similar to their seeds, which results in high-quality outputs. But it simultaneously introduces statistical dependence between synthetic records and their seeds, which may compromise privacy. As a countermeasure, we introduce a new class of techniques that can achieve strong privacy notions in this setting: privacy tests. Privacy tests are algorithms that probabilistically reject candidate synthetic records which are determined to leak sensitive information. Synthetic records that fail the test are simply discarded, whereas those that pass the test are deemed safe and included in the synthetic dataset to be shared. We design two privacy tests that provably yield differential privacy. We analyze the quality of synthetic datasets based on a cryptography-inspired definition of distinguishability: if synthetic data records are indistinguishable from real records, then they are (by definition) as useful as real data. On the theory front, we characterize the utility-privacy trade-off of seedbased data synthesis. On the experimental front, we design an efficient procedure to experimentally quantify distinguishability. We experimentally validate the seedbased data synthesis framework using five probabilistic generative models. Specifically, using real-world datasets as input, we produce synthetic data records for four different application scenarios and data types: location trajectories, census microdata, medical data, and facial images. We evaluate the quality of the produced synthetic records using both application-dependent utility metrics and distinguishability, and show that the framework is capable of producing highly realistic synthetic data records while providing differential privacy for conservative parameters.
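To make the idea of a privacy test concrete, the sketch below shows one illustrative test in the spirit described above: a candidate synthetic record is kept only if enough seeds other than its own could plausibly have generated it. The threshold k, the ratio gamma, and the toy generative model are assumptions made for this example; they are not the two tests designed in the thesis.

```python
import math

# Illustrative seed-based "privacy test": a candidate synthetic record is
# accepted only if at least k *other* seeds could have generated it with
# probability within a factor gamma of its own seed's probability, so the
# output does not single out its seed. gen_prob stands in for an arbitrary
# generative model's conditional likelihood; all parameters are hypothetical.
def passes_privacy_test(candidate, seed, seeds, gen_prob, k=3, gamma=2.0):
    p_seed = gen_prob(candidate, seed)
    plausible = sum(
        1 for s in seeds
        if s is not seed and gen_prob(candidate, s) * gamma >= p_seed
    )
    return plausible >= k

# Toy generative model: likelihood decays with distance between records.
def gen_prob(candidate, seed):
    return math.exp(-abs(candidate - seed))

seeds = [1.0, 1.2, 0.9, 5.0, 1.1]
print(passes_privacy_test(1.05, seeds[0], seeds, gen_prob))  # True: many plausible seeds
print(passes_privacy_test(5.0, seeds[3], seeds, gen_prob))   # False: only its own seed fits
```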
Advisors/Committee Members: Gunter, Carl A. (advisor), Gunter, Carl A. (Committee Chair), Zhai, ChengXiang (committee member), Borisov, Nikita (committee member), Smith, Adam D (committee member).
Subjects/Keywords: Privacy; Data Privacy; Synthetic Data
9.
Uteck, Anne E.
Reconceptualizing Spatial Privacy for the Internet of Everything.
Degree: 2013, University of Ottawa
URL: http://hdl.handle.net/10393/30295
Twenty years ago, a team of Silicon Valley researchers, led by computing scientist Mark Weiser, envisioned a world in which computing would become an integral part of our everyday experience. Today, this vision is being realized. As technologies are combined, integrated and connected to networks, we are moving to a society characterized by “ubiquitous computing”, a paradigm used to describe pervasive technological embeddedness: from things, to people, to places. Enabling technologies, such as Global Positioning Systems (GPS), Radio-Frequency Identification (RFID) and advanced wireless devices, are being introduced and woven into the fabric of our daily lives. With these convergences emerges the unique ability to locate and track people and things anywhere, anytime, including real-time. There are compelling advantages to such an enhanced surveillance capability serving important public interests. Yet, bringing computing technologies beyond the desktop and into the everyday physical world more directly and more pervasively compromises the spaces and places of our lives, challenging our fundamental ideas about spatial boundaries and the privacy expectations that accompany them.
This dissertation examines these issues with the aim of reconceptualizing spatial privacy so that it is capable of sustained, effective legal protection in a world of ubiquitous computing.
Chapter One provides a detailed study of the technological landscape, highlighting three key characteristics of ubiquitous computing: (i) physicality, (ii) invisibility and (iii) context-awareness. Having examined what is considered the “next wave” of computing technology, Chapter Two explores the quantitative and qualitative changes in surveillance activity facilitated by ubiquitous computing. It identifies and discusses the emerging privacy implications raised by ubiquitous surveillance technologies, asserting the increasing importance of reconceiving spatial privacy as computing technology becomes physically embedded in the real world. Chapter Three examines the conceptual and legal privacy landscape, surveying leading privacy theories in order to articulate the array of underlying values and interests. This survey includes not only privacy scholarship but also privacy jurisprudence, principally as it has been developed under section 8 of the Canadian Charter of Rights and Freedoms. Central to this dissertation, this analysis demonstrates the extent to which current privacy law is not adequate to protect the spatial dimension of privacy. Addressing this deficit, Chapter Three calls for a reconceptualization of the traditional category of territorial privacy so that it is capable of sustaining effective legal protection. This conceptual reformation of spatial privacy begins in Chapter Four, which provides a multi-disciplinary investigation of the meaning of place. It adopts an experiential conception developed within the field of Humanistic Geography, better reflecting the spatiality and interactive nature of our everyday lives. Based on this foundation, a new…
Subjects/Keywords: ubicomp privacy

University of Arizona
10.
Newell, Patricia Brierley.
The meaning and use of privacy: A study of young adults.
Degree: 1992, University of Arizona
URL: http://hdl.handle.net/10150/185867
Two hundred and forty-three young adults responded to an open and non-directive question asking them to describe an occasion when they required privacy, defined here as a condition of separation from the public domain, which is voluntary and temporary, and into which other persons, including the state, as representative of the public domain, do not justifiably intrude. A subsequent questionnaire established SES variables and the average frequency and duration of privacy experiences. 72.4% associated a desire for privacy with social antecedent conditions, 16% with task-oriented purposes, 9.5% with some organismic reason, and 2.1% indicated aversive conditions of the physical environment such as noise. Results showed that 82% of all subjects required privacy due to adverse circumstances. A fairly large proportion, especially among females and minorities, were not able to achieve privacy although they required it; for the most part this was because they took no action. Although there were significant sex and race differences found for the process of acquiring privacy, there was marked similarity in the places and behaviours employed during privacy. There was one exception: females mentioned safe places significantly more than males. Satisfactorily achieving privacy was associated with positive action in the case of 76 subjects, with psychological withdrawal by 17 subjects, with avoidance actions by 8 subjects, and with no action by 7 subjects. Since 199 of the 243 subjects indicated initial negative affect and the majority of those achieving privacy indicated positive results, such as feeling better, more relaxed and confident and being ready to face the world again, it was felt that the results supported a systems model of privacy which fulfills a cross-cultural therapeutic function. From the systems perspective, privacy is seen as fulfilling two functions: systems maintenance and systems development. Systems maintenance refers to the balancing act performed by the human body to remain within healthy operating limits. Systems development refers to the general tendency of mankind to extend boundaries, learn new skills, and progress towards self-actualisation. Results support a definition of privacy that reflects an interactive Person-Environment condition.
Advisors/Committee Members: Ittelson, W. H. (advisor), Bartlett, N. R. (committee member), Bechtel, R. B. (committee member), Wilkin, D. C. (committee member), Salomon, V. (committee member).
Subjects/Keywords: Privacy.

Tartu University
11.
Baghery, Karim.
Reducing trust and improving security in zk-SNARKs and commitments.
Degree: 2020, Tartu University
URL: http://hdl.handle.net/10062/68424
Zero-knowledge Succinct Non-interactive ARguments of Knowledge (zk-SNARKs) are an efficient family of NIZK proof systems that are constructed in the Common Reference String (CRS) model; thanks to their succinct proofs and very efficient verification, they are widely adopted in large-scale practical applications. In this thesis, we study zk-SNARKs from two perspectives, namely reducing trust and improving security in them. In the first direction, we investigate how much one can mitigate trust in pairing-based zk-SNARKs without sacrificing their efficiency, so that the parties to the protocol obtain a certain level of security even if the setup phase was done maliciously or the secret information of the setup phase was revealed. As a result of this direction, we present some efficient constructions that can resist subversion of the setup phase of zk-SNARKs and achieve a level of security stronger than before. We also show that similar techniques allow us to mitigate the trust in trapdoor commitment schemes, another prominent family of cryptographic primitives that require a trusted setup phase. In the second direction, we present some efficient constructions that achieve more security with minimal overhead. Some of the presented constructions allow simplifying the construction of current UC-secure protocols, namely the privacy-preserving smart contract systems Hawk and Gyges, and improving their efficiency; the new constructions can also be used directly in new protocols that wish to deploy zk-SNARKs. Some of the proposed zk-SNARKs have been implemented in the Libsnark library, and the empirical results confirm that the computational overhead of reducing trust or achieving stronger security is small.
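The role of the trusted setup that this thesis tries to weaken can be seen even in a toy trapdoor commitment. The sketch below (a Pedersen-style commitment with tiny, insecure parameters; not a construction from the thesis) shows that whoever runs the setup and keeps the trapdoor can open a commitment to any message, which is exactly why a subverted or leaked setup phase is dangerous.

```python
# Toy Pedersen-style commitment illustrating why the trusted setup of a
# trapdoor commitment scheme matters: whoever knows the setup trapdoor can
# open a commitment to any message (binding is lost). Tiny insecure numbers,
# purely illustrative; not a construction from the thesis.
p, q = 23, 11          # p = 2q + 1; we work in the order-q subgroup of Z_p*
g = 2                  # generator of the order-q subgroup
trapdoor = 7           # setup secret: h = g^trapdoor
h = pow(g, trapdoor, p)

def commit(m: int, r: int) -> int:
    return (pow(g, m, p) * pow(h, r, p)) % p

m, r = 3, 5
c = commit(m, r)

# With the trapdoor, equivocate: open c as a different message m2.
m2 = 10
r2 = (r + (m - m2) * pow(trapdoor, -1, q)) % q
assert commit(m2, r2) == c   # same commitment, different opening
print(c, (m, r), (m2, r2))
```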
Advisors/Committee Members: Lipmaa, Helger (advisor).
Subjects/Keywords: privacy; cryptography

University of Texas – Austin
12.
-6848-2988.
Exploiting leakage in privacy-protecting systems.
Degree: PhD, Computer science, 2016, University of Texas – Austin
URL: http://hdl.handle.net/2152/45559
Conventional systems store data unencrypted. This allows them to easily access and manipulate their data. However, by not protecting their data, these systems are at a greater risk if they are compromised by a malicious hacker. More advanced systems add encryption to their data, but this causes other issues. Normal encryption often ruins the ability to run computations on data, negating many of the reasons to store the data in the first place. More recently, some systems have attempted to strike a compromise between security and functionality by using encryption that partially protects their data while still allowing certain operations to be performed. Examples of these systems include general-purpose frameworks like Mylar for Web applications, as well as domain- and application-specific systems like P3 for photo storage. This dissertation examines the privacy concerns that arise when using these systems with realistic datasets and real-world usage scenarios. The first system we explore is Mylar, an extension to the popular Meteor framework. Meteor is a JavaScript-based framework for concurrently developing the client and server parts of Web apps. Mylar allows users to share and search over data while protecting against a compromised or malicious server. We expand Mylar's vague definitions of passive and active adversaries into three threat models and show that Mylar is insecure against all three models. Mylar's metadata leaks sensitive information to an adversary with one-time access to Mylar's encrypted database. Mylar provides no protection against adversaries which can monitor user access patterns, allowing them to watch for data-dependent behavior corresponding to sensitive information. Finally, Mylar fails to protect against active attackers who, by nature of the system, have been given the ability to modify the database and run search over the encrypted data. We next look at a set of systems designed to protect sensitive images by selectively obfuscating them. We examine a system called P3 which splits an image into two images: a secret image that contains most of the identifying information and a public image that can be distributed with less risk of leaking information. We also investigate mosaicing (often called pixelation) and blurring, two commonly used image obfuscation techniques. Examining the obfuscated images, it is obvious that all three of these systems leak information. However, it is not clear how to exploit this leakage or if doing so is even possible. The authors of P3 specifically examined P3 using a number of techniques that mimic human image recognition. We bypass the need for human recognition by making use of modern machine learning techniques. Using neural networks, we are able to classify the obfuscated image content automatically without needing human assistance or having to define image features. Finally, we conclude by proposing a number of guidelines for creating modern privacy-preserving systems. We look at problems that arise when creating a scheme on paper as well as issues that…
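For reference, mosaicing (pixelation), one of the obfuscation techniques examined above, is simple to state: the image is split into fixed-size blocks and each block is replaced by its average. A minimal sketch follows; the block size and example image are arbitrary assumptions. The coarse statistics that survive this averaging are exactly the kind of leakage a trained classifier can exploit.

```python
import numpy as np

# Minimal sketch of mosaicing (pixelation): partition the image into b x b
# blocks and replace each block by its mean. Works for grayscale (H, W) and
# colour (H, W, C) arrays. Illustrative only.
def mosaic(img: np.ndarray, b: int = 8) -> np.ndarray:
    h, w = img.shape[:2]
    out = img.astype(float)
    for y in range(0, h, b):
        for x in range(0, w, b):
            block = out[y:y + b, x:x + b]
            block[...] = block.mean(axis=(0, 1))
    return out.astype(img.dtype)

# Example on a random 32x32 grayscale "image".
img = np.random.default_rng(0).integers(0, 256, (32, 32), dtype=np.uint8)
print(mosaic(img, b=8).shape)  # (32, 32)
```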
Advisors/Committee Members: Gouda, Mohamed G., 1947- (advisor), Shmatikov, Vitaly (advisor), Alvisi, Lorenzo (committee member), Witchel, Emmett (committee member).
Subjects/Keywords: Security; Privacy

University of Pretoria
13.
[No author].
In search of search privacy.
Degree: 2011, University of Pretoria
URL: http://upetd.up.ac.za/thesis/available/etd-07222011-052805/
Search engines have become integral to the way in which we use the Web of today. Not only are they an important real-time source of links to relevant information, but they also serve as a starting point to the Web: a veritable treasure trove of the latest news, satellite images, directions from anywhere to anywhere, local traffic updates and global trends ranging from the spread of influenza to which celebrity happens to be the most popular at a particular time. The more popular search engines are collecting incredible amounts of information. In addition to indexing significant portions of the Web, they record what hundreds of millions of users around the world are searching for. As more people use a particular search engine, it has the potential to record more information on what is deemed relevant (and in doing so provide better relevance in the future, thereby attracting more users). Unfortunately, the relevance derived from this cycle between the search user and the search engine comes at a cost: privacy. In this work, we take an in-depth look at what privacy means within the context of search. We discuss why it is that the search engine must be considered a threat to search privacy. We then investigate potential solutions and eventually propose our own in a bid to enhance search privacy.
Advisors/Committee Members: Prof M S Olivier (advisor).
Subjects/Keywords: Search privacy; Search engines; Online privacy; UCTD

University of Waterloo
14.
Irannejad, Arezoo.
Designing Privacy-Enhanced Interfaces on Digital Tabletops for Public Settings.
Degree: 2013, University of Waterloo
URL: http://hdl.handle.net/10012/7366
Protection of personal information has become a critical issue in the digital world. Many companies and service provider websites have adopted privacy policies and practices to protect users’ personal information to some extent. In addition, various governments are adopting privacy protection legislation. System developers, service providers, and interface designers play an important role in determining how to make systems fulfill legal requirements and satisfy users. The human factor requirements for effective privacy interface design can be categorized into four groups: (1) comprehension, (2) consciousness, (3) control, and (4) consent (Patrick & Kenny, 2003).
Moreover, the type of technology that people are engaged with has a crucial role in determining what type of practices should be adopted. As Weiser (1996) envisioned, we are now in an “ubiquitous computing” (Ubicomp) era in which technologies such as digital tabletops (what Weiser called LiveBoards) are emerging for use in public settings. The collaborative and open nature of this type of smart device introduces new privacy threats that have not yet been thoroughly investigated and as a result have not been addressed in companies’ and governmental privacy statements and legislation.
In this thesis, I provide an analytical description of the privacy threats unique to tabletop display environments. I then present several design suggestions for a tabletop display interface that addresses and mitigates these threats, followed by a qualitative evaluation of these designs based on Patrick and Kenny’s (2003) model. Results show that most participants have often experienced being shoulder-surfed or had privacy issues when sharing information with someone in a collaborative environment. Therefore, they found most of the techniques designed in this thesis helpful in providing information privacy for them when they are engaged with online social activities on digital tabletops in public settings. Among all of the proposed tested designs, the first three have proven to be effective in providing the required privacy. However, designs 4 and 5 had some shortfalls that made them less helpful for participants. The main problem with these two designs was that participants had difficulty understanding what they had to do in order to complete the given tasks.
Subjects/Keywords: Digital Tabletop; Privacy Policy; Facebook; Privacy Legislation

University of Waterloo
15.
Devet, Casey.
The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency.
Degree: 2014, University of Waterloo
URL: http://hdl.handle.net/10012/8640
▼ The goal of Private Information Retrieval (PIR) is the ability to query a database successfully without the operator of the database server discovering which record(s) of the database the querier is interested in. There are two main classes of PIR protocols: those that provide privacy guarantees based on the computational limitations of servers, called computational PIR or CPIR, and those that rely on multiple servers not colluding for privacy, called information-theoretic PIR or IT-PIR. These two classes have different advantages and disadvantages that make them more or less attractive to designers of PIR-enabled privacy enhancing technologies.
We present a hybrid PIR protocol that combines two PIR protocols: one CPIR protocol and one IT-PIR protocol. Our protocol inherits many positive aspects of both classes and mitigates some of the negative aspects. For example, our hybrid protocol maintains partial privacy when the security assumptions of one of the component protocols are broken, mitigating the privacy loss in such an event. We have implemented our protocol as an extension of the Percy++ library so that it combines a PIR protocol by Aguilar Melchor and Gaborit with one by Goldberg. We show that our hybrid protocol uses less communication than either of these component protocols and that our scheme is particularly beneficial when the number of records in a database is large compared to the size of the records. This situation arises in applications such as TLS certificate verification, anonymous communications systems, private LDAP lookups, and others.
The server-side computations involved in the PIR protocols that we discuss in this thesis all lend themselves to parallelization. As an extension to the Percy++ library, we have implemented parallelized server computation for each of these protocols using both multithreading and distributed computation. We show that using parallelization allows the servers to reduce the latency involved in serving PIR queries.
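The two-server information-theoretic class mentioned above can be pictured with a textbook XOR-based scheme. The sketch below is not Devet's hybrid protocol and does not use the Percy++ API; it is a minimal Python illustration, under the assumption of two non-colluding servers, of why this class protects the query: each server sees only a uniformly random bit vector, yet the client recovers the record by XORing the two answers.

```python
import os
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def client_queries(n: int, i: int):
    """Split the desired index i into two random-looking bit vectors, one per server."""
    q1 = [secrets.randbelow(2) for _ in range(n)]
    q2 = list(q1)
    q2[i] ^= 1          # the two queries differ only at position i
    return q1, q2

def server_answer(db, query):
    """Each server XORs together every record selected by its query bits."""
    acc = bytes(len(db[0]))
    for rec, bit in zip(db, query):
        if bit:
            acc = xor_bytes(acc, rec)
    return acc

# toy database: 8 records of 16 bytes each
db = [os.urandom(16) for _ in range(8)]
i = 5
q1, q2 = client_queries(len(db), i)
a1, a2 = server_answer(db, q1), server_answer(db, q2)
assert xor_bytes(a1, a2) == db[i]   # client reconstructs record i
```

The hybrid protocol in the thesis combines a scheme of this information-theoretic kind with a computational one; its construction differs substantially from this toy example.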
Subjects/Keywords: private information retrieval; privacy enhancing technologies; privacy
APA (6th Edition):
Devet, C. (2014). The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/8640
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Devet, Casey. “The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency.” 2014. Thesis, University of Waterloo. Accessed January 19, 2021.
http://hdl.handle.net/10012/8640.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Devet, Casey. “The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency.” 2014. Web. 19 Jan 2021.
Vancouver:
Devet C. The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency. [Internet] [Thesis]. University of Waterloo; 2014. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/10012/8640.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Devet C. The Best of Both Worlds: Combining Information-Theoretic and Computational Private Information Retrieval for Communication Efficiency. [Thesis]. University of Waterloo; 2014. Available from: http://hdl.handle.net/10012/8640
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Pretoria
16.
Brandi, Wesley Antonio.
In search of search privacy.
Degree: Computer Science, 2011, University of Pretoria
URL: http://hdl.handle.net/2263/26552
▼ Search engines have become integral to the way in which we use the Web of today. Not only are they an important real-time source of links to relevant information, but they also serve as a starting point to the Web: a veritable treasure trove of the latest news, satellite images, directions from anywhere to anywhere, local traffic updates and global trends ranging from the spread of influenza to which celebrity happens to be the most popular at a particular time. The more popular search engines are collecting incredible amounts of information. In addition to indexing significant portions of the Web, they record what hundreds of millions of users around the world are searching for. As more people use a particular search engine, it has the potential to record more information on what is deemed relevant (and in doing so provide better relevance in the future, thereby attracting more users). Unfortunately, the relevance derived from this cycle between the search user and the search engine comes at a cost: privacy. In this work, we take an in-depth look at what privacy means within the context of search. We discuss why the search engine must be considered a threat to search privacy. We then investigate potential solutions and eventually propose our own in a bid to enhance search privacy.
Advisors/Committee Members: Prof M S Olivier (advisor).
Subjects/Keywords: Search privacy; Search engines; Online privacy; UCTD
APA (6th Edition):
Brandi, W. (2011). In search of
search privacy. (Doctoral Dissertation). University of Pretoria. Retrieved from http://hdl.handle.net/2263/26552
Chicago Manual of Style (16th Edition):
Brandi, Wesley. “In search of
search privacy.” 2011. Doctoral Dissertation, University of Pretoria. Accessed January 19, 2021.
http://hdl.handle.net/2263/26552.
MLA Handbook (7th Edition):
Brandi, Wesley. “In search of
search privacy.” 2011. Web. 19 Jan 2021.
Vancouver:
Brandi W. In search of
search privacy. [Internet] [Doctoral dissertation]. University of Pretoria; 2011. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/2263/26552.
Council of Science Editors:
Brandi W. In search of
search privacy. [Doctoral Dissertation]. University of Pretoria; 2011. Available from: http://hdl.handle.net/2263/26552

University of Windsor
17.
Patel, Dipeshkumar Shaileshkumar.
Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks.
Degree: MS, Computer Science, 2020, University of Windsor
URL: https://scholar.uwindsor.ca/etd/8386
▼ With recent technological advancements, the amount of personal user data being generated is immense. Due to the large volume of data, machine learning algorithms such as neural networks serve as the backbone for deriving patterns from this data quickly. This need for big data analytics comes at the cost of the privacy of user data. A second challenge that must be solved relates to the scalability of the machine learning algorithm: neural network performance is known to deteriorate as the volume of data increases, due to the many costly sum and sigmoid calculations. Therefore, this thesis attempts to parallelize the neural network while also maintaining the privacy of user data. This model provides a viable option for big data analytics without sacrificing the privacy of individual users, while also maintaining the precision and classification accuracy of the model. The implementation of the parallelized privacy-preserving neural network is based on the MapReduce computing model, which provides advanced features such as fault tolerance, data replication, and load balancing.
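As a rough picture of the data-parallel pattern that MapReduce imposes (not the thesis's implementation, whose privacy mechanism is not detailed in the abstract), the sketch below splits the training data into "mapper" shards, computes a gradient per shard, and lets a "reducer" average the gradients with a crude noise term standing in for the privacy step. A single sigmoid unit is used instead of a multi-layer network for brevity; all names and parameters are illustrative assumptions.

```python
import numpy as np

def map_gradient(shard_X, shard_y, w):
    """'Map' step: compute the sigmoid-unit gradient on one data shard."""
    preds = 1.0 / (1.0 + np.exp(-shard_X @ w))          # sigmoid activations
    return shard_X.T @ (preds - shard_y) / len(shard_y)

def reduce_gradients(grads, noise_scale=0.1, rng=None):
    """'Reduce' step: average per-shard gradients; the Gaussian term is a
    stand-in for whatever mechanism protects individual records."""
    rng = rng or np.random.default_rng(0)
    avg = np.mean(grads, axis=0)
    return avg + rng.normal(0.0, noise_scale, size=avg.shape)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(1000, 5)), rng.integers(0, 2, size=1000)
w = np.zeros(5)
shards = np.array_split(np.arange(1000), 4)              # four "mapper" partitions
for _ in range(50):
    grads = [map_gradient(X[s], y[s], w) for s in shards]
    w -= 0.5 * reduce_gradients(grads, rng=rng)
```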
Advisors/Committee Members: Saeed Samet.
Subjects/Keywords: Deep Learning; Differential Privacy; Privacy-preserving
APA (6th Edition):
Patel, D. S. (2020). Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks. (Masters Thesis). University of Windsor. Retrieved from https://scholar.uwindsor.ca/etd/8386
Chicago Manual of Style (16th Edition):
Patel, Dipeshkumar Shaileshkumar. “Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks.” 2020. Masters Thesis, University of Windsor. Accessed January 19, 2021.
https://scholar.uwindsor.ca/etd/8386.
MLA Handbook (7th Edition):
Patel, Dipeshkumar Shaileshkumar. “Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks.” 2020. Web. 19 Jan 2021.
Vancouver:
Patel DS. Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks. [Internet] [Masters thesis]. University of Windsor; 2020. [cited 2021 Jan 19].
Available from: https://scholar.uwindsor.ca/etd/8386.
Council of Science Editors:
Patel DS. Parallel Implementation of Privacy Preserving Multi-Layer Neural Networks. [Masters Thesis]. University of Windsor; 2020. Available from: https://scholar.uwindsor.ca/etd/8386

Tampere University
18.
Lu, Pengfei.
Apply the LINDDUN framework for privacy requirement analysis.
Degree: 2017, Tampere University
URL: https://trepo.tuni.fi/handle/10024/100871
▼ LINDDUN is a framework for identifying privacy threats and eliciting privacy requirements for a system. It has complete procedures and strong support for privacy requirements analysis. This research examines how the LINDDUN methodology can be applied in practice to privacy requirements analysis. The thesis studies LINDDUN in a case project named Rin-Tin-Tinder for privacy threat and privacy requirement analysis. The analysis results are compared with the privacy requirements elicited by the project team in a workshop session, and are verified through a comparison with the Microsoft privacy guideline.
The discussion and analysis of this comparison reveal strengths and weaknesses of the LINDDUN methodology. Compared to the workshop, the LINDDUN methodology led the analyst to identify more privacy threats and elicit more privacy requirements, and it made the analysis process more predictable. Meanwhile, the LINDDUN methodology has a blind spot regarding users' unintentional false instructions. The thesis discusses possible directions for improving LINDDUN and summarizes guiding rules on assumption making, which is an important procedure in LINDDUN. These findings will be helpful for LINDDUN's further improvement.
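For readers unfamiliar with the framework, LINDDUN takes its name from its seven threat categories (Linkability, Identifiability, Non-repudiation, Detectability, Disclosure of information, Unawareness, Non-compliance), which the analyst checks against each element of a data flow diagram. The sketch below shows only a hypothetical way an analyst might record elicited threats and requirements per element; it is not taken from the thesis or from the Rin-Tin-Tinder case project, and all names are invented.

```python
from dataclasses import dataclass, field

# The seven LINDDUN privacy threat categories (the acronym itself).
LINDDUN = ["Linkability", "Identifiability", "Non-repudiation", "Detectability",
           "Disclosure of information", "Unawareness", "Non-compliance"]

@dataclass
class DfdElement:
    """A data-flow-diagram element plus the threats and requirements recorded for it."""
    name: str
    kind: str                      # e.g. "entity", "process", "data store", "data flow"
    threats: dict = field(default_factory=dict)

    def record(self, category: str, description: str, requirement: str):
        assert category in LINDDUN, f"unknown LINDDUN category: {category}"
        self.threats.setdefault(category, []).append((description, requirement))

profile_store = DfdElement("user profile store", "data store")
profile_store.record("Linkability",
                     "profiles can be linked across sessions via stable identifiers",
                     "replace stable identifiers with per-session pseudonyms")
```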
Subjects/Keywords: LINDDUN; privacy threat; privacy requirement; improvement.
APA (6th Edition):
Lu, P. (2017). Apply the LINDDUN framework for privacy requirement analysis
. (Masters Thesis). Tampere University. Retrieved from https://trepo.tuni.fi/handle/10024/100871
Chicago Manual of Style (16th Edition):
Lu, Pengfei. “Apply the LINDDUN framework for privacy requirement analysis
.” 2017. Masters Thesis, Tampere University. Accessed January 19, 2021.
https://trepo.tuni.fi/handle/10024/100871.
MLA Handbook (7th Edition):
Lu, Pengfei. “Apply the LINDDUN framework for privacy requirement analysis
.” 2017. Web. 19 Jan 2021.
Vancouver:
Lu P. Apply the LINDDUN framework for privacy requirement analysis
. [Internet] [Masters thesis]. Tampere University; 2017. [cited 2021 Jan 19].
Available from: https://trepo.tuni.fi/handle/10024/100871.
Council of Science Editors:
Lu P. Apply the LINDDUN framework for privacy requirement analysis
. [Masters Thesis]. Tampere University; 2017. Available from: https://trepo.tuni.fi/handle/10024/100871
19.
MAC AONGHUSA, POL.
Personal privacy and online systems.
Degree: School of Computer Science & Statistics. Discipline of Computer Science, 2019, Trinity College Dublin
URL: http://hdl.handle.net/2262/86116
▼ A significant portion of the modern internet is funded by commercial return from customised content such as advertising where user interests are learned from users' online behaviour and used to display personalised content.
Privacy becomes a concern when personalisation reveals evidence of learning about sensitive topics a user would rather keep private. Examples of potentially sensitive topics we consider include health, finance and sexual orientation.
In this thesis we develop novel technologies allowing users to improve control over their personal privacy. We consider three aspects of privacy protection: i) detecting evidence of unwanted profiling, ii) assessing the potential impact of a threat, and iii) a flexible framework to help users take control of the flow of information used in personalisation.
We model online systems as black-box adversaries with unknown internal workings but with an objective to maximise commercial utility. In a black-box environment absolute measures of privacy are problematic, and so our formalism builds on a notion of privacy relative to a baseline. The relative models we develop have the advantage of being learnable from observation of the black-box system and so can be readily implemented as practical technologies for privacy threat detection, analysis and privacy defence, which we validate against data from well-known, real-world online systems.
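One simple way to picture a "privacy relative to a baseline" measure (an illustrative assumption, not the formalism developed in the thesis) is to compare the topic distribution of personalised output against a baseline distribution and flag sensitive topics whose share has grown; the topic names, threshold, and metric below are all hypothetical.

```python
def total_variation(p: dict, q: dict) -> float:
    """Total-variation distance between two topic distributions."""
    topics = set(p) | set(q)
    return 0.5 * sum(abs(p.get(t, 0.0) - q.get(t, 0.0)) for t in topics)

def flag_profiling(baseline: dict, personalized: dict, sensitive, threshold=0.05):
    """Report sensitive topics whose share of personalised content exceeds
    the baseline share by more than `threshold`."""
    return [t for t in sensitive
            if personalized.get(t, 0.0) - baseline.get(t, 0.0) > threshold]

baseline = {"sports": 0.4, "news": 0.4, "health": 0.1, "finance": 0.1}
observed = {"sports": 0.3, "news": 0.3, "health": 0.3, "finance": 0.1}
print(total_variation(baseline, observed))                        # ~0.2
print(flag_profiling(baseline, observed, {"health", "finance"}))  # ['health']
```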
Advisors/Committee Members: Leith, Douglas.
Subjects/Keywords: Privacy Search Anonymity Security; Personal Privacy
APA (6th Edition):
MAC AONGHUSA, P. (2019). Personal privacy and online systems. (Thesis). Trinity College Dublin. Retrieved from http://hdl.handle.net/2262/86116
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
MAC AONGHUSA, POL. “Personal privacy and online systems.” 2019. Thesis, Trinity College Dublin. Accessed January 19, 2021.
http://hdl.handle.net/2262/86116.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
MAC AONGHUSA, POL. “Personal privacy and online systems.” 2019. Web. 19 Jan 2021.
Vancouver:
MAC AONGHUSA P. Personal privacy and online systems. [Internet] [Thesis]. Trinity College Dublin; 2019. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/2262/86116.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
MAC AONGHUSA P. Personal privacy and online systems. [Thesis]. Trinity College Dublin; 2019. Available from: http://hdl.handle.net/2262/86116
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
20.
Liu, Changchang.
Rethinking the Science of Statistical Privacy.
Degree: PhD, 2019, Princeton University
URL: http://arks.princeton.edu/ark:/88435/dsp017d278w81g
▼ Nowadays, more and more data, such as social network data, mobility data, business data, and medical data, are shared or made public to enable real-world applications. Such data is likely to contain sensitive information and thus needs to be obfuscated prior to release to protect privacy. However, existing statistical data privacy mechanisms in the security community have several weaknesses: 1) they are limited to protecting sensitive information in the static scenario and cannot generally be applied to accommodate temporal dynamics; with the increasing development of data science, a large amount of sensitive data such as personal social relationships is becoming public, making the privacy concerns of a time series of data more and more challenging; 2) these privacy mechanisms do not explicitly capture correlations, leaving open the possibility of inference attacks; in many real-world scenarios, data tuple dependence/correlation occurs naturally in datasets due to social, behavioral, and genetic interactions between users; 3) there are very few practical guidelines on how to apply existing statistical privacy notions in practice, and a key challenge is how to set an appropriate value for the privacy parameters.
In this thesis, we aim to overcome these weaknesses to provide privacy guarantees for protecting dynamic and dependent (correlated) data structures. We also aim to discover useful and interpretable guidelines for selecting proper values of parameters in state-of-the-art privacy-preserving frameworks. Furthermore, we investigate how auxiliary information, in the form of prior distributions of the database and correlation across records and time, can influence the proper choice of the privacy parameters. Specifically, we 1) propose the design of a privacy-preserving system called LinkMirage that mediates access to dynamic social relationships in social networks while effectively supporting social graph-based data analytics; 2) explicitly incorporate structural properties of data into current differential privacy metrics and mechanisms, to enable privacy-preserving data analytics for dependent/correlated data; and 3) provide a quantitative analysis of how hypothesis testing can guide the choice of the privacy parameters in an interpretable manner for differential privacy and other statistical privacy frameworks.
Overall, our work aims to place the field of statistical data privacy on a firm analytic foundation that is coupled with the design of practical systems.
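For context, the differential privacy mechanisms this line of work builds on typically perturb query answers with calibrated noise; the standard Laplace mechanism for a counting query is sketched below. The choice of epsilon is exactly the kind of parameter-selection question the thesis addresses; the values used here are arbitrary and the function names are mine.

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, rng=None) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.
    A counting query has sensitivity 1, so the noise scale is 1/epsilon."""
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(42)
for eps in (0.1, 1.0, 10.0):        # smaller epsilon -> stronger privacy, more noise
    noisy = [laplace_count(1000, eps, rng) for _ in range(5)]
    print(eps, [round(x, 1) for x in noisy])
```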
Advisors/Committee Members: Mittal, Prateek (advisor).
Subjects/Keywords: Auxiliary Information; Differential Privacy; Statistical Privacy
APA (6th Edition):
Liu, C. (2019). Rethinking the Science of Statistical Privacy
. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp017d278w81g
Chicago Manual of Style (16th Edition):
Liu, Changchang. “Rethinking the Science of Statistical Privacy
.” 2019. Doctoral Dissertation, Princeton University. Accessed January 19, 2021.
http://arks.princeton.edu/ark:/88435/dsp017d278w81g.
MLA Handbook (7th Edition):
Liu, Changchang. “Rethinking the Science of Statistical Privacy
.” 2019. Web. 19 Jan 2021.
Vancouver:
Liu C. Rethinking the Science of Statistical Privacy
. [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2021 Jan 19].
Available from: http://arks.princeton.edu/ark:/88435/dsp017d278w81g.
Council of Science Editors:
Liu C. Rethinking the Science of Statistical Privacy
. [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp017d278w81g

University of Notre Dame
21.
Claire McKay Bowen.
Data Privacy via Integration of Differential Privacy and Data Synthesis.
Degree: Applied and Computational Mathematics and Statistics, 2018, University of Notre Dame
URL: https://curate.nd.edu/show/n009w092301
▼ When sharing data among collaborators or releasing data publicly, one of the crucial concerns is the extreme risk of exposing personal information of the individuals who contribute to the data. Many statistical methods of data privacy and confidentiality have little to no means of measuring an altered data set’s privacy guarantee. Differential privacy, a condition on data-releasing algorithms, quantifies disclosure risk, but it is traditionally used in query-based privacy methods rather than in a synthetic dataset release. My dissertation develops and explores various methods of incorporating differential privacy into synthetic data generation using predicted values within a Bayesian framework. I call these methods differentially private data synthesis (DIPS) techniques. In my dissertation, I first conducted a comparative study of several DIPS approaches on various data types as well as a case study on Male Fertility data. Next, I created a method (called SPECKS) to compare DIPS data to real-life data, and another method to improve the statistical inferences of non-parametric DIPS approaches. These methods were tested on voter registration data. Finally, I developed a DIPS technique for social network data called Noisy Edges and Traits (NET) and applied it to two real-life data sets.
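A minimal, textbook-style flavour of differentially private data synthesis (not necessarily one of the dissertation's Bayesian methods) is to perturb a histogram of the confidential data with Laplace noise and then sample synthetic records from the normalised noisy histogram, as sketched below; the bin count, epsilon, and data are illustrative assumptions.

```python
import numpy as np

def dp_synthetic_sample(data, bins, epsilon, n_synth, rng=None):
    """Crude differentially private synthesis: add Laplace noise to a histogram
    (sensitivity 1 per record), clip to non-negative, renormalise, and sample."""
    rng = rng or np.random.default_rng(0)
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    probs = np.clip(noisy, 0, None)
    probs = probs / probs.sum()
    choices = rng.choice(len(probs), size=n_synth, p=probs)
    # draw each synthetic value uniformly within its chosen bin
    return rng.uniform(edges[choices], edges[choices + 1])

private = np.random.default_rng(1).normal(50, 10, size=5000)
synthetic = dp_synthetic_sample(private, bins=20, epsilon=0.5, n_synth=5000)
print(round(private.mean(), 1), round(synthetic.mean(), 1))
```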
Advisors/Committee Members: Fang Liu, Research Director.
Subjects/Keywords: data privacy; differential privacy; data synthesis
APA (6th Edition):
Bowen, C. M. (2018). Data Privacy via Integration of Differential Privacy and Data Synthesis. (Thesis). University of Notre Dame. Retrieved from https://curate.nd.edu/show/n009w092301
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Bowen, Claire McKay. “Data Privacy via Integration of Differential Privacy and Data Synthesis.” 2018. Thesis, University of Notre Dame. Accessed January 19, 2021.
https://curate.nd.edu/show/n009w092301.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Bowen, Claire McKay. “Data Privacy via Integration of Differential Privacy and Data Synthesis.” 2018. Web. 19 Jan 2021.
Vancouver:
Bowen CM. Data Privacy via Integration of Differential Privacy and Data Synthesis. [Internet] [Thesis]. University of Notre Dame; 2018. [cited 2021 Jan 19].
Available from: https://curate.nd.edu/show/n009w092301.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Bowen CM. Data Privacy via Integration of Differential Privacy and Data Synthesis. [Thesis]. University of Notre Dame; 2018. Available from: https://curate.nd.edu/show/n009w092301
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Duke University
22.
Chen, Yan.
Applying Differential Privacy with Sparse Vector Technique.
Degree: 2018, Duke University
URL: http://hdl.handle.net/10161/16906
▼ In today's fast-paced, developing digital world, a wide range of services such as web services, social networks, and mobile devices collect a large amount of personal data from their users. Although sharing and mining large-scale personal data can help improve the functionality of these services, it also raises privacy concerns for the individuals who contribute to the data. Differential privacy has emerged as a de facto standard for analyzing sensitive data with strong provable privacy guarantees for individuals. There is a rich literature that has led to the development of differentially private algorithms for numerous data analysis tasks. The privacy proofs of these algorithms are mainly based on (a) the privacy guarantees of a small number of primitives, and (b) a set of composition theorems that help reason about the privacy guarantee of algorithms built from those primitives. In this dissertation, we focus on the usage of one popular differentially private primitive, the Sparse Vector Technique, which can support multiple queries with limited privacy cost. First, we revisit the original Sparse Vector Technique and its variants, proving that many of its variants violate the definition of differential privacy. Furthermore, we design an attack algorithm demonstrating that an adversary with access to these "broken" variants can reconstruct the true database with high accuracy. Next, we utilize the original Sparse Vector Technique primitive to design new solutions for practical problems. We propose the first algorithms to publish regression diagnostics under differential privacy for evaluating regression models. Specifically, we create differentially private versions of residual plots for linear regression as well as receiver operating characteristic (ROC) curves and binned residual plots for logistic regression. Comprehensive empirical studies show these algorithms are effective and enable users to evaluate the correctness of their model assumptions. We then use the Sparse Vector Technique as a key primitive to design a novel algorithm for differentially private stream processing, supporting queries on streaming data. This novel algorithm is data adaptive and can simultaneously support multiple queries, such as unit counts, sliding windows, and event monitoring, over a single stream resolution or multiple stream resolutions. Through extensive evaluations, we show that this new technique outperforms state-of-the-art algorithms that are specialized to particular query types.
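The original Sparse Vector Technique (the AboveThreshold variant), which the dissertation treats as the privacy-safe baseline, can be sketched as follows for sensitivity-1 queries. The noise scales follow the standard textbook formulation; the threshold, epsilon, and query values are illustrative.

```python
import numpy as np

def above_threshold(queries, threshold, epsilon, rng=None):
    """Standard Sparse Vector Technique (AboveThreshold) for sensitivity-1 queries:
    answers a stream with 'above'/'below' and halts at the first 'above',
    spending only epsilon in total regardless of stream length."""
    rng = rng or np.random.default_rng(0)
    noisy_threshold = threshold + rng.laplace(scale=2.0 / epsilon)
    answers = []
    for q in queries:
        if q + rng.laplace(scale=4.0 / epsilon) >= noisy_threshold:
            answers.append("above")
            break        # stopping here is what keeps the total privacy cost bounded
        answers.append("below")
    return answers

print(above_threshold([3, 7, 12, 40, 9], threshold=25, epsilon=1.0))
```

Many of the "broken" variants the dissertation analyzes change details such as the noise scales, the resampling of the threshold noise, or the stopping rule, which is precisely where their privacy proofs fail.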
Advisors/Committee Members: Machanavajjhala, Ashwin (advisor).
Subjects/Keywords: Computer science; Data Privacy; Differential Privacy
APA (6th Edition):
Chen, Y. (2018). Applying Differential Privacy with Sparse Vector Technique
. (Thesis). Duke University. Retrieved from http://hdl.handle.net/10161/16906
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Chen, Yan. “Applying Differential Privacy with Sparse Vector Technique
.” 2018. Thesis, Duke University. Accessed January 19, 2021.
http://hdl.handle.net/10161/16906.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Chen, Yan. “Applying Differential Privacy with Sparse Vector Technique
.” 2018. Web. 19 Jan 2021.
Vancouver:
Chen Y. Applying Differential Privacy with Sparse Vector Technique
. [Internet] [Thesis]. Duke University; 2018. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/10161/16906.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Chen Y. Applying Differential Privacy with Sparse Vector Technique
. [Thesis]. Duke University; 2018. Available from: http://hdl.handle.net/10161/16906
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Maryland
23.
Groce, Adam Dowlin.
New Notions and Mechanisms for Statistical Privacy.
Degree: Computer Science, 2014, University of Maryland
URL: http://hdl.handle.net/1903/15813
▼ Many large databases of personal information currently exist in the hands of corporations, nonprofits, and governments. The data in these databases could be used to answer any number of important questions, aiding in everything from basic research to day-to-day corporate decision-making. These questions must be answered while respecting the privacy of the individuals whose data are being used. However, even defining privacy in this setting can be difficult. The standard definition in the field is differential privacy. During the years since its introduction, a wide variety of query algorithms have been found that can achieve meaningful utility while at the same time protecting the privacy of individuals. However, differential privacy is a very strong definition, and in some settings it can seem too strong. Given the difficulties involved in getting differentially private output for all desirable queries, many have looked for ways to weaken differential privacy without losing its meaningful privacy guarantees.
Here we discuss two such weakenings. The first is computational differential privacy, originally defined by Mironov et al. We find the promise of this weakening to be limited. We show two results that severely curtail the potential for computationally private mechanisms to add any utility over those that achieve standard differential privacy when working in the standard setting with all data held by a single entity.
We then propose our own weakening, coupled-worlds privacy. This definition is meant to capture the cases where reasonable bounds can be placed on the adversary's certainty about the data (or, equivalently, the adversary's auxiliary information). We discuss the motivation for the definition, its relationship to other definitions in the literature, and its useful properties. Coupled-worlds privacy is actually a framework with which specific definitions can be instantiated, and we discuss a particular instantiation, distributional differential privacy, which we believe is of particular interest.
Having introduced this definition, we then seek new distributionally differentially private query algorithms that can release useful information without the need to add noise, as is necessary when satisfying differential privacy. We show that one can release a variety of query outputs with distributional differential privacy, including histograms, sums, and least-squares regression lines.
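As a point of reference for the "without the need to add noise" claim, the sketch below shows the standard epsilon-differentially-private release of a sum, which does add Laplace noise scaled to the clipped contribution range; distributional differential privacy, as described above, aims to release such statistics accurately without this explicit noise when the adversary's auxiliary information is suitably bounded. The clipping bounds, epsilon, and data are illustrative assumptions.

```python
import numpy as np

def dp_sum(values, lower, upper, epsilon, rng=None):
    """Standard epsilon-DP release of a sum: clip each contribution to [lower, upper]
    so the sensitivity is (upper - lower), then add Laplace noise at that scale."""
    rng = rng or np.random.default_rng(0)
    clipped = np.clip(values, lower, upper)
    return clipped.sum() + rng.laplace(scale=(upper - lower) / epsilon)

incomes = np.random.default_rng(2).gamma(shape=2.0, scale=25_000, size=1_000)
print(round(dp_sum(incomes, 0, 200_000, epsilon=1.0), 0))
```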
Advisors/Committee Members: Katz, Jonathan (advisor).
Subjects/Keywords: Computer science; Cryptography; Differential Privacy; Privacy
APA (6th Edition):
Groce, A. D. (2014). New Notions and Mechanisms for Statistical Privacy. (Thesis). University of Maryland. Retrieved from http://hdl.handle.net/1903/15813
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Groce, Adam Dowlin. “New Notions and Mechanisms for Statistical Privacy.” 2014. Thesis, University of Maryland. Accessed January 19, 2021.
http://hdl.handle.net/1903/15813.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Groce, Adam Dowlin. “New Notions and Mechanisms for Statistical Privacy.” 2014. Web. 19 Jan 2021.
Vancouver:
Groce AD. New Notions and Mechanisms for Statistical Privacy. [Internet] [Thesis]. University of Maryland; 2014. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/1903/15813.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Groce AD. New Notions and Mechanisms for Statistical Privacy. [Thesis]. University of Maryland; 2014. Available from: http://hdl.handle.net/1903/15813
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

NSYSU
24.
Lai, Yi-jen.
The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management.
Degree: Master, Institute of Marketing Communication, 2015, NSYSU
URL: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727115-211706
▼ For Facebook users, the desire to be popular and their privacy concerns have made privacy management a paradoxical issue. This study explores how users' need for popularity and privacy concerns influence their behaviors to manage privacy on Facebook within the framework of Communication Privacy Management theory (CPM). Using survey data collected from Facebook users (N = 543), this study found that social privacy concern positively influenced the behavior of boundary permeability, but no significant relation was found between social privacy concern and the behaviors of boundary ownership or boundary linkage; need for popularity on Facebook positively influenced the behaviors of boundary permeability and boundary linkage, but there was no significant relation between need for popularity and the behavior of boundary ownership. Privacy management served as a key factor in weighing the risks and benefits on Facebook, and the results supported CPM. This study also provides a new interpretation of the "privacy paradox" (the inconsistency between privacy attitudes and disclosure behavior) by verifying that privacy management pursues the optimal level of privacy that fulfills users' need for disclosure while allowing them to avoid the negative or unsatisfactory feelings that result from over-disclosure.
Advisors/Committee Members: Ting-Peng Liang (chair), Shao-Jung Wang (committee member), An-Shou Cheng (chair).
Subjects/Keywords: disclosure; privacy; social privacy concern; need for popularity; privacy management; privacy paradox; Facebook
APA (6th Edition):
Lai, Y. (2015). The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727115-211706
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Lai, Yi-jen. “The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management.” 2015. Thesis, NSYSU. Accessed January 19, 2021.
http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727115-211706.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Lai, Yi-jen. “The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management.” 2015. Web. 19 Jan 2021.
Vancouver:
Lai Y. The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management. [Internet] [Thesis]. NSYSU; 2015. [cited 2021 Jan 19].
Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727115-211706.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Lai Y. The Tension between Benefit and Risk on Facebook: The Effect of Need for Popularity and Privacy Concern on Facebook Privacy Management. [Thesis]. NSYSU; 2015. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727115-211706
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Tulane University
25.
Fenske, Ellis.
Anonymity and Linkability.
Degree: 2018, Tulane University
URL: https://digitallibrary.tulane.edu/islandora/object/tulane:79040
▼ This thesis considers systems for anonymous communication between users of a cybersystem. Specifically, we consider the scenario where communications generated by the same user repeatedly over time can or must be linked. Linked user behavior can leak information, which adversaries can use to de-anonymize users. Analyzing linked behavior can also generate information about the use of anonymity protocols that can be valuable for research, leading to more effective protocols. But techniques to collect such data must include assurances that the methods and outputs do not compromise user privacy.
A main result of this thesis is an anonymity protocol called Private Set-Union Cardinality, designed to aggregate linked private user data safely. We prove that Private Set-Union Cardinality securely calculates the noisy cardinality of the union of a collection of distributed private data sets. This protocol is intended to take measurements in real-world anonymity systems like Tor, and we prove it is secure under general concurrent composition, even if a majority of the participants are dishonest.
The remaining results analyze path selection in anonymous routing systems. To obtain our results, we develop a mathematical framework to measure information leakage during repeated linkable path selection and propose new metrics: a radius that measures worst-case behavior, and a neighborhood graph that visualizes degradation of the system over time as a whole. We use these metrics to derive theoretical upper bounds on an adversary's accuracy in de-anonymization.
Finally, we investigate an attack where users can be de-anonymized due to the information an adversary learns when failing to observe some event. We call these occurrences non-observations and we develop a theory of non-observations in anonymous routing systems, deriving theoretical bounds on the information leakage due to this behavior in the general case and for Tor.
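The target functionality of Private Set-Union Cardinality can be pictured with a trusted aggregator that unions the parties' sets and adds Laplace noise; the point of the actual protocol, omitted here, is to compute the same noisy count without any party (or aggregator) ever seeing the raw sets. The names and noise scale below are illustrative assumptions, not the thesis's parameters.

```python
import numpy as np

def noisy_union_cardinality(per_relay_sets, epsilon, rng=None):
    """Trusted-aggregator stand-in for Private Set-Union Cardinality:
    |union of all sets| plus Laplace noise (a single user at a single relay
    changes the union size by at most 1, hence scale 1/epsilon)."""
    rng = rng or np.random.default_rng(0)
    union = set().union(*per_relay_sets)
    return len(union) + rng.laplace(scale=1.0 / epsilon)

relays = [{"u1", "u2", "u3"}, {"u2", "u4"}, {"u4", "u5", "u6"}]
print(round(noisy_union_cardinality(relays, epsilon=0.5), 1))
```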
Advisors/Committee Members: Mislove, Michael (Thesis advisor), School of Science & Engineering Mathematics (Degree granting institution).
Subjects/Keywords: Anonymity; Privacy; Cryptography
APA (6th Edition):
Fenske, E. (2018). Anonymity and Linkability. (Thesis). Tulane University. Retrieved from https://digitallibrary.tulane.edu/islandora/object/tulane:79040
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Fenske, Ellis. “Anonymity and Linkability.” 2018. Thesis, Tulane University. Accessed January 19, 2021.
https://digitallibrary.tulane.edu/islandora/object/tulane:79040.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Fenske, Ellis. “Anonymity and Linkability.” 2018. Web. 19 Jan 2021.
Vancouver:
Fenske E. Anonymity and Linkability. [Internet] [Thesis]. Tulane University; 2018. [cited 2021 Jan 19].
Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:79040.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Fenske E. Anonymity and Linkability. [Thesis]. Tulane University; 2018. Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:79040
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
26.
Pu, Shi.
GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols.
Degree: 2013, Texas Digital Library
URL: http://hdl.handle.net/1969;
http://hdl.handle.net/2249.1/66738
▼ In (two-party) privacy-preserving applications, two users use encrypted inputs to compute a function without giving out the plaintext of their input values. Privacy-preserving computing algorithms have to utilize a large amount of computing resources to handle the encryption and decryption operations. In this dissertation, we study optimal utilization of computing resources on the graphics processing unit (GPU) architecture for privacy-preserving protocols based on secure function evaluation (SFE) and on Elliptic Curve Cryptography (ECC) and related algorithms. A number of privacy-preserving protocols are implemented, including private set intersection (PSI), secret handshaking (SH), secure edit distance (ED), and Smith-Waterman (SW) problems. PSI is chosen to represent ECC point-multiplication-related computations, SH to represent bilinear pairing, and the last two to represent SFE-based dynamic programming (DP) problems. They represent different types of computations, so that an in-depth understanding of the benefits and limitations of the GPU architecture for privacy-preserving protocols is gained.
For the SFE-based ED and SW problems, a wavefront parallel computing model on the CPU-GPU architecture under the semi-honest security model is proposed. Low-level parallelization techniques for a GPU-based gate (de-)garbler, synchronized parallel memory access, pipelining, and general GPU resource mapping policies are developed. This dissertation shows that the GPU architecture can be fully utilized to speed up the SFE-based ED and SW algorithms, which are constructed with billions of garbled gates, on a contemporary GPU card (GTX-680) with very little waste of processing cycles or memory space.
For the PSI and SH protocols and the underlying ECC algorithms, the analysis in this research shows that the conventional Montgomery-based number system is more friendly to the GPU architecture than the Residue Number System (RNS). Analysis of the experimental results further shows that lazy reduction in higher extension fields has performance benefits only when the GPU architecture has enough fast memory. The resulting Elliptic curve Arithmetic GPU Library (EAGL) can run 3350.9 R-ate (bilinear) pairings/sec and 47,000 point multiplications/sec at the 128-bit security level on one GTX-680 card. The primary performance bottleneck is found to be the lack of advanced memory management functions in the contemporary GPU architecture for bilinear pairing operations. Substantial performance gains can be expected when larger on-chip memory and/or more advanced memory prefetching mechanisms are supported in future generations of GPUs.
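The wavefront model exploits the fact that, in the edit-distance dynamic program, all cells on one anti-diagonal depend only on earlier anti-diagonals and are therefore mutually independent. The plain, non-secure, CPU-only sketch below shows that dependency structure; the dissertation evaluates the same pattern over garbled circuits on a GPU, which is not reproduced here.

```python
def edit_distance_wavefront(a: str, b: str) -> int:
    """Compute edit distance by sweeping anti-diagonals: every cell on one
    anti-diagonal depends only on earlier diagonals, so (on a GPU) all cells
    of a diagonal could be evaluated in parallel."""
    n, m = len(a), len(b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i
    for j in range(m + 1):
        d[0][j] = j
    for diag in range(2, n + m + 1):                  # one wavefront per anti-diagonal
        for i in range(max(1, diag - m), min(n, diag - 1) + 1):
            j = diag - i
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[n][m]

print(edit_distance_wavefront("kitten", "sitting"))   # 3
```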
Advisors/Committee Members: Liu, Jyh-charn (advisor).
Subjects/Keywords: Privacy-preserving computing
APA (6th Edition):
Pu, S. (2013). GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols. (Thesis). Texas Digital Library. Retrieved from http://hdl.handle.net/1969; http://hdl.handle.net/2249.1/66738
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Pu, Shi. “GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols.” 2013. Thesis, Texas Digital Library. Accessed January 19, 2021.
http://hdl.handle.net/1969; http://hdl.handle.net/2249.1/66738.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Pu, Shi. “GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols.” 2013. Web. 19 Jan 2021.
Vancouver:
Pu S. GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols. [Internet] [Thesis]. Texas Digital Library; 2013. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/1969; http://hdl.handle.net/2249.1/66738.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Pu S. GPU-based Parallel Computing Models and Implementations for Two-party Privacy-preserving Protocols. [Thesis]. Texas Digital Library; 2013. Available from: http://hdl.handle.net/1969; http://hdl.handle.net/2249.1/66738
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Leiden University
27.
Wegewijs, L.
Privacy, Politics and the Fourth Estate.
Degree: 2014, Leiden University
URL: http://hdl.handle.net/1887/31824
▼ This thesis compares the use of frames in newspaper articles and parliamentary documents with regard to privacy-related events. The objectives are to clarify framing dynamics between Dutch media and parliament, and to answer privacy-issue-specific questions with regard to framing. Based on the two major events demarcating the research period, the 2001 World Trade Center attacks and Edward Snowden's NSA revelations in mid-2013, it is hypothesized that the way state authorities are framed changes over time from ensuring security towards violating privacy. It is further hypothesized that changes in privacy frames correspond between similar events and differ between distinct clusters of events. Finally, the 'who-follows-who' question is treated by means of a 'lead/lag' model that compares framing overlap between newspaper articles and parliamentary documents. The data used to achieve both objectives were acquired by coding two Dutch national newspapers (Telegraaf and Volkskrant) and written questions from the Dutch parliament for the period between January 1999 and March 2014. Researching this particular period makes it possible to assess the expected dynamics between both arenas between the above-mentioned landslide events. The data suggest that the expected change in the way state authorities are framed is absent. The lead/lag model suggests that, on average, the media lead parliament with regard to framing privacy-related events. The data furthermore suggest no shift over time in influence from one arena to the other. Unfortunately, the lead/lag model, as well as the long-term frame dynamics, provides only rough indicators for answering the research questions and assessing the stated hypotheses. Therefore, the insights provided are only tentative and call for further research to deepen understanding of privacy frames and framing dynamics between media and parliament.
Advisors/Committee Members: Meffert, Dr. M.F (advisor), Tromble, Dr. R.K (advisor).
Subjects/Keywords: Media; Framing; Privacy
APA (6th Edition):
Wegewijs, L. (2014). Privacy, Politics and the Fourth Estate. (Masters Thesis). Leiden University. Retrieved from http://hdl.handle.net/1887/31824
Chicago Manual of Style (16th Edition):
Wegewijs, L. “Privacy, Politics and the Fourth Estate.” 2014. Masters Thesis, Leiden University. Accessed January 19, 2021.
http://hdl.handle.net/1887/31824.
MLA Handbook (7th Edition):
Wegewijs, L. “Privacy, Politics and the Fourth Estate.” 2014. Web. 19 Jan 2021.
Vancouver:
Wegewijs L. Privacy, Politics and the Fourth Estate. [Internet] [Masters thesis]. Leiden University; 2014. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/1887/31824.
Council of Science Editors:
Wegewijs L. Privacy, Politics and the Fourth Estate. [Masters Thesis]. Leiden University; 2014. Available from: http://hdl.handle.net/1887/31824

Leiden University
28.
van der Heyden, Nathan.
Can Privacy Survive in the Digital Age.
Degree: 2020, Leiden University
URL: http://hdl.handle.net/1887/123191
This thesis will argue that our current conception of privacy is insufficient to properly survive the threats posed by technological innovation in modern society.
Advisors/Committee Members: Sleutels, Jan (advisor).
Subjects/Keywords: technology; privacy; data
APA (6th Edition):
van der Heyden, N. (2020). Can Privacy Survive in the Digital Age. (Masters Thesis). Leiden University. Retrieved from http://hdl.handle.net/1887/123191
Chicago Manual of Style (16th Edition):
van der Heyden, Nathan. “Can Privacy Survive in the Digital Age.” 2020. Masters Thesis, Leiden University. Accessed January 19, 2021.
http://hdl.handle.net/1887/123191.
MLA Handbook (7th Edition):
van der Heyden, Nathan. “Can Privacy Survive in the Digital Age.” 2020. Web. 19 Jan 2021.
Vancouver:
van der Heyden N. Can Privacy Survive in the Digital Age. [Internet] [Masters thesis]. Leiden University; 2020. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/1887/123191.
Council of Science Editors:
van der Heyden N. Can Privacy Survive in the Digital Age. [Masters Thesis]. Leiden University; 2020. Available from: http://hdl.handle.net/1887/123191

University of Waterloo
29.
He, Miao.
Privacy-Preserving Multi-Quality Charging in V2G network.
Degree: 2014, University of Waterloo
URL: http://hdl.handle.net/10012/8426
▼ The vehicle-to-grid (V2G) network, which provides electricity charging service to electric vehicles (EVs), is an essential part of the smart grid (SG). It can not only effectively reduce greenhouse gas emissions but also significantly enhance the efficiency of the power grid. Due to the limitations of local electricity resources, the quality of charging service can hardly be guaranteed for every EV in a V2G network. To this end, multi-quality charging is introduced to provide quality-guaranteed service (QGS) to the qualified EVs and best-effort service (BES) to the other EVs. To perform multi-quality charging, an evaluation of the EV's attributes is necessary to determine which level of charging service can be offered to the EV. However, the EV owner's privacy, such as real identity, lifestyle, location, and sensitive information in the attributes, may be violated during the evaluation and authentication. In this thesis, a privacy-preserving multi-quality charging (PMQC) scheme for V2G networks is proposed to evaluate the EV's attributes, authenticate its service eligibility, and generate its bill without revealing the EV's private information. Specifically, by adopting ciphertext-policy attribute-based encryption (CP-ABE), the EV can be evaluated for the proper charging service without disclosing its attribute privacy. By utilizing group signatures, the EV's real identity is kept confidential during authentication and bill generation. By hiding the EV's real identity, the EV owner's lifestyle privacy and location privacy are also preserved. Security analysis demonstrates that PMQC can achieve preservation of the EV's privacy, fine-grained access control on EVs for QGS, traceability of the EV's real identity, and secure revocation of the EV's service eligibility. Performance evaluation shows that PMQC achieves higher efficiency in authentication and verification than other schemes in terms of computation overhead. Based on PMQC, the EV's computation and storage overheads can be further reduced in the extended privacy-preserving multi-quality charging (ePMQC) scheme.
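The CP-ABE step in PMQC effectively gates the quality-guaranteed service on whether the EV's attributes satisfy a policy. The sketch below shows only that policy-evaluation idea with a toy AND/OR tree and no cryptography; in real CP-ABE the policy is embedded in the ciphertext and the check happens implicitly during decryption. The policy, attributes, and names are hypothetical, not taken from the thesis.

```python
# Toy policy evaluation only: real CP-ABE enforces this check cryptographically.
def satisfies(policy, attributes: set) -> bool:
    """policy is either an attribute string or a tuple ('AND'|'OR', [sub-policies])."""
    if isinstance(policy, str):
        return policy in attributes
    op, children = policy
    results = (satisfies(c, attributes) for c in children)
    return all(results) if op == "AND" else any(results)

qgs_policy = ("AND", ["registered_ev", ("OR", ["premium_contract", "battery_below_20pct"])])
ev_attrs = {"registered_ev", "battery_below_20pct"}
service = "QGS" if satisfies(qgs_policy, ev_attrs) else "BES"
print(service)   # QGS
```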
Subjects/Keywords: V2G; Privacy-Preserving
APA (6th Edition):
He, M. (2014). Privacy-Preserving Multi-Quality Charging in V2G network. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/8426
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
He, Miao. “Privacy-Preserving Multi-Quality Charging in V2G network.” 2014. Thesis, University of Waterloo. Accessed January 19, 2021.
http://hdl.handle.net/10012/8426.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
He, Miao. “Privacy-Preserving Multi-Quality Charging in V2G network.” 2014. Web. 19 Jan 2021.
Vancouver:
He M. Privacy-Preserving Multi-Quality Charging in V2G network. [Internet] [Thesis]. University of Waterloo; 2014. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/10012/8426.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
He M. Privacy-Preserving Multi-Quality Charging in V2G network. [Thesis]. University of Waterloo; 2014. Available from: http://hdl.handle.net/10012/8426
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
30.
Thomas, Bruce.
A legal framework for sharing customs intelligence through the single window system.
Degree: 2018, Victoria U of Wellington : Dissertations
URL: http://hdl.handle.net/10063/7719
▼ Some customs agencies are implementing electronic single window systems. These single window systems enable an importer or exporter to digitally transmit their transaction information to the customs administration. The single window system shares relevant information with other government agencies involved in the import or export process. It relieves the importer or exporter of the need to lodge transaction information separately with each government agency.
An international single window system is the interconnection of two or more national single window systems. It enables the exporter’s transaction information to be re-used in import processing, thereby reducing the amount of information required from importers.
For states that already have customs intelligence-sharing agreements, a single window system could be used to exchange intelligence information about the import and export transactions processed by the system.
Intelligence-sharing agreements can and should include transparent protection for human rights. The human rights relevant to this legal framework are access to justice, freedom from arbitrary search and seizure, freedom from torture and the right to privacy. The right to privacy is the human right most affected by intelligence-sharing.
This thesis proposes a legal framework to enable intelligence to be shared through a single window system with transparent terms for managing human rights. This thesis suggests that public confidence would be improved by showing how privacy and other human rights are treated in the rules for customs intelligence-sharing using the system proposed here.
Advisors/Committee Members: Angelo, Tony, Smith, Tony.
Subjects/Keywords: intelligence; privacy; transparency
APA (6th Edition):
Thomas, B. (2018). A legal framework for sharing customs intelligence through the single window system. (Doctoral Dissertation). Victoria U of Wellington : Dissertations. Retrieved from http://hdl.handle.net/10063/7719
Chicago Manual of Style (16th Edition):
Thomas, Bruce. “A legal framework for sharing customs intelligence through the single window system.” 2018. Doctoral Dissertation, Victoria U of Wellington : Dissertations. Accessed January 19, 2021.
http://hdl.handle.net/10063/7719.
MLA Handbook (7th Edition):
Thomas, Bruce. “A legal framework for sharing customs intelligence through the single window system.” 2018. Web. 19 Jan 2021.
Vancouver:
Thomas B. A legal framework for sharing customs intelligence through the single window system. [Internet] [Doctoral dissertation]. Victoria U of Wellington : Dissertations; 2018. [cited 2021 Jan 19].
Available from: http://hdl.handle.net/10063/7719.
Council of Science Editors:
Thomas B. A legal framework for sharing customs intelligence through the single window system. [Doctoral Dissertation]. Victoria U of Wellington : Dissertations; 2018. Available from: http://hdl.handle.net/10063/7719