Using Technology to Protect Privacy

Although computer-related technologies are increasing the ability of individuals, companies, and governments to invade personal privacy, technology can also be used to increase the level of privacy available to individuals. Several solutions are available, and which is appropriate depends on the level of trust placed in the recipients of private information.

Secrecy. Developments in cryptography have enabled information and communications to be encrypted, maintaining their secrecy in the presence of attacks by other individuals, companies, and even governments. In the latter part of the twentieth century, governments were reluctant to allow individuals access to strong encryption and therefore attempted to control the distribution of strong encryption technology.

The development of the personal computer made governmental restriction on strong encryption ultimately impossible, because individuals could purchase computers powerful enough to perform both symmetric and public key cryptography in software.
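To make this concrete, the following is a minimal sketch of both styles of encryption using the third-party Python `cryptography` library; the keys, messages, and parameter choices are illustrative only, not a recommended configuration.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Symmetric encryption: one shared key both encrypts and decrypts.
shared_key = Fernet.generate_key()
box = Fernet(shared_key)
token = box.encrypt(b"meet at noon")
assert box.decrypt(token) == b"meet at noon"

# Public-key encryption: anyone may encrypt to the public key,
# but only the holder of the private key can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = private_key.public_key().encrypt(b"meet at noon", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"
```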

Pretty Good Privacy (PGP) was one of the first programs to take advantage of the power of personal computers and offer individuals strong encryption for keeping documents secret from corporations and governments; it was so successful at this that its author, Phil Zimmermann, became the target of a U.S. criminal investigation for allegedly violating export restrictions on cryptography, an investigation that was eventually dropped without charges.

More recently, Transport Layer Security (TLS) has been developed for the World Wide Web to enable end-to-end privacy of communications between two participants. As such, TLS has been a key enabling technology in the development of e-commerce over the Internet, enabling buyers and sellers to exchange credit card information in a secure way.
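As an illustration, a TLS channel to a web server can be opened with nothing more than the Python standard library; the host name below is a placeholder, and a real client would normally let an HTTP library manage the connection.

```python
import socket
import ssl

host = "example.com"                    # placeholder server
context = ssl.create_default_context()  # verifies the server's certificate chain

with socket.create_connection((host, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print(tls_sock.version())       # e.g. 'TLSv1.3'
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls_sock.recv(256))       # first bytes of the response, encrypted in transit
```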

Cryptography is excellent at protecting privacy when data are controlled by their owner and shared with relatively few other parties, all of whom trust one another to keep the data secret. Sharing data with a larger set of participants increases the likelihood that trust is misplaced in at least one party, thereby jeopardizing data secrecy.

The number of participants with access to private information can often be reduced by implementing multilateral security, where information is stored in compartments accessible to only a few individuals, and the flow of information between compartments is restricted (see Computer Security).

A recent example of using multilateral security to protect privacy is the British Medical Association model for controlling access to electronic patient records. In this model, patients do not have a single electronic record; rather, they have a set of records (or compartments), each of which has a separate list of health-care workers with permission to read or append information to it. The flow of information between different records is restricted to prevent particularly sensitive information detailed in one record, such as a positive HIV test, from being introduced into other records.
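A toy sketch of this compartmented model is given below; the record names, staff roles, and permissions are hypothetical and merely stand in for the per-record access-control lists described above.

```python
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    """One compartment of a patient's notes, with its own access-control lists."""
    name: str
    readers: set = field(default_factory=set)    # staff allowed to read this compartment
    appenders: set = field(default_factory=set)  # staff allowed to append to it
    entries: list = field(default_factory=list)

    def read(self, clinician):
        if clinician not in self.readers:
            raise PermissionError(f"{clinician} may not read '{self.name}'")
        return list(self.entries)

    def append(self, clinician, note):
        if clinician not in self.appenders:
            raise PermissionError(f"{clinician} may not append to '{self.name}'")
        self.entries.append(note)

def copy_note(src, dst, clinician, note):
    """Information flows between compartments only if the clinician is
    cleared to read the source AND append to the destination."""
    if note not in src.read(clinician):
        raise ValueError("note not present in source record")
    dst.append(clinician, note)

# Hypothetical set-up: the HIV-test compartment has a much shorter ACL, so
# its contents cannot be copied into the general record by most staff.
general = PatientRecord("general notes", readers={"gp", "nurse"}, appenders={"gp", "nurse"})
hiv = PatientRecord("HIV test results", readers={"gum_clinic"}, appenders={"gum_clinic"})
general.append("gp", "blood pressure normal")
```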

Reciprocity. Configuring fine-grained access control parameters can be time consuming and difficult to get right. An alternative solution is to use reciprocity, where two or more entities agree to the mutual exchange of private data.

The exchange of information is symmetric if each party in the reciprocal transaction reveals the same kind of private data to all other parties; for example, three friends might share their current mobile phone locations with one another through their mobile phone operator.

An exchange of data is asymmetric if information is provided in return for knowledge of the recipient's identity. For example, in the United Kingdom, consumer credit ratings employ reciprocity: if a company queries an individual's credit rating, that fact is recorded in the database, so that when the individual later queries their own credit rating, they can see which companies examined their record.
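A minimal sketch of this kind of query logging is shown below; the class and method names are invented for illustration and do not correspond to any real credit-reference system.

```python
from datetime import datetime, timezone

class CreditRegister:
    """Toy register: every company query is recorded so the data subject
    can later see exactly who examined their file."""

    def __init__(self):
        self._ratings = {}   # person -> credit rating
        self._queries = {}   # person -> list of (timestamp, company)

    def add_person(self, person, rating):
        self._ratings[person] = rating
        self._queries[person] = []

    def query_rating(self, company, person):
        # The price of the answer: the company's identity is logged.
        self._queries[person].append((datetime.now(timezone.utc), company))
        return self._ratings[person]

    def who_viewed_me(self, person):
        # The data subject's side of the reciprocal exchange.
        return list(self._queries[person])

register = CreditRegister()
register.add_person("alice", 720)
register.query_rating("Acme Loans", "alice")
print(register.who_viewed_me("alice"))   # [(<timestamp>, 'Acme Loans')]
```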

Asymmetric reciprocity requires those requesting data to have a reputation worth protecting, so that they obey acceptable social conventions when asking for information. An abusive information request should therefore reduce the requester's reputation, and a requester with a sufficiently poor reputation forfeits future access. Symmetric reciprocity requires that the information shared by each participant be of similar value and therefore constitute a fair exchange. This is not always true; for example, the home phone number of a famous individual may be considered more valuable than that of an ordinary citizen.

A centralized authority is usually required to enforce reciprocity by authenticating both the identities of those who access private data and the actual content of the mutually shared information. Authentication guarantees the identities of the parties and ensures that the content of the exchanged information cannot be forged; the central authority must therefore be trusted by all parties who use the system.
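One way such an authority might work is sketched below, using HMACs from the Python standard library both to check pre-registered party credentials and to seal a record of what was exchanged; the protocol, names, and phone numbers are illustrative and do not describe any deployed system.

```python
import hashlib
import hmac
import json

class ExchangeAuthority:
    """Toy central authority: it authenticates both parties against
    pre-registered secrets and seals the exchanged content so that
    neither side can later forge what was shared."""

    def __init__(self, seal_key: bytes):
        self._secrets = {}         # party -> registered shared secret
        self._seal_key = seal_key  # assumed to be known only to the authority

    def register(self, party, secret):
        self._secrets[party] = secret

    def _authentic(self, party, proof):
        return hmac.compare_digest(self._secrets.get(party, b""), proof)

    def notarise(self, a, proof_a, data_a, b, proof_b, data_b):
        if not (self._authentic(a, proof_a) and self._authentic(b, proof_b)):
            raise PermissionError("a party failed authentication")
        record = json.dumps({"parties": [a, b], a: data_a, b: data_b},
                            sort_keys=True).encode()
        seal = hmac.new(self._seal_key, record, hashlib.sha256).hexdigest()
        return record, seal        # both parties retain the sealed record

authority = ExchangeAuthority(seal_key=b"authority-only-key")
authority.register("alice", b"alice-secret")
authority.register("bob", b"bob-secret")
record, seal = authority.notarise("alice", b"alice-secret", "+44 7700 900001",
                                  "bob", b"bob-secret", "+44 7700 900002")
```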

Anonymity. Sometimes no trust relationship exists between the owner of personal information and a third party, yet they still want to exchange information. For example, individuals may not want to reveal the contents of their medical records to drug companies; yet the development of pharmaceutical products clearly benefits from access to the information contained within such documents. It is often possible, through anonymization, to provide access to a subset of the data that satisfies the demands of the third party while protecting the privacy of the individual.

Anonymization protects an individual's privacy by removing all personally identifiable data before the data are delivered to an untrusted third party. Once data are successfully anonymized, an adversary cannot infer which real-world individual a record represents. Anonymization is not an easy process; it is not sufficient simply to remove explicit identifiers such as a name or telephone number, because a combination of other attributes may still enable a malicious data recipient to infer the individual represented by a record.

For example, in the set of medical records for Cambridge, England, there may be only one 42-year-old professor who has lost sight in one eye. If the data presented to the third party contain the individual's home city, profession, date of birth, and eye condition in sufficient detail, then it may be possible to associate the data with a real-world entity and therefore associate any other data in the record with a concrete identity. In this case, the privacy of the professor is effectively lost and the contents of his medical record, as presented to the third party, are revealed.

Successful anonymization may require the values in the released data set to be modified to prevent the third party from inferring the real-world identity associated with a record. For example, reducing the accuracy of the individual's age from 42 to the range 40-50 may prevent an attacker from associating an identity with the record.

Whether this reduction alone is sufficient depends on the data held in the other records (in the example above, it depends on the number of other professors between the ages of 40 and 50 with sight in only one eye). In general, anonymization of data is achieved through a thorough statistical analysis of the data set that takes into account the amount of information known by an attacker; such analysis is called Statistical Disclosure Control.
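The sketch below illustrates the kind of check such an analysis performs: ages are generalized into bands, and the size of the smallest group of records sharing the same combination of quasi-identifiers is computed (a group of size one means someone can be singled out). The attribute names, band width, and records are invented for illustration.

```python
from collections import Counter

# Quasi-identifiers an attacker might plausibly already know.
QUASI_IDENTIFIERS = ("city", "profession", "age_band")

def generalise_age(age, width=10):
    """Coarsen an exact age into a band, e.g. 42 -> '40-49'."""
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

def smallest_group(records, width=10):
    """Size of the smallest group of records sharing the same quasi-identifier
    combination; a tiny group means an attacker who knows those attributes
    can single an individual out."""
    groups = Counter()
    for r in records:
        banded = dict(r, age_band=generalise_age(r["age"], width))
        groups[tuple(banded[q] for q in QUASI_IDENTIFIERS)] += 1
    return min(groups.values())

# Illustrative records only.
records = [
    {"city": "Cambridge", "profession": "professor", "age": 42, "diagnosis": "monocular vision loss"},
    {"city": "Cambridge", "profession": "professor", "age": 47, "diagnosis": "asthma"},
    {"city": "Cambridge", "profession": "teacher",   "age": 44, "diagnosis": "hypertension"},
]
# Prints 1: the teacher is alone in their group, so wider bands or
# suppressing an attribute would be needed before release.
print(smallest_group(records))
```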





