The impact of technology on privacy

Problem (scientific and practical)

The scientific problem of regulating privacy in a world of constantly evolving technology consists in determining the level of regulation that effectively protects the rights of data subjects while still allowing businesses to use personal data on a lawful basis and encouraging the free cross-border movement of data.

The practical problem is finding the balance between businesses using technology to benefit from personal data and data subjects retaining actual, real control over their personal data.

AIM

The aim of this essay is to study the impact of evolving technology on the privacy rights of data subjects in the context of global regulation.

Objectives

  • to study the concept of privacy protection in the modern, technology-saturated world
  • to identify the technologies having the greatest impact on privacy
  • to study the key points of privacy regulation concerning the use of technology
  • to define the impact of technologies on privacy rights and the way they have amended regulations

Social science methods

In order to study the problem and reach the aim and objectives of the essay, social science methods such as observation and analysis will be used.

PLAN

1. Privacy protection in the modern world
1.1. Definition of personal data
1.2. Globalization of privacy regulation
1.3. Rights of data subjects
1.4. Privacy concepts and roles

2. Technologies with the greatest impact on privacy and data protection
2.1. Social media advertising based on personal data
2.2. Cloud computing services and privacy
2.3. Privacy tech as a new domain

Conclusions

1. PRIVACY PROTECTION IN THE MODERN WORLD

1.1. Definition of personal data 

In order to study and understand the impact of constantly evolving modern technologies on privacy, we first need to define the term "personal data".

This definition is very important, as it reflects the impact technologies have, and will have, on personal data protection and privacy in general. We can see that a data subject may be identified or identifiable by an identifier, and such an identifier, whether offline or online, is personal data. Hence, the main idea is that data becomes personal data when, upon its usage, the data controller can distinguish one data subject from another.

Offline and online identifiers relating to one data subject may be structured, studied and used for different aims, such as offering goods or services to the data subject or predicting his or her behavior. From the definition it can be seen that online identifiers, such as an IP address, browser information, or a nickname in an online game, may be considered personal data.

Moreover, identifiers that may be considered offline ones, such as the face of a data subject, may easily become online identifiers and be processed by automated means, for example by face recognition technology at a railway station or when unlocking one's own mobile device.

Thus, nowadays virtually any piece of information regarding a data subject may be considered personal data, which means it should be used in compliance with applicable laws.

If we go back in history and look at the definition of personal data in Directive 95/46/EC, we will find the following:

However, if we look at the definition of personal information in the LGPD, we will not find specific wording about online identifiers; yet the definition is very broad and by its nature includes online identifiers.


So, according to the LGPD, personal data means information related to an identified or identifiable natural person.

If we look at the CCPA, we will find specific clauses about online identifiers.

According to CCPA, personal information means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household:

So, as can be seen, because of the impact of technologies, data subjects can easily be identified or become identifiable by their "digital footprints", and regulators pay special attention, at the level of definitions, to classifying objects like browser history or interactions with advertisements as personal data.

With technology development, which means growth in the volume and velocity of personal information that may be processed by companies and governments, regulators all around the world pay special attention to risky categories of personal data, regulating them more strictly in order to protect data subjects.

1.2. Globalization of privacy regulation

Information technologies make it possible to transfer and process huge amounts of personal data in seconds, using cloud computing, big data technologies and other means. A company may be registered in the USA, for example in California, use software development and support services from Ukrainian private entrepreneurs, and have servers in Brazil. Yet the business of such a company may be focused on the EU market: for example, it may offer advertising services based on predictive analytics, which gives such companies access to a lot of data subjects' personal data.

As stated in Article 3 of the GDPR:

Thus, the regulator is empowered to penalize legal entities that are registered and acting under foreign (non-EU) law if they process the personal data of EU data subjects.

The same extraterritorial approach may be seen in Brazil, where Article 3 of the LGPD states that:

Information technologies are accessible both to big corporations such as Google and Meta and to small companies, which can run advertising campaigns targeting the citizens of almost every country in the world with just a few clicks, using the services of the above-mentioned corporations. Governments wish to be able to protect their own citizens' rights and, as a result, adopt legislation with extraterritorial effect.

Such terms and regulations directly affect companies that wish to enter, or are already working on, the respective markets. They need to adapt their privacy programs, and even the way they do business in the context of processing personal data, to the requirements of the applicable legislation. The idea of the extraterritorial effect of such regulations is to give data subjects control over their personal data regardless of the location, headquarters or servers of the respective controller or processor.

Nowadays, privacy programs in technology companies usually start with a data inventory, the aim of which is to understand personal data flows – from which countries the data comes – and, accordingly, to determine the scope of global compliance. A company registered in the USA may be subject to both the GDPR and the LGPD because it works, by means of modern information technologies and in real time, with the data of the respective data subjects.
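The first pass of such an inventory can be sketched in a few lines. The company systems, data flows and simplified extraterritoriality triggers below are illustrative assumptions, not a legal analysis:

```python
# Minimal data-inventory sketch: map personal data flows to the privacy
# regimes that may apply. The flows and the one-rule trigger (processing
# the data of subjects in a region pulls that regime into scope) are
# deliberately simplified assumptions.

FLOWS = [
    {"system": "ads-platform", "subjects_in": ["EU", "Brazil"]},
    {"system": "crm",          "subjects_in": ["USA"]},
]

REGIME_BY_REGION = {"EU": "GDPR", "Brazil": "LGPD", "USA": "CCPA"}

def compliance_scope(flows):
    """Return the set of regimes a company should assess, given its flows."""
    scope = set()
    for flow in flows:
        for region in flow["subjects_in"]:
            if region in REGIME_BY_REGION:
                scope.add(REGIME_BY_REGION[region])
    return scope

print(sorted(compliance_scope(FLOWS)))  # ['CCPA', 'GDPR', 'LGPD']
```

Even this toy version shows the point made above: a single US-registered company ends up in the scope of several regimes at once.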

1.3. Rights of data subjects 

The scope of rights granted to data subjects varies depending on the applicable legislation. Yet, when talking about the rights most affected by technology development, we should focus on the GDPR as an example of one of the best pieces of privacy legislation today.

  • Providing information “loud and clear”

One of the most fundamental requirements for controllers concerns the way information is provided to data subjects. As stated in Article 12 of the GDPR:

This requirement may sound a bit generic at first view, but it reflects one of the core principles of modern privacy regulation – giving the data subject real control over his or her own personal data. Such control is impossible without understanding (i) what is happening with the data and (ii) the exact moment and consequences of providing it.

One of the most interesting studies of customer journey maps in the context of "privacy touches" was conducted by IPSOS in September 2021.

They studied the impact of the quality of privacy touches (e.g. privacy UX texts, privacy UX design, etc.) on users of websites and other online software. One of their conclusions was that privacy communication has to be (i) memorable, (ii) manageable and (iii) meaningful. They used quantitative and qualitative social science methods, including surveys, and their results showed the necessity of complying with the requirements of Article 12 of the GDPR not just because of the importance of meeting legal obligations, but because of the reasonable expectations of the users of online services.


Here we may also talk about "click fatigue". If the user is literally forced to click too many times through different forms and buttons, giving or withholding consent, at some point such a user will just click any button without reading.

Moreover, privacy UX texts, as well as forms, may be designed in a way that makes the user consent to everything the controller wants to do with his or her personal data. Such cases are called "dark patterns" in privacy touch design. Consents to the processing of personal data obtained in the cases described above may not be considered a lawful basis for processing, as this would comply neither with the requirements of Article 12 of the GDPR (the Regulation) nor with the requirements for consent stated in Article 7 of the Regulation.

Thus, technologies have made it possible to process the personal data of millions of people in seconds: obtaining their consents online with just a few clicks, putting their personal data into "black boxes", generating new information about such data subjects, testing it on them, drawing new conclusions about their habits or behavior, and using those conclusions against them – targeting them with offerings that exploit the vulnerabilities found in the process of analysis and thereby discriminating against them. Article 12 of the GDPR is one of the fundamental provisions that gives data subjects the right to receive privacy communication in an understandable way.

  • Right to be forgotten    

One of the most interesting rights of data subjects is stated in Article 17 of the GDPR: the right to erasure, or right to be forgotten. In the era of search engines and internet archives, it is almost impossible to hide information that was once published on the internet. Yet such information may be a piece of personal data, and the data subject may object to it being displayed online when someone "googles" his or her name.

There are conditions for exercising this right, and there are also cases when it cannot be used – for example, if the information is processed in the exercise of the right of freedom of expression and information (Article 17(3) of the GDPR). The line between public interest and privacy rights has always been hard to draw, and in the era of the internet, fake news, bots and other "dark patterns" of journalism it has become even harder.

Moreover, if the person has grounds to exercise the right to be erased, there is an interesting obligation on the controller, stated in Article 17(2) of the GDPR, namely:

This is a very interesting attempt by the regulator to make the right to be forgotten work in practice: taking into account the way information may spread across the network, it would be almost impossible for the data subject to reach all the controllers and processors involved in processing his or her data.

  • Portability right 

The right to data portability is defined in Article 20 of the GDPR, namely:

There are conditions under which the data subject may exercise this right, namely when the processing is carried out by automated means, and depending on the legal basis on which the personal data is processed.

This right is a novelty of the GDPR, and its realization in practice is hard even in 2022, as different companies may have different file systems, levels of telecommunications and approaches to personal data usage. For example, a data subject who is a client of a streaming platform wants to switch to another operator and wishes his or her preferences, and the personal data used for content recommendations, to be transferred. If this can be done, the user will not lose his or her personalized experience in using this type of service. Yet, because of the lack of a unified standard and differing data privacy practices, the practical realization of this right is still evolving, as the technologies are.
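Article 20 speaks of a "structured, commonly used and machine-readable format". As a minimal sketch, with a hypothetical record layout of the streaming-platform example, such an export could simply serialize the subject's profile to JSON:

```python
import json

# Hypothetical user record; the field names are illustrative assumptions.
user_record = {
    "user_id": "u-1024",
    "email": "subject@example.com",
    "preferences": {"genres": ["drama", "sci-fi"], "language": "en"},
    "watch_history": ["title-1", "title-2"],
}

def export_portable(record):
    """Serialize the subject's data to a machine-readable format (JSON),
    in the spirit of an Article 20 GDPR portability request."""
    return json.dumps(record, indent=2, sort_keys=True)

portable = export_portable(user_record)
# The receiving platform can parse it back without loss:
assert json.loads(portable) == user_record
```

The hard part in practice is not the serialization itself but agreeing on common field semantics between operators, which is exactly the standardization gap noted above.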

  • Automated decision making and profiling 

Profiling and automated decision-making based on data subjects' personal data take place in numerous companies all over the world. Access to the respective technologies – big data, artificial intelligence, machine learning, etc. – makes it possible for businesses to exclude the human factor from most decisions regarding data subjects.

Such decisions vary dramatically and depend on the respective business processes and needs. For example, it may be resume scoring by automated means to determine whether a candidate meets the requirements; a decision on whether to grant a loan to a prospective borrower, based on his or her financial information; or a decision about which services to promote to a particular client of an online corporation, based on his or her behavior on the website or in the mobile app.

In Recital 71 of the GDPR, the right not to be subject to profiling and automated decision-making is described, namely:

The idea is that the data subject shall have the right to ask for human involvement in the decision-making process. Herewith, the question is always: "what do we have in the black box?" The AI or algorithm used to make decisions regarding a specific data subject must be fair, meaning it must not use the information in a way that could discriminate against the data subject. This is a question of AI ethics, and the impact of algorithms that work with personal data is hard to overestimate.
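The loan example above can be sketched with a human-in-the-loop gate. The scoring rule, threshold and routing policy are toy assumptions; the point is only the shape of the control:

```python
# Sketch of a human-review gate around an automated decision, in the
# spirit of Article 22 GDPR. The income-to-loan scoring rule and the
# 0.5 threshold are illustrative assumptions, not a real credit model.

def automated_score(applicant):
    """Toy scoring rule: income-to-loan ratio."""
    return applicant["income"] / applicant["loan_amount"]

def decide(applicant, human_review_requested=False):
    score = automated_score(applicant)
    decision = "approve" if score >= 0.5 else "decline"
    # An adverse automated outcome, or any request by the data subject,
    # routes the case to a human reviewer instead of taking effect.
    if decision == "decline" or human_review_requested:
        return {"decision": "pending_human_review", "score": score}
    return {"decision": decision, "score": score}

print(decide({"income": 40_000, "loan_amount": 60_000}))
```

Routing every adverse outcome to a human is one (deliberately cautious) policy choice; the Regulation only requires that the subject be able to obtain such intervention.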

  • Other privacy rights of data subjects

Other data subject rights, such as the right to restriction or the right to object, as defined in the GDPR, have been influenced by technology development as well, as their practical realization has shifted mostly online.

Thus, with technological development – meaning the collapse of the "online" and "offline" worlds into each other – more and more of data subjects' activity is conducted using information technologies. Even where services are provided offline, as with hotels or cafes, the companies providing them use online systems – ERP, CRM, etc. – to work with data subjects' personal data. This means data subjects have to be able to effectively exercise their rights online, and companies should test whether users can exercise each privacy right online in a convenient manner.

For example, withdrawing consent should be as easy as giving it, and so on.
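That symmetry requirement (Article 7(3) GDPR) can be expressed directly in the design of a consent store. The store layout below is an illustrative assumption; the point is that withdrawal is a single call, exactly as cheap as giving consent:

```python
# Sketch of symmetric consent handling: one call to give consent,
# one equally simple call to withdraw it. The storage layout is an
# illustrative assumption.

class ConsentStore:
    def __init__(self):
        self._consents = {}  # (subject_id, purpose) -> bool

    def give(self, subject_id, purpose):
        self._consents[(subject_id, purpose)] = True

    def withdraw(self, subject_id, purpose):
        # No extra forms, confirmations or friction: one call.
        self._consents[(subject_id, purpose)] = False

    def has_consent(self, subject_id, purpose):
        return self._consents.get((subject_id, purpose), False)

store = ConsentStore()
store.give("u-1", "marketing")
store.withdraw("u-1", "marketing")
print(store.has_consent("u-1", "marketing"))  # False
```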

1.4. Privacy concepts and roles

  • Data protection impact assessment

As technologies are constantly developing and their application in business may take very different forms, it is impossible for regulators to define every exact combination of technologies that companies may implement in data processing and to provide corresponding regulation.

Moreover, each application of new technologies in data processing may be almost unique, and it is not easy to evaluate the risks of using new technical algorithms for the purpose of deriving benefits from data subjects' personal data. The EU regulator has provided companies and authorities that may find themselves in such situations with an instrument called the data protection impact assessment. The details of this tool are set out in Article 35 of the GDPR; in particular, Article 35(1) states that:

So, the regulator intends to protect data subjects from the risks of implementing new technologies by obliging controllers and processors to conduct a form of risk analysis before implementing planned changes to data processing.

One of the cases where a data protection impact assessment is obligatory is defined in Article 35(3)(a) of the GDPR, namely:

Here we find that the above-mentioned case includes, inter alia, triggers like profiling and automated processing. As discussed earlier, the regulator pays special attention to protecting data subjects from biased and discriminatory decisions made solely by an algorithm, by giving data subjects the right not to be subject to such decisions and to demand human intervention.

Now, when we talk about the DPIA, we see the regulator's attempt to prevent possible bias and discrimination against a data subject based on the exploitation of vulnerabilities revealed by automated study of his or her personal data – habits, financial history, education, etc. This attempt is made not by granting rights to data subjects, but by imposing obligations on companies in the defined cases.

In Article 35(7) of the GDPR, the regulator defines the minimum requirements for a DPIA, namely:

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  • an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.

We see that such requirements mean the company must research the situation and the consequences of implementing the new technology with regard to the risks to data subjects. An integral part of the DPIA concerns the measures to ensure the protection of data subjects' rights and to demonstrate compliance. Failing to conduct the DPIA, or conducting it in the wrong way, may result in non-compliance with the GDPR.
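The four minimum elements quoted above lend themselves to a simple completeness check. The section names below are this essay's own shorthand, not an official template:

```python
# Sketch: completeness check for the four minimum DPIA elements of
# Article 35(7) GDPR. The section keys are shorthand assumptions.

REQUIRED_SECTIONS = (
    "processing_description",   # (a) systematic description and purposes
    "necessity_assessment",     # (b) necessity and proportionality
    "risk_assessment",          # (c) risks to rights and freedoms
    "mitigation_measures",      # (d) safeguards and security measures
)

def dpia_missing_sections(dpia: dict) -> list:
    """Return the required sections absent or left empty in a draft DPIA."""
    return [s for s in REQUIRED_SECTIONS if not dpia.get(s)]

draft = {
    "processing_description": "Face recognition at station entrances",
    "risk_assessment": "Risk of misidentification and discrimination",
}
print(dpia_missing_sections(draft))
# ['necessity_assessment', 'mitigation_measures']
```

A check like this only verifies that each element is present, of course; whether the assessment is substantively adequate remains a human judgment.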

Thus, the regulator answers the challenges posed by new technologies by acting on both companies and data subjects, trying to create scenarios under which data subjects' rights are protected while companies remain able to innovate.

It is interesting that in order to conduct a DPIA in the cases defined by the GDPR, a data protection officer has to be involved in the process. One of the triggers for designating a data protection officer (DPO) is the case defined in Article 37(1)(b) of the GDPR, namely:

This is the case for most "online" corporations and companies that work with personal data as a business asset. In such cases the use of technologies is obvious, and the involvement of the DPO in assessing the risks of using such technologies is one more layer in creating a safe legal environment for the market in the context of personal data use.

  • Data protection by design and by default

As the development of new software is a literally unstoppable process, it is important to ensure that new developments take into account the privacy requirements for such systems. If, for example, a planned mobile application is to be used by millions of children all around the world, special attention should be paid to a high level of personal data protection by technical measures while, at the same time, following the principle of data minimization.


The GDPR defines this concept as data protection by design and by default in its Article 25. Recital 78 of the GDPR gives us more information regarding appropriate technical and organizational measures in the context of developing technological products, namely, inter alia:

The whole idea of data protection by design and by default, in the context of technology's impact on privacy, is to make the market (the business) put privacy first when developing new features and products. Raising such a privacy culture in companies, where privacy programs have a real impact on new developments, means better protection of data subjects' rights without limiting companies' innovations.
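The children's-app example above can be sketched as two design choices: store only the fields the service needs (minimization), and default every optional processing switch to off. The field names are illustrative assumptions:

```python
# Sketch of data protection by design and by default (Article 25 GDPR):
# the signup path keeps only the fields strictly needed for the service,
# and optional processing is opt-in, defaulting to off. The field names
# are illustrative assumptions.

NEEDED_FOR_SERVICE = {"nickname", "birth_year"}  # data minimization

def register_child_account(submitted: dict) -> dict:
    # Drop everything not needed for the service itself.
    account = {k: v for k, v in submitted.items() if k in NEEDED_FOR_SERVICE}
    # Privacy-protective defaults: nothing is shared unless switched on.
    account["profile_public"] = False
    account["personalized_ads"] = False
    return account

account = register_child_account({
    "nickname": "star42",
    "birth_year": 2013,
    "home_address": "...",   # not needed -> never stored
})
print(account)
```

Dropping unneeded data at the boundary, rather than filtering it later, is the "by design" part: the system never holds what it never collected.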

2. TECHNOLOGIES WITH THE GREATEST IMPACT ON PRIVACY AND DATA PROTECTION

Technology development and its impact on all aspects of modern life is dramatic. We live in a world of shifted paradigms, where we provide personal data online, get services online, "live online" – yet "online" is just a way of transmitting information. "Online" is also very dynamic: as data subjects we face cases where, only seconds after providing some personal information, we are, in the best case, viewing customized advertising, while in many other cases our personal information may be used against us.

It’s hard to determine the pure different technologies, which impact our privacy mostly. It’s always a combination of different technical decisions in order to receive some business aim. For example, banks may analyze our transactions and then share such information with the advertisers, who will analyze it and provide us with the customized advertising. Is it legal? It depends on the case, but in any case, in the world, where technology develops fast, we, as data subjects, wish to keep control on our personal data. We want to know the way it is used and how it affects us at the end of the day. 

Regulators may pay special attention to particular technologies whose use may result in violations of data subjects' privacy rights.

It’s impossible to draw up a regulation, which will state the specific rules for each technology and its possible use. Regulators adapt, and one of the ways of such adaptation is stating the principles, logic, key procedures and barriers to protect the data subject, and, herewith, provide guidance for the specific cases. When we talk about GDPR, as one of the most interesting pieces of legislation, we should also take into account the guidelines from the European data protection office (EDPB). EDPB replaced Working party 29 and continued to help businesses from all over the world with questions regarding best practices in data processing. 

With that in mind, we may look at some interesting points in the guidelines provided by the EDPB regarding the use of new technologies and privacy.

2.1. Social media and customized advertising 

Who doesn’t have an account in one of the social networks? Facebook, Twitter, Tiktok and other social networks get our attention day after day. We scroll the news, we click like, we watch some type of content longer, then others. We create data about our habits, interests and lifestyle. We create data, which may be used to develop a specific advertising banner, specially for us. It may sound creepy, but combining different information about a data subject and putting it into some black box, called “AI” may lead to microtargeting. That can lead into putting a data subject into a “bubble”, where he/she may not see generic offers, he/she could see, because of specific tactics of the advertisers. 

The EDPB has provided Guidelines on the targeting of social media users.

They deserve special attention, as social media targeting has become a kind of showcase for the impact of technologies on privacy today.

The EDPB states that the following actors take part in this process: social media providers, their users, targeters, and other actors that may be involved in the targeting process.

One of the risks of social media targeting, as defined by the EDPB, is as follows:

This leads us to the point that, because of the complexity of remarketing, microtargeting and targeting in social media, data subjects may feel a lack of control over their personal data.

For example, a hypothetical user Bob from Vilnius "googled" a toaster, then visited some of the top-3 websites in Google; on some of them he put the toaster into the basket, and on one of them he actually ordered it, and it was delivered in two days. Now Bob, eating a toast, wants to catch up on news about his friends in his favorite social network and sees different ads. One of them is from the website where he almost bought a toaster, suggesting he come back and finish the purchase. Another is from the website where he actually bought the toaster, offering him a microwave at a discount. The third is from a company whose website he never visited, offering Bob the best bread for toasts.

All of these ads may be absolutely legal, and this case is commonplace rather than paranoid, as Bob saw ads based on only one episode of his browser history. In this case, technologies such as the Facebook pixel and Facebook audiences may have been used (if Bob saw the ads on Facebook). The question is: did Bob give the respective consents to cookies on those websites? Did those websites have the respective notices and terms in their privacy policies? Such a list of questions is far from complete.
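The consent question those websites face can be sketched as a simple gate in front of any third-party tag. The tag name and consent categories below are hypothetical stand-ins, not the actual API of any advertising pixel:

```python
# Sketch: fire a third-party tracking tag only if the visitor consented
# to the relevant cookie category. The tag name and the category keys
# are hypothetical stand-ins for tools like an advertising pixel.

def fire_tracking_tag(tag: str, consents: dict) -> bool:
    """Return True if the tag was (notionally) fired."""
    if not consents.get("advertising", False):
        return False  # no recorded consent -> no retargeting data leaves
    print(f"firing {tag}")
    return True

bob_consents = {"necessary": True, "advertising": False}
fired = fire_tracking_tag("ads-pixel", bob_consents)
print(fired)  # False: Bob never agreed to advertising cookies
```

Note the default in `consents.get(..., False)`: absence of a recorded choice is treated as no consent, which mirrors the GDPR position that consent must be an affirmative act.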

Bob may have the feeling that corporations watch him like some kind of Big Brother, and that will not be far from the truth. On the other hand, social media may be just a tool that companies use to reach a particular data subject, using his or her personal data, when they had no legal basis to do so.

One more privacy risk identified in the EDPB Guidelines is as follows:

Thus, here we see the same concern as with automated decision-making using profiling. The personal data of the data subject may be gathered in different ways (sometimes without the respective legal basis), combined, and used against the data subject. And in doing so, we can see the purpose of some human being or company (group of people): the technology itself merely allows personal data to be processed and insights to be derived; the way they are used is determined by the controller.

2.2. Cloud computing services and privacy

There are many different regulations all over the world regarding cloud services. In some cases it is prohibited to keep the personal data of data subjects who are citizens of a country on servers outside that country. In other cases, in order to transfer personal data outside a region, there has to be explicit consent from the data subject, and so on.


The development of cloud computing technologies and cloud services as we know them today has given businesses the opportunity to host data all over the world and access it in seconds. When we talk about privacy issues and regulation, most data storage companies have respective privacy policies and privacy documents that define them as processors of personal data. According to Recital 81 of the GDPR,

Thus, cloud providers have to achieve the respective level of data protection and must be able to demonstrate it. Data centers usually build their privacy programs around this processor status and the adequate technical and organizational measures needed to assure a high level of personal data protection. And since processors act only under the instructions of the controller regarding the processing of the respective personal data, the data processing agreements of cloud providers are usually very strict and non-negotiable.

2.3. Privacy tech as a new niche    

Day after day, new privacy laws are adopted all over the world. Some of them have extraterritorial effect; some have specific provisions. There are regulations, like the GDPR, which apply to a whole region, yet in the USA we may soon see a specific law in each state. It is hard for global companies to comply with all such laws at once and to stay up to date with their amendments.

Building a privacy program is a tough task, and implementing it in practice is even harder. When the GDPR came into force, law firms all over the world helped with GDPR compliance as they understood it, drafting various policies, procedures and data maps.

However, the privacy domain is not only about lawyers; it is about engineers, privacy champions and C-level managers as well. On the one hand, there has to be a person to lead the privacy program, such as a chief privacy officer, and there is also the data protection officer, whose designation can be obligatory. Another challenge is software for privacy governance and privacy management.

Can using such software – that is, the services of privacy tech startups – lead to absolute compliance with privacy laws? It depends. There is a lot of "tactical" software that may help, for example, with building a data map, implementing a "GDPR-compliant" cookie banner or managing DPAs with contractors. On the other hand, there is software that tries to help with privacy compliance in general.

The author of this essay believes that such software may help with reaching privacy compliance, but it is just a tool. It is part of the way to become GDPR-compliant, and such instruments have to be used smartly, in the hands of the chief privacy officer.

Thus, as can be seen, not only does technology affect privacy, stimulating new regulations and requirements for market players, but privacy also stimulates the development of new software technologies.

CONCLUSIONS

  • It’s impossible to stop development of new technologies. Yet, its usage may be regulated. Privacy as a fundamental right is regulated by specific rules, which put innovative businesses, working with personal data in a frame, aimed to protect the data subjects rights and not to harm the innovations as a concept. 
  • The impact of technologies to privacy is provided by the mix of different innovations, which are used differently in different cases worldwide, which forms privacy legislation in a way to meet new challenges of innovations.
  • Because of technology development, a lot of new concepts and roles were implemented into privacy regulations and it keeps updating, same as technologies.
  • Privacy rights realization on practice in real time became the key lacmus paper in determining the privacy compliance of the respective controller or processor.      
