The impact of technology on privacy

Problem (scientific and practical)

The scientific problem of regulating privacy in a world of constantly evolving technology lies in determining a level of regulation that effectively protects the rights of data subjects while still allowing businesses to use personal data on a lawful basis and encouraging the free cross-border movement of data.

The practical problem is finding the balance between businesses using technology to derive benefits from personal data and the real control data subjects retain over their personal data.


The aim of this essay is to study the impact of evolving technology on the privacy rights of data subjects in the context of global regulation. The objectives are:


  • to study the concept of privacy protection in the modern, technology-saturated world
  • to identify the technologies that have the greatest impact on privacy
  • to study the key points of privacy regulation concerning the use of technology
  • to define the impact of technologies on privacy rights and the ways in which it has changed regulations

Social science methods

To study the problem and reach the aim and objectives of this essay, social science methods such as observation and analysis will be used.


Privacy protection in the modern world
1.1. Definition of personal data
1.2. Globalization of regulation of privacy
1.3. Rights of data subjects
1.4. Privacy concepts and roles

Technologies with the greatest impact on privacy and data protection
2.1. Social media advertising, based on personal data
2.2. Cloud computing services and privacy
2.3. Privacy tech as a new domain



1.1. Definition of personal data 

To study and understand the impact of modern, constantly evolving technologies on privacy, it is first necessary to define the term "personal data".

According to the GDPR [1] (Article 4), ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

This definition is very important, as it reflects the impact technologies have, and will have, on personal data protection and privacy in general. We can see that a data subject may be identified or identifiable by an identifier, and such an identifier is personal data. It may be offline or online. Hence, the main idea is that data becomes personal data when, upon its use, the data controller can distinguish one data subject from another.

Offline and online identifiers relating to one data subject may be structured, studied and used for different aims, such as offering goods or services to the data subject or predicting their behavior. From the definition it can be seen that online identifiers, such as an IP address, browser information or a nickname in an online game, may be considered personal data.

At the same time, identifiers that might be considered offline, such as the data subject's face, can easily become online identifiers and be processed by automated means, for example by face recognition technology at a railway station or when unlocking one's own mobile device.

Thus, nowadays virtually any piece of information about a data subject may be considered personal data, which means it should be used in compliance with applicable laws.
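The point that data becomes personal once the controller can distinguish one subject from another can be illustrated with a small sketch (the function and sample values are hypothetical): hashing an identifier pseudonymizes it, but the controller can still single out the same person across visits, so the result remains personal data rather than anonymous data.

```python
import hashlib

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an IP address) with a stable hash."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# Two visits from the same IP address produce the same pseudonym,
# so the controller can still distinguish this subject from others:
visit_1 = pseudonymize("203.0.113.7")
visit_2 = pseudonymize("203.0.113.7")
other = pseudonymize("198.51.100.9")

assert visit_1 == visit_2   # the same person is still singled out
assert visit_1 != other     # ...and distinguishable from others
```

This is why, under the GDPR, pseudonymized data is still treated as personal data: the link to the individual is weakened, not removed.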

However, if we go back in history and look at the definition of personal data in Directive 95/46/EC, we find the following:

‘personal data’ shall mean any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity [2];

As can be seen, this definition does not single out online identifiers, which shows that the impact of the development of information technologies is crucial and leads to changes in regulation even at the level of definitions.

If we look at the definition of personal data in the LGPD [3], we will not find special wording about online identifiers; yet the definition is very broad and by its nature includes them.

Thus, according to the LGPD, personal data means information related to an identified or identifiable natural person [3].

If we look at the CCPA [4], we will find specific clauses about online identifiers.

According to CCPA, personal information means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household:

(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, internet protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.

(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an internet website, application, or advertisement.

As can be seen, because of the impact of technologies data subjects can easily be identified, or become identifiable, by their "digital footprints", and regulators pay special attention at the definition level to classifying items like browsing history or interactions with advertisements as personal data.

With technological development, which means an increase in the speed, volume and velocity of personal information that may be processed by companies or governments, regulators all around the world pay special attention to risky categories of personal data, regulating them more strictly in order to protect data subjects.

1.2. Globalization of regulation of privacy

Information technologies make it possible to transfer and process huge amounts of personal data in seconds, using cloud computing, big data technologies and other means. An LLC may be registered in the USA, namely in California, use software development and support services from Ukrainian private entrepreneurs, and keep its servers in Brazil, while its business is focused on the EU market. For example, it may provide advertising services based on predictive analytics, which gives such companies access to a lot of data subjects' personal data.

As stated in Article 3 of the GDPR:

This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to:

  • the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or
  • the monitoring of their behavior as far as their behavior takes place within the Union [1].

Thus, the regulator is empowered to penalize legal entities registered and acting under foreign (non-EU) law if they process the personal data of EU data subjects.

The same extraterritorial approach can be seen in Brazil, where Article 3 of the LGPD states that:

This Law applies to any processing activity carried out by a natural person or legal entity governed by public or private law, regardless of the medium, the country of its headquarters or the country where the data are located, provided that:

I – the processing activity is carried out in the national territory;

II – the processing activity has the purpose of offering or supplying goods or services, or the processing of data of individuals located in the national territory.

Information technologies are affordable both for big corporations such as Google and Meta and for small companies, which can run advertising campaigns targeting the citizens of almost any country in the world with just a few clicks, using the services of the above-mentioned corporations. Governments wish to be able to protect their own citizens' rights and, as a result, adopt legislation with extraterritorial effect.

Such rules directly affect companies that wish to enter, or already operate in, the respective markets. They need to adapt their privacy programs, and even the way they do business with respect to processing personal data, to the requirements of the applicable legislation. The idea behind the extraterritorial effect of such regulations is to give data subjects control over their personal data regardless of the location, headquarters or servers of the respective controller or processor.

Nowadays, privacy programs in technology companies usually start with a data inventory, the aim of which is to understand the personal data flows: from which countries the data comes and, accordingly, what the scope of global compliance is. A single company registered in the USA may be subject to both the GDPR and the LGPD because it works with the data of the respective data subjects in real time by means of modern information technologies.
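The data inventory exercise described above can be sketched in a few lines (a simplified illustration only: the record fields and the location-to-regime mapping are assumptions, not legal advice). Each data flow records where its data subjects are located, and the union over all flows determines which regimes the company must comply with.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    system: str             # where the data is processed, e.g. "CRM"
    data_categories: list   # e.g. ["email", "IP address"]
    subject_locations: set  # countries/regions of the data subjects

# Rough illustrative mapping from subject location to privacy regime:
REGIMES = {"EU": "GDPR", "BR": "LGPD", "US-CA": "CCPA"}

def applicable_regimes(flows):
    """Union of regimes triggered by any flow's subject locations."""
    return {REGIMES[loc]
            for flow in flows
            for loc in flow.subject_locations
            if loc in REGIMES}

flows = [
    DataFlow("CRM", ["email", "name"], {"EU", "US-CA"}),
    DataFlow("Analytics", ["IP address"], {"BR"}),
]
assert applicable_regimes(flows) == {"GDPR", "CCPA", "LGPD"}
```

The example mirrors the situation in the text: a US-registered company ends up in scope of the GDPR, the LGPD and the CCPA at once, purely because of where its data subjects are.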

1.3. Rights of data subjects 

The scope of rights granted to data subjects varies depending on the applicable legislation. Yet, when discussing the rights most affected by technological development, we should focus on the GDPR as an example of one of the best pieces of legislation today.

  • Providing information “loud and clear”

One of the most fundamental requirements for controllers concerns the way information is provided to data subjects. As stated in Article 12 of the GDPR,

… the controller shall provide information to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. 

This requirement may sound somewhat generic at first glance, but it reflects one of the core principles of modern privacy regulation: giving the data subject real control over their own personal data. Such control is impossible without understanding (i) what is happening with the data and (ii) the exact moment and consequences of providing it.

One of the most interesting studies of customer journey maps in terms of "privacy touchpoints" was conducted by IPSOS in September 2021 [5].

They studied the impact of the quality of privacy touchpoints (e.g. privacy UX texts, privacy UX design) on users of websites and other online software. One of their conclusions was that privacy communication has to be (i) memorable, (ii) manageable and (iii) meaningful. They used quantitative and qualitative social science methods, including surveys, and the results showed that complying with the requirements of Article 12 of the GDPR matters not only for meeting legal obligations, but also for meeting the reasonable expectations of the users of online services.

Here we may also talk about "click fatigue". If the user is forced to click through too many forms and buttons, giving or refusing consent, at some point such a user will simply click any button without reading.

At the same time, privacy UX texts and forms may be designed to push the user into consenting to everything the controller wants to do with their personal data. Such cases are called "dark patterns" in privacy design. Consent obtained in this way may not be considered a lawful basis for processing personal data, as it complies neither with the requirements of Article 12 of the GDPR nor with the requirements for consent stated in Article 7 of the Regulation.

Thus, technologies have made it possible to process the personal data of millions of people within seconds: obtaining their consent online with just a few clicks, feeding their personal data into "black boxes", generating new information about the data subjects, testing it on them, drawing new conclusions about their habits or behavior, and then using those conclusions against them, making offers that exploit the vulnerabilities found in the analysis and thereby discriminating against them. Article 12 of the GDPR is one of the fundamental clauses that gives data subjects the right to receive privacy communication in an understandable way.

  • Right to be forgotten    

One of the most interesting rights of data subjects is stated in Article 17 of the GDPR: the right to erasure, or the right to be forgotten. In the era of search engines and internet archives, it is almost impossible to hide information that was once published on the internet. Such information may be personal data, and the data subject may object to it being displayed online when someone "googles" their name.

There are conditions for exercising this right, and there are also cases when it cannot be used, for example when the information is processed for exercising the right of freedom of expression and information (Article 17(3) of the GDPR). The line between the public interest and privacy rights has always been hard to draw, and in the era of the internet, fake news, bots and other "dark patterns" of journalism it has become even harder.

Where the data subject is entitled to exercise the right to erasure, there is an interesting obligation on the controller, stated in Article 17(2) of the GDPR, namely:

Where the controller has made the personal data public and is obliged pursuant to paragraph 1 to erase the personal data, the controller, taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data. 

This is a very interesting attempt by the regulator to make the right to be forgotten work in practice: given the way information may spread across the network, it would be almost impossible for the data subject to reach all the controllers and processors involved in processing their data.

  • Portability right 

The right to data portability is defined in Article 20 of the GDPR, namely:

The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided. 

There are conditions under which the data subject may exercise this right, namely when the processing is carried out by automated means, and depending on the legal basis on which the personal data are processed.

This right is a novelty of the GDPR, and its practical implementation remains difficult even in 2022, as companies may differ in file formats, telecommunications infrastructure and approaches to personal data usage. For example, a data subject who is a client of a streaming platform may want to switch to another operator and have their preferences, and the personal data used for content recommendations, transferred along with them. If this is done, the user will not lose their personalized experience of the service. Yet, because of the lack of a unified standard and differing data privacy practices, the practical realization of this right is still evolving, just as the technologies are.
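In practice, the "structured, commonly used and machine-readable format" requirement is often met with formats such as JSON or CSV. A minimal sketch (the field names are hypothetical) of a portability export for the streaming-platform example might look like this:

```python
import json

def export_user_data(profile: dict) -> str:
    """Serialize the data the subject has provided into machine-readable JSON."""
    return json.dumps(profile, indent=2, ensure_ascii=False)

profile = {
    "subject": "user-42",
    "preferences": {"genres": ["sci-fi", "documentary"], "language": "en"},
    "watch_history": ["title-1", "title-2"],
}

exported = export_user_data(profile)
# The receiving controller can parse the export back without loss:
assert json.loads(exported) == profile
```

The hard part, as the text notes, is not serialization but agreeing on a common schema: without a shared standard, the receiving controller may not know what "preferences" means in the exporter's model.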

  • Automated decision making and profiling 

Profiling and automated decision-making based on data subjects' personal data take place in numerous companies all over the world. Access to technologies such as big data, artificial intelligence and machine learning makes it possible for businesses to exclude the human factor from most decisions regarding data subjects.

Such decisions vary dramatically and depend on the respective business processes and needs. For example, it may be automated resume scoring to determine whether a candidate meets the requirements; a decision on whether to grant a loan to a prospective borrower, based on their financial information; or a decision about which services to promote to a particular client of an online corporation, based on their behavior on the website or in the mobile app.

Recital 71 of the GDPR describes the right not to be subject to profiling and automated decision-making, namely:

The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention.

The idea is that the data subject shall have the right to ask for human involvement in the decision-making process. Here, the question is always: what is inside the black box? The AI or algorithm used to make decisions about a specific data subject must be fair, meaning it must not use information in a way that could discriminate against the data subject. This is a question of AI ethics, and the impact of algorithms that work with personal data is hard to overestimate.
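The safeguard described in Recital 71, that no decision with legal effect should be made solely by a machine, can be sketched as a human-in-the-loop gate (the threshold, score and function names are illustrative assumptions, not a real credit model):

```python
def credit_decision(score: float, human_review_requested: bool) -> str:
    """Automated scoring with an escape hatch for human intervention.

    If the data subject exercises the right to human involvement,
    the algorithm's output becomes a recommendation, not a decision.
    """
    automated = "approve" if score >= 0.7 else "refuse"
    if human_review_requested:
        return f"refer to human reviewer (model suggests: {automated})"
    return automated

assert credit_decision(0.9, human_review_requested=False) == "approve"
assert "human reviewer" in credit_decision(0.3, human_review_requested=True)
```

The design point is that the request for human intervention changes the status of the output: the model still runs, but its result is demoted to an input for a person rather than a final, legally effective decision.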

  • Other privacy rights of data subjects

Other data subject rights, such as the right to restriction of processing or the right to object, as defined in the GDPR, have been influenced by technological development as well, since their practical realization has shifted mostly online.

Thus, with technological development, which means the merging of the "online" and "offline" worlds, more and more of data subjects' activity is carried out using information technologies. Even where services are provided offline, as in hotels or cafes, the companies providing them use online systems (ERP, CRM, etc.) to work with data subjects' personal data. This means data subjects must be able to exercise their rights effectively online, and companies should verify that users can exercise each privacy right online in a convenient manner.

For example, withdrawing consent should be as easy as giving it, and so on.
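This symmetry requirement (Article 7(3) of the GDPR: withdrawal shall be as easy as giving consent) can be expressed as a consent register where both operations are single, equivalent calls; the class and method names below are assumptions for illustration:

```python
class ConsentStore:
    """Consent register where giving and withdrawing are equally simple."""

    def __init__(self):
        self._consents = {}  # (subject, purpose) -> bool

    def give(self, subject: str, purpose: str):
        self._consents[(subject, purpose)] = True

    def withdraw(self, subject: str, purpose: str):
        # One call, same effort as giving consent: no extra hoops.
        self._consents[(subject, purpose)] = False

    def has_consent(self, subject: str, purpose: str) -> bool:
        return self._consents.get((subject, purpose), False)

store = ConsentStore()
store.give("bob", "marketing")
assert store.has_consent("bob", "marketing")
store.withdraw("bob", "marketing")
assert not store.has_consent("bob", "marketing")
```

A UI built on such a store would violate the spirit of Article 7(3) if, say, giving consent were one click but withdrawing it required a support ticket; the API makes the parity explicit.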

1.4. Privacy concepts and roles

  • Data protection impact assessment

As technologies constantly develop and their business applications take very different forms, it is impossible for regulators to define every exact combination of technologies companies may implement in their data processing and to provide corresponding regulation.

Moreover, each application of new technologies in data processing may be almost unique, and it is not easy to evaluate the risks of using new technical algorithms for the purpose of deriving benefits from data subjects' personal data. The EU regulator has provided companies and authorities in such situations with an instrument called the data protection impact assessment (DPIA). The details of this tool are stated in Article 35 of the GDPR; in particular, paragraph 1 states that:

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.

So, the regulator intends to protect data subjects from the risks of implementing new technologies by making it obligatory for controllers and processors to conduct a risk analysis before implementing planned changes to data processing.

One of the cases where a data protection impact assessment is obligatory is defined in Article 35(3)(a) of the GDPR, namely:

A data protection impact assessment referred to in paragraph 1 shall in particular be required in the case of: (a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;

Here we can see that the above-mentioned case includes, inter alia, triggers such as profiling and automated processing. As discussed earlier, the regulator pays special attention to protecting data subjects from biased and discriminatory decisions made solely by an algorithm, by giving them the right not to be subject to such decisions and to demand human intervention.

In the DPIA we see the regulator's attempt to prevent bias and discrimination against data subjects based on exploiting vulnerabilities discovered through automated study of their personal data: habits, financial history, education, etc. This attempt is made not by granting rights to data subjects, but by imposing obligations on companies in the defined cases.

In Article 35(7) of the GDPR, the regulator defines the minimum requirements for a DPIA, namely:

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  • an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.

These requirements mean that the company must investigate the situation and the consequences of implementing the new technology with respect to the risks it poses to data subjects. An integral part of the DPIA concerns the measures taken to protect data subjects' rights and to demonstrate compliance. Failure to conduct a DPIA, or conducting it incorrectly, may itself result in non-compliance with the GDPR.
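The four minimum elements of Article 35(7) lend themselves to a simple completeness check that a privacy team might run over its DPIA records; the field names and sample record here are assumptions for illustration, not a compliance tool:

```python
REQUIRED_DPIA_ELEMENTS = [
    "processing_description",  # systematic description and purposes
    "necessity_assessment",    # necessity and proportionality
    "risk_assessment",         # risks to rights and freedoms
    "mitigation_measures",     # safeguards and security measures
]

def missing_elements(dpia: dict) -> list:
    """Return the Article 35(7) elements absent or empty in a DPIA record."""
    return [e for e in REQUIRED_DPIA_ELEMENTS if not dpia.get(e)]

draft = {
    "processing_description": "Face recognition at station entrances",
    "risk_assessment": "Risk of misidentification and function creep",
}
assert missing_elements(draft) == ["necessity_assessment", "mitigation_measures"]
```

A DPIA with non-empty text in every field is not automatically adequate, of course; the check only enforces that no mandatory element has been skipped entirely.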

Thus, the regulator responds to the challenges posed by new technologies by acting on both companies and data subjects, trying to create scenarios in which data subjects' rights are protected while companies remain able to innovate.

It is interesting that, in the cases defined by the GDPR, a data protection officer has to be involved in conducting the DPIA. One of the triggers for designating a data protection officer (DPO) is the case defined in Article 37(1)(b) of the GDPR, namely:

the core activities of the controller or the processor consist of processing operations which, by virtue of their nature, their scope and/or their purposes, require regular and systematic monitoring of data subjects on a large scale; 

This is the case for most "online" corporations and for companies that treat personal data as a business asset. Here the use of technologies is obvious, and the involvement of a DPO in assessing the risks of using such technologies adds one more layer to a safe legal environment for the market in the context of personal data use.

  • Data protection by design and by default

As the development of new software is a literally unstoppable process, it is important to ensure that new developments take into account the privacy requirements applicable to such systems. If, for example, a planned mobile application will be used by millions of children all around the world, special attention should be paid to a high level of personal data protection through technical measures and, at the same time, to following the principle of data minimization.

The GDPR defines this concept as data protection by design and by default in Article 25. In addition, Recital 78 of the GDPR gives more information about appropriate technical and organizational measures in the development of technological products, stating, inter alia:

When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfill their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfill their data protection obligations. 

The whole idea of data protection by design and by default, in the context of technology's impact on privacy, is to make the market (the business) put privacy first when developing new features and products. Fostering such a privacy culture in companies, where privacy programs have a real impact on new developments, means better protection of data subjects' rights without limiting companies' capacity to innovate.
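Data protection by default (Article 25(2) of the GDPR) means that the most privacy-protective setting applies without any action by the user. A minimal sketch, where the setting names are hypothetical: every optional processing purpose starts disabled, and enabling one requires an explicit opt-in.

```python
# Privacy by default: every optional processing purpose starts disabled;
# the user must actively opt in, never opt out.
DEFAULT_SETTINGS = {
    "personalized_ads": False,
    "analytics_sharing": False,
    "location_tracking": False,
}

def new_account_settings(overrides: dict = None) -> dict:
    """New accounts get the strictest defaults; opt-ins are explicit."""
    settings = dict(DEFAULT_SETTINGS)
    settings.update(overrides or {})
    return settings

fresh = new_account_settings()
assert not any(fresh.values())  # nothing is tracked until the user opts in

opted = new_account_settings({"personalized_ads": True})
assert opted["personalized_ads"] and not opted["location_tracking"]
```

The design choice worth noting is the direction of the update: consent is layered on top of restrictive defaults, so a bug that drops the overrides fails safe, toward less processing rather than more.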


The impact of technological development on all aspects of modern life is dramatic. We live in a world of shifted paradigms, where we provide personal data online, receive services online and "live online"; yet "online" is just a way of transmitting information. At the same time, "online" is very dynamic, and as data subjects we face cases where, only seconds after providing some personal information, we are, in the best case, viewing customized advertising, while in many others our personal information may be used against us.

It is hard to single out the individual technologies that impact our privacy most; it is always a combination of different technical decisions serving some business aim. For example, banks may analyze our transactions and share that information with advertisers, who will analyze it and serve us customized advertising. Is it legal? It depends on the case, but either way, in a world where technology develops fast, we as data subjects wish to keep control of our personal data. We want to know how it is used and how it affects us at the end of the day.

Regulators may pay special attention to particular technologies whose use may result in violations of data subjects' privacy rights.

It is impossible to draw up a regulation stating specific rules for each technology and each of its possible uses. Regulators adapt, and one way of adapting is to state the principles, logic, key procedures and safeguards that protect the data subject, and then provide guidance for specific cases. When we talk about the GDPR as one of the most interesting pieces of legislation, we should also take into account the guidelines of the European Data Protection Board (EDPB). The EDPB replaced the Article 29 Working Party and continues to help businesses from all over the world with questions regarding best practices in data processing.

Below, we look at some interesting points in the guidelines provided by the EDPB regarding the use of new technologies and privacy.

2.1. Social media and customized advertising 

Who doesn't have an account on one of the social networks? Facebook, Twitter, TikTok and others capture our attention day after day. We scroll the news, we click "like", we watch some types of content longer than others. We create data about our habits, interests and lifestyle, data which may be used to build an advertising banner designed specifically for us. It may sound creepy, but combining different pieces of information about a data subject and feeding them into a black box called "AI" may lead to microtargeting. That, in turn, can put the data subject into a "bubble", where they no longer see the generic offers they otherwise would, because of the specific tactics of advertisers.

The EDPB has provided Guidelines on the targeting of social media users.

This guidance deserves special attention, as social media targeting has become something of a showcase of the impact of technologies on privacy today.

The EDPB identifies the following actors in this process: social media providers, their users, targeters and other actors that may be involved in the targeting process.

One of the risks of social media targeting, as defined by the EDPB, is as follows:

Targeting of social media users may involve uses of personal data that go against or beyond individuals’ reasonable expectations and thereby infringes applicable data protection principles and rules. For example, where a social media platform combines personal data from third-party sources with data disclosed by the users of its platform, this may result in personal data being used beyond their initial purpose and in ways the individual could not reasonably anticipate. 

The profiling activities that are connected to targeting might involve an inference of interests or other characteristics, which the individual had not actively disclosed, thereby undermining the individual’s ability to exercise control over his or her personal data. Moreover, a lack of transparency regarding the role of the different actors and the processing operations involved may undermine, complicate or hinder the exercise of data subject rights.

This leads us to the point that, because of the complexity of remarketing, microtargeting and targeting in social media, data subjects may feel a lack of control over their personal data.

For example, a hypothetical user, Bob from Vilnius, googled a toaster, then visited some of the top-3 websites in Google's results; on some of them he put a toaster in the basket, and on one of them he actually ordered it, receiving delivery two days later. Now Bob, eating a toast, wants to catch up on his friends' news in his favorite social network and sees several ads. One is from the website where he almost bought a toaster, inviting him to come back and finish the purchase. Another is from the website where he actually bought the toaster, offering him a discounted microwave. The third is from a company whose website he never visited, offering him the best bread for toast.

All of these ads may be perfectly legal, and this case is ordinary rather than paranoid, as Bob saw ads based on only one episode of his browsing history. Technologies such as the Facebook pixel and Facebook Custom Audiences may have been used here (assuming Bob saw the ads on Facebook). The question is: did Bob give the respective consents to cookies on those websites? Did those websites have the respective notices and terms in their privacy policies? This list of questions is far from complete.
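Legally, the cookie-consent question in Bob's case boils down to gating the tracking code on a prior consent signal. A simplified server-side sketch, where the function name and cookie fields are assumptions rather than any real platform's API:

```python
def should_fire_tracking_pixel(consent_cookie: dict) -> bool:
    """Fire third-party tracking only after explicit marketing consent.

    With no recorded consent flag the page must load without the pixel;
    strictly necessary cookies are the only ones allowed by default.
    """
    return consent_cookie.get("marketing_consent") is True

assert should_fire_tracking_pixel({"marketing_consent": True})
assert not should_fire_tracking_pixel({})  # no banner interaction yet
assert not should_fire_tracking_pixel({"marketing_consent": False})
```

Note the strict `is True` check: an absent or malformed value is treated the same as refusal, so the tracking pixel never fires by accident before the user has made a choice.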

Bob may feel that corporations are watching him like some kind of Big Brother, and that would not be far from the truth. On the other hand, social media may simply be a tool for companies to reach a given data subject using their personal data, even when they have no legal basis to do so.

One more privacy risk identified in the EDPB guidance is the following:

The possibility of discrimination and exclusion. Targeting of social media users may involve criteria that, directly or indirectly, have discriminatory effects relating to an individual’s racial or ethnic origin, health status or sexual orientation, or other protected qualities of the individual concerned. For example, the use of such criteria in the context of advertising related to job offers, housing or credit (loans, mortgages) may reduce the visibility of opportunities to persons within certain groups of individuals. The potential for discrimination in targeting arises from the ability for advertisers to leverage the extensive quantity and variety of personal data (e.g. demographics, behavioral data and interests) that social media platforms gather about their users.11 Recent research suggests that the potential for discriminatory effects exists also without using criteria that are directly linked to special categories of personal data in the sense of Article 9 of the GDPR.

Thus, here we see the same concern as with automated decision-making based on profiling. The personal data of a data subject may be gathered in different ways (sometimes without the respective legal basis), combined, and used against that data subject. Behind all of this stands the purpose of some human being or company. The technology itself merely makes it possible to work with personal data, derive insights and use them; how they are used is determined by the controller.
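The inference concern described above can be sketched in a few lines: interests are derived from browsing behavior even though the individual never actively disclosed them. The category-to-interest mapping and the visit threshold below are invented for illustration; real profiling systems are far more elaborate.

```python
from collections import Counter

# Hypothetical mapping from visited site categories to inferred interests.
CATEGORY_TO_INTEREST = {
    "kitchen-store": "home appliances",
    "bakery-blog": "baking",
    "running-shop": "fitness",
}

def infer_interests(visited_categories: list[str], threshold: int = 2) -> list[str]:
    """Infer an interest once the user has visited enough sites in a
    category. None of these interests were actively disclosed."""
    counts = Counter(CATEGORY_TO_INTEREST[c]
                     for c in visited_categories if c in CATEGORY_TO_INTEREST)
    return sorted(i for i, n in counts.items() if n >= threshold)

profile = infer_interests(["kitchen-store", "kitchen-store", "bakery-blog"])
```

Even this toy version shows why transparency matters: the data subject sees only the resulting ad, not the inferred profile behind it.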

2.2. Cloud computing services and privacy

There are many different regulations all over the world regarding cloud services. In some cases it is prohibited to keep the personal data of citizens of a given country on servers outside that country; in other cases, transferring personal data outside a region requires explicit consent from the data subject, and so on.
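Such data-residency rules are often enforced in software as a simple policy check before data is written to a cloud region. The policy table below is a hypothetical illustration (the region names mimic common cloud naming but stand for no specific provider's rules):

```python
# Hypothetical policy table mapping a data subject's jurisdiction to the
# cloud regions where storage of their personal data is allowed.
ALLOWED_REGIONS = {
    "EU": {"eu-west-1", "eu-central-1"},   # e.g. EU data stays in the EU
    "RU": {"ru-central-1"},                # e.g. strict localization rule
}

def storage_allowed(subject_jurisdiction: str, target_region: str) -> bool:
    """Return True only if the target region is permitted for the data
    subject's jurisdiction; default-deny when no policy is defined."""
    allowed = ALLOWED_REGIONS.get(subject_jurisdiction)
    return allowed is not None and target_region in allowed
```

The default-deny choice reflects the compliance posture described in this section: absent a known legal basis for the transfer, the data stays put.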


The development of cloud computing and cloud services as we know them today gave businesses the opportunity to host data all over the world and access it in seconds. When it comes to privacy issues and regulations, most data storage companies have respective privacy policies and privacy documents which define them as processors of personal data. According to recital 81 of the GDPR,

To ensure compliance with the requirements of this Regulation in respect of the processing to be carried out by the processor on behalf of the controller, when entrusting a processor with processing activities, the controller should use only processors providing sufficient guarantees, in particular in terms of expert knowledge, reliability and resources, to implement technical and organizational measures which will meet the requirements of this Regulation, including for the security of processing.

Thus, cloud providers have to achieve the respective level of data protection and must be able to demonstrate it. Data centers usually build their privacy programs around this processor status and the adequate technical and organizational measures needed to assure a high level of personal data protection. Hence, as processors act only under the instructions of the controller regarding the processing of the respective personal data, the data processing agreements of cloud providers are usually very strict and non-negotiable.
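One concrete technical measure in the sense of Article 32 GDPR is pseudonymization: replacing direct identifiers with a keyed hash so that the stored record alone no longer identifies the person, while the controller, holding the key separately, can still link records. A minimal sketch (the key and record are illustrative; real systems need proper key management):

```python
import hashlib
import hmac

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    Without the key, the output cannot be linked back to the person;
    with the key, the controller can reproduce the same pseudonym."""
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"key-held-separately-by-the-controller"  # illustrative only
record = {"email": "bob@example.com", "order": "toaster"}
stored = {"subject_id": pseudonymize(record["email"], key),
          "order": record["order"]}  # the raw email never reaches the processor
```

Using HMAC rather than a plain hash matters: without the secret key, an attacker cannot simply hash a guessed email and compare.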

2.3. Privacy tech as a new domain

Day after day, new privacy laws are adopted all over the world. Some have extraterritorial effect, some have specific provisions. There are regulations, like the GDPR, which apply to a whole region, while in the USA we may soon see a specific law in each state. It is hard for global companies to comply with all such laws at once and to keep up with their amendments.

Building a privacy program is a tough task, and implementing it in practice is even harder. When the GDPR came into force, law firms all over the world helped with GDPR compliance as they understood it, drafting various policies, procedures and data maps.

That said, the privacy domain is not only about lawyers; it involves engineers, privacy champions and C-level managers as well. There also has to be a person to lead the privacy program, such as a chief privacy officer, alongside a data protection officer, whose designation may be mandatory. Another challenge is software for privacy governance and privacy management.

Can using such software, that is, the services of privacy tech startups, lead to absolute compliance with privacy laws? It depends. There is a lot of "tactical" software that may help, for example, with building a data map, implementing a "GDPR-compliant" cookie banner or managing DPAs with contractors. On the other hand, there is software that tries to help with privacy compliance in general.
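The "data mapping" such tactical tools build corresponds to the records of processing activities required by Article 30 GDPR. A minimal sketch of such a record, with an invented example entry (the field names loosely follow Article 30 but are a simplification, not a complete legal checklist):

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    """A minimal record-of-processing entry, loosely following the fields
    GDPR Article 30 requires controllers to document."""
    purpose: str
    categories_of_data: list[str]
    categories_of_subjects: list[str]
    recipients: list[str] = field(default_factory=list)
    retention: str = "unspecified"

registry = [
    ProcessingRecord(
        purpose="newsletter delivery",
        categories_of_data=["email address"],
        categories_of_subjects=["subscribers"],
        recipients=["email service provider"],
        retention="until consent is withdrawn",
    ),
]

# A tactical tool can then answer simple compliance questions, e.g. which
# processing activities involve an external recipient:
external = [r.purpose for r in registry if r.recipients]
```

Even this toy registry shows the value and the limit of such tools: they surface what is documented, but documenting it correctly remains a human task.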

The author of this essay believes that such software may help in reaching privacy compliance, but it remains just a tool: one part of the path to GDPR compliance, and an instrument to be used wisely in the hands of the chief privacy officer.

Thus, as can be seen, not only does technology affect privacy by stimulating new regulations and requirements for market players; privacy in turn stimulates the development of new software technologies.


  • It is impossible to stop the development of new technologies, yet their usage may be regulated. Privacy as a fundamental right is governed by specific rules which put innovative businesses working with personal data into a framework aimed at protecting data subjects' rights without harming innovation as a concept.
  • The impact of technology on privacy comes from a mix of different innovations, used differently in different cases worldwide, which shapes privacy legislation to meet the new challenges those innovations pose.
  • Because of technology development, many new concepts and roles were introduced into privacy regulations, and they keep being updated, just like the technologies themselves.
  • The realization of privacy rights in practice, in real time, has become the key litmus test in determining the privacy compliance of the respective controller or processor.

