GDPR: Artificial Intelligence’s Major Blockage

The data protection and privacy law, which came into effect across the EU on 25 May 2018, may have a great impact on companies building machine learning systems. We know that in order to build these systems, companies need large amounts of data, but big data stands in fundamental tension with core data protection principles such as data minimisation and purpose limitation.

 

According to the EU General Data Protection Regulation, companies must meet three specific transparency requirements (along with other suitable safeguards) in order to better inform data subjects about processing within the meaning of Article 22(1) and its consequences:

 

  • inform the data subject of the purpose of data storage;
  • provide meaningful information about the logic involved; and
  • explain the significance and envisaged consequences of the processing.


The issue is how far this transparency provision will be interpreted and whether companies have anything to fear from it.

 

AI is omnipresent: from the analysis of large and complex data sets such as genome data in medical research, through predictive policing in the police and security sector, to digital voice assistants such as Apple’s Siri or Amazon’s Alexa. Even fitness apps increasingly rely on AI and machine learning in order to offer each user a tailor-made, optimized training plan.

 

This trend has not gone unnoticed by politicians. After the European Commission presented a European approach to artificial intelligence at the end of April, the parliamentary groups followed on 26 June 2018 to discuss recommendations for action on the handling of artificial intelligence, especially in legal and ethical terms, to be delivered by the summer break in 2020. The European Commission’s AI concept also provides for extensive research and development measures with the aim of promoting AI innovation in Europe.

 

Even if the use of AI does not necessarily involve the evaluation of personal data in every application, in areas such as banking and insurance it is also well suited to the comprehensive evaluation of personal characteristics (so-called “profiling” / “scoring”). The European data protection authorities give the following example to distinguish simple classification from profiling:

 

“a business may wish to classify its customers according to their age or gender for statistical purposes and to acquire an aggregated overview of its clients without making any predictions or drawing any conclusion about an individual. In this case, the purpose is not assessing individual characteristics and is therefore not profiling.”
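
As a rough illustration of the distinction drawn in that example, the following minimal sketch contrasts an aggregated statistical overview with an individual-level prediction. It uses Python with pandas; the column names and data are hypothetical and not taken from the authorities’ guidance.

```python
# Minimal sketch (hypothetical data and column names): aggregated
# classification for statistics vs. individual-level prediction.
import pandas as pd

customers = pd.DataFrame({
    "age_band": ["18-29", "30-44", "30-44", "45-59", "18-29"],
    "gender":   ["f", "m", "f", "m", "f"],
    "spend":    [120.0, 340.5, 89.9, 410.0, 55.2],
})

# Aggregated overview of clients: counts and average spend per group.
# No prediction or conclusion is drawn about any individual, so this
# would fall outside the notion of profiling described above.
overview = customers.groupby(["age_band", "gender"]).agg(
    n_customers=("spend", "size"),
    avg_spend=("spend", "mean"),
)
print(overview)

# By contrast, attaching a predictive score to each individual row
# (e.g. an assumed churn or creditworthiness score) would assess
# personal aspects of a natural person and would amount to profiling.
```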

 

It is therefore surprising that the EU Commission’s concepts regarding the data protection assessment of AI use have so far remained rather vague for companies.

 

Regardless of the admissibility of a particular procedure, these transparency obligations are often viewed as highly critical in the light of the protection of trade and business secrets. The reason for this is that the data subject must also be provided with “meaningful information about the logic involved”, and it is still unclear how extensive this information has to be. The key question is whether the controller, i.e. the company using the AI, is only required to describe the principles and essential elements underlying an automated decision-making process, or whether the disclosure of calculation formulas, parameters and algorithms can actually be demanded.

 

In any case, in the view expressed here, no obligation to disclose formulas and algorithms follows from the GDPR. The transparency provisions of the GDPR only require “meaningful information about the logic involved” in automated decision-making, not the actual publication of that logic. Accordingly, the controller owes only a description of the principles underlying an automated decision-making process, that is, of the fundamental rules by which an algorithm reaches its decisions. The purpose of the GDPR obligations is therefore not (as is often claimed) to enable the data subject to recalculate the results of an automated decision-making process, for example his or her “score”; that would require the specific calculation formula and its parameters. Rather, in the context of the transparency provisions, for example within a privacy policy, the data subject should merely be given the opportunity to learn in advance to what extent his or her data is processed by a particular service provider and, if appropriate, to look for alternatives.
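
To make the contrast between description and disclosure more concrete, the sketch below pairs a plain-language account of a decision logic, of the kind a privacy policy might contain, with the concrete calculation that, on the view expressed here, need not be revealed. The scoring factors, weights and wording are purely hypothetical and serve only as an illustration.

```python
# Hypothetical illustration only: the factor names, weights and the
# description text are invented, not taken from the GDPR or any real system.

# A descriptive account of the principles behind an automated decision,
# as it might appear in a privacy policy ("meaningful information about
# the logic involved").
LOGIC_DESCRIPTION = (
    "Your score is based on your payment history, the length of your "
    "customer relationship and your current outstanding balance. "
    "Timely payments improve the score; a high outstanding balance lowers it."
)

# The concrete calculation, including its weights, which on the view
# expressed above need not be disclosed and would typically be a trade secret.
def score(payment_history: float, relationship_years: float, balance: float) -> float:
    return 0.6 * payment_history + 0.25 * relationship_years - 0.15 * balance
```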

 

This view is not contradicted by the requirement that the information provided be “meaningful”. On the contrary, for the average user a comprehensible description of the underlying processes may offer greater added value than the disclosure of the mathematical and technical logic itself. Indeed, only a generally understandable description can meet the requirements of the GDPR, which demands that all information be provided in an intelligible form and in “clear and plain language”.

 

In summary, no real danger to the protection of know-how lurks in the GDPR. Rather, its admissibility requirements and transparency obligations for the use of automated decision-making are consistent and appropriate: human individuals should not become mere playthings of machines. If machines make automated decisions without being checked by professionals for accuracy, the results can also be flawed.
