On May 25, 2018, the new European privacy law, the General Data Protection Regulation (GDPR), came into effect. This law governs the collection, storage, use, sharing and disposal of personal data belonging to residents of member states of the European Union (EU) and countries in the European Economic Area (EEA). The law has far-reaching impact through its extraterritorial scope: organizations outside the EU and EEA that process the personal information of residents in these areas may also be subject to the requirements of the GDPR.
Organizations around the world frantically updated their privacy policies even as they grappled with operationalizing the regulation. It was boom time for privacy law firms everywhere. The key factor that caught the attention of firms and propelled them to put in place policies and procedures to comply with the law was the hefty fine for serious violations: up to €20 million, or 4% of the previous financial year’s worldwide annual revenue, whichever is higher. Since then, the GDPR has become the gold standard for data protection laws around the world.
One of the key requirements of the GDPR is the determination of the legal basis for the collection and processing of the personal data of individuals.
Additionally, the GDPR calls out a special category of personal data that includes the following:
- racial or ethnic origin,
- political opinions,
- religious or philosophical beliefs,
- trade union membership,
- genetic data,
- biometric data for the purpose of uniquely identifying a natural person,
- data concerning health,
- data concerning a natural person’s sex life, and
- sexual orientation.
These data are more sensitive in nature, and the GDPR requires firms to exercise greater care: they must ensure that the collection and processing of such data are absolutely necessary for the purposes they have identified, and they must implement higher levels of protection.
Furthermore, the GDPR has rules that set parameters for the use of automated decision making, where a decision is made solely by automated means without any human involvement. The risks of automated decision making include the potential for bias and discrimination against individuals, and decisions that produce potentially damaging legal effects concerning an individual. The GDPR requires organizations to allow individuals to object to automated decision making, to request human intervention, or to challenge a decision. An example of automated decision making is the denial of social benefits based on profiles or other status. Information collected by an organization may not be accurate, and individuals should have the right to access and correct it.
Genesis of GDPR
The GDPR is the most significant enhancement of previous European privacy laws, which have evolved over time. The origin of these laws can be traced back to World War Two, when Adolf Hitler, through the Nazi party, sought to create a perfect race by removing what he deemed to be “imperfections”: individuals who were a threat to the purity of the Aryan race. These included minorities treated as “enemies of the state”, such as the Slavs (Eastern Europeans), gypsies, homosexuals, the disabled, prostitutes, Jehovah’s Witnesses, alcoholics, pacifists, beggars, hooligans, criminals and the Jews.
Driven by the conviction that the Aryans were the “master race”, the Nazis fanatically pursued their eugenics policies of eliminating individuals with disabilities or social problems in order to prevent their “degenerate genes” from being passed down to future bloodlines. This was done through sterilization, euthanasia and deportation to concentration camps.
A more efficient system
To carry out its vision of creating this perfect race, the Nazi government first needed to conduct a national census to profile individuals, with the aim of identifying the targeted groups mentioned earlier, and to do so efficiently. With the help of tech giant IBM’s subsidiary in Germany, Deutsche Hollerith-Maschinen GmbH (Dehomag), the Nazi government used IBM’s punch card technology to profile what would under today’s GDPR be considered sensitive data, such as racial background and religious belief. This was done by census workers going door to door filling out punch cards, which were then sorted using Dehomag’s tabulation machines. These machines were also placed at concentration camps, with the prisoners’ backgrounds and causes of death coded:
Background             Cause of Death
 3  Homosexual          3  Natural
 9  Anti-social         4  Execution
12  Gypsy               5  Suicide
 8  Jew                 6  Gas Chambers
From that project, Hitler discovered that there were about two million Jews residing in Germany, and thus began his infamous “Final Solution to the Jewish Question”, resulting in the persecution and deaths of about six million European Jews by the end of the Holocaust.
Tabulation of census personal data was not the only means of profiling. More manual methods were also employed, such as tax returns, synagogue and community membership lists, parish records (for converted Jews), police registration forms, the questioning of relatives, information provided by neighbors and municipal officials, individual identity papers and local intelligence networks. Hence, the Nazis would have carried out their vision of eliminating those they believed to be sub-human in any case, even without IBM’s technology. However, the use of automation allowed the Nazi party to scale its monumental operations at a much faster rate, resulting in one of the worst atrocities in human history.
Never Again!
For the Jews who were identified, persecution started with social and employment discrimination, which included the Nazi party’s boycott of Jewish businesses, removal of German citizenship, and exclusion from professions such as medicine, teaching, law and the civil service; Jewish children were also barred from attending school. Anti-Semitic laws were passed outlawing marriage and sexual relations between Jews and Germans, and Jews were stripped of all civil and political rights. After the assassination of a German diplomat in Paris by a Jew, persecution of the Jews turned violent: Jewish properties were destroyed and Jews were physically assaulted.
Jews were not the only group who suffered under Nazi rule. Those with physical or mental disabilities or hereditary diseases were subjected to sterilization, and later to euthanasia by gassing. Other minority groups considered inferior were sent to concentration camps, where they died of starvation or were worked to death.
When Hitler first came to power, he also banned trade unions, deeming them threats because of their influence over workers, and replaced them with his own German Labour Front.
When World War Two ended, the world declared that the atrocities committed by the Nazis must never be repeated, and the Universal Declaration of Human Rights was adopted by the United Nations. Included among its 30 Articles is Article 12, which states: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks.”
In 1970, the West German state of Hesse introduced what is considered the country’s first modern data privacy legislation, governing public sector data. In 1977, the Federal Data Protection Act was passed to protect residents’ personal data “against abuse in their storage, transmission, modification and deletion.” Today, Germany has one of the strictest interpretations and applications of the GDPR.
All’s well that ends well?
Today, automation has gotten better. A lot “better”.
Dehomag’s tabulation machines reportedly could sort about 7 cards per second. At the time of this writing, a modern desktop computer can process over 100 billion instructions per second.
And because of this astonishing processing power, computers are being “trained” to identify patterns and make automated decisions with minimal human intervention, by feeding huge amounts of diverse data (hence the term “big data”) into algorithms for analysis and inference. “Machine learning”, as it is known, is the branch of artificial intelligence that aims to automatically improve algorithms over time through learning from experience.
Applications of machine learning include automated profiling of individuals for employment matching or assessments and for creditworthiness. Machine learning is also applied to facial recognition, whose uses include providing secure access to our devices. However, facial recognition is also used by governments for surveillance with the purpose of maintaining public safety. At the time of this writing, the Metropolitan Police in the UK uses facial recognition CCTV to identify criminals for crime prevention. In China, the government has rolled out its social credit system, where residents are assigned scores for their social behavior. Any anti-social behavior, such as jaywalking, excessive purchase of alcohol or publicly making negative comments about the government, will result in deductions from their scores. Scores below certain levels result in the loss of certain freedoms or rights, such as restrictions on travel by train or plane, or not being able to enrol their children in private schools.
Ethical challenges of the use of AI as described above include:
- Algorithmic bias as a result of biased inputs, inadvertent or otherwise, by programmers
- Lack of transparency about the use of AI and the criteria used for automated decisions
The GDPR requires a legal basis before data can be processed by government agencies or companies. Though explicit consent is the most common legal basis used, it is not the only one. Other legal bases include the performance of a task carried out in the public interest. The questions that arise are (a) who defines what constitutes the public interest, and based on what criteria, and (b) who decides what data will be collected and how they will be used. While this would seem, on the surface, not a concern in a democratic society, especially where public trust is high, what about a society governed by a tyrannical totalitarian regime?
“This (The Declaration of Human Rights) was a bold and clear commitment that power would no longer serve as a cover to oppress or injure people, and it established equal rights of all people to life, liberty, security of person, equal protection of the law and freedom from torture, arbitrary detention or forced exile.” – Former US President Jimmy Carter