GDPR: Privacy by design

Privacy by design, the practice of building data protection into technology from the outset, is a central area that GDPR seeks to strengthen. Organizations have strong incentives to test their security controls, for example through penetration testing, because GDPR now requires regular reporting and allows the European Union to impose heavy fines for mishandling consumers’ personal data. Privacy by design also eases the burden of meeting GDPR’s requirements, because data processors will have already built compliant data-processing procedures into their technology. It achieves compliance by running risk assessments before decisions about data security are made. Risk assessments can be performed in many ways, but they all demand the same periodic and thorough approach: they should be carried out often and proactively. Being proactive means being responsible with the valuable data your organization holds. Critical cyber infrastructure can never be completely secured against attackers, but your organization can greatly mitigate cyber threats by staying in control of the situation rather than responding to a crisis already under way.
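One common way to make such risk assessments periodic and repeatable is a likelihood-times-impact score. The sketch below is purely illustrative: the 1–5 scales, the thresholds, and the level names are assumptions, not GDPR terminology, and a real assessment methodology would define its own.

```python
# Illustrative only: a minimal likelihood-x-impact risk scoring sketch.
# The scales, thresholds, and level names are assumptions for this example.

def risk_score(likelihood: int, impact: int) -> int:
    """Score a risk on 1-5 likelihood and 1-5 impact scales."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be between 1 and 5")
    return likelihood * impact

def risk_level(score: int) -> str:
    """Bucket a score into a coarse level for prioritising remediation."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Example: an unpatched public-facing server (likely, serious impact).
score = risk_score(likelihood=4, impact=5)
print(score, risk_level(score))  # prints: 20 high
```

Running the same scoring exercise on a schedule, and whenever a process changes, is one way to turn "proactive" from a slogan into a routine.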

Because of GDPR, companies must now build compliant procedures into process development from the very beginning. They must analyze risk far more actively than before, understanding the implications of their data-processing procedures. To simplify the planning and implementation of privacy by design, a seven-step process has been developed.

These steps are:

  • Be proactive, not reactive;
  • Privacy should be the default setting;
  • Privacy should be embedded into the design;
  • Full functionality: positive-sum, not zero-sum;
  • End-to-end security;
  • Visibility and transparency;
  • Respect for user privacy.

Some of the above steps are self-explanatory; others require explanation. Step three requires processors to plan carefully, implementing security measures that can adequately handle given levels of risk while limiting the negative impact on the rights and freedoms of data subjects. End-to-end security is a blanket term emphasizing the holistic approach organizations should take: they must continually check and update their security systems through testing, because, as step one admonishes, organizations must be proactive, taking preventive measures in addition to reactive ones.

Penetration testing is the simulated attack on, and scanning of, computer systems to identify vulnerabilities so that they can be corrected before a real attack occurs. Penetration tests should be conducted at least once a year, and ideally several times a month. Security controls, from preventive to compensating, should be checked regularly. Penetration testing has multiple facets, such as vulnerability scanning, which can be automated. Vulnerability scanning verifies that systems are up to date and that security software has been patched. Internal infrastructure and all other critical infrastructure should likewise be continuously monitored and tested for vulnerabilities. Systems such as firewalls and web filters should be tested for susceptibility to malware. The end-to-end life cycle of data should also be examined to reduce security risks. GDPR recommends encryption and pseudonymisation when implementing privacy by design, but it leaves it to organizations to determine which measures to take.

Step two, privacy as the default setting, should not be overlooked. Data controllers must take appropriate technical and organizational measures to ensure that personal data is used only for its specified purpose. Data collectors must do their utmost to minimize both the amount of data collected and the length of time it is stored. In addition, all settings should default to the most privacy-friendly level, so that users must consciously change a setting to make it less privacy-friendly.
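Pseudonymisation, one of the measures GDPR names, can be sketched in a few lines. This is a minimal illustration, not a complete implementation: it uses a keyed HMAC so that the same identifier always maps to the same token (records stay linkable) while the original value cannot be recovered without the secret key, which must be stored separately from the pseudonymised records. The key value shown is a placeholder.

```python
# A minimal sketch of pseudonymisation using a keyed HMAC.
# The key below is a placeholder; in practice it would come from a
# secrets manager and be stored apart from the pseudonymised data.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-key-from-your-secrets-manager"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a
    stable pseudonym: the same input always yields the same token,
    so records can still be linked, but the original value is not
    recoverable without the key."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "purchases": 3}
safe_record = {"user": pseudonymise(record["email"]),
               "purchases": record["purchases"]}
```

Note that pseudonymised data is still personal data under GDPR, because the controller holding the key can re-identify the subject; the measure reduces risk rather than removing the data from GDPR's scope.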

Testing your organization’s security systems is just one step in a larger process. Organizations must also practice due diligence when it comes to employee awareness. Your staff may be the last line of defense against a cyber attack, so they should be prepared to handle any security threat. Phishing-resilience training helps ensure that preparedness. Programs such as Integrity’s ActiveThreat assess employee behavior so you can train your organization more effectively: simulated phishing attacks, delivered through phony emails and messages, let your organization gauge how susceptible it is to security threats. Employees must be trained in areas such as phishing to limit their susceptibility to fraud.
Integrity is here to offer your organization help in the fields of penetration testing, phishing resilience, risk management, and GDPR implementation.

Will GDPR affect automated decision making and profiling?

Companies that use automated decision-making processes to profile consumers are taking note of GDPR. Article 22(1) prohibits decisions based solely on automated processing of personal data, including profiling, if those decisions produce “legal or similarly significant” effects on the consumer. The vagueness of this provision has produced many interpretations, so where do companies that depend on automated profiling stand?

The lack of clarity around Article 22(1) should not alarm companies that use automated profiling for price matching. The effects that such automated decision making produces are neither legal nor significant, because the price-discrimination practices of price-matching companies do not greatly affect consumers’ lives. Consumers who make purchases because of price matching are not making decisions that could critically change their lives, nor are they being denied any legal rights. Compared with practices that could produce legal or similarly significant effects, price matching is trivial.

Automated profiling decisions that could have legal or similarly significant effects are typically made by companies that play crucial roles in people’s lives: those dealing in financing, insurance, transportation, and so on. If a company denies an individual’s loan application based solely on automated decision making, for example, it significantly affects that individual’s welfare. While the decision violates no pre-GDPR laws, the rejection has the potential to greatly harm the individual’s finances. Another example of a legal effect is the imposition of speeding fines based solely on automated processing of camera evidence. The profiling activities of a shopping-assistance company that selects products tailored to certain needs are, by comparison, insignificant. Activities that could have legal or otherwise significant ramifications rightly merit human intervention; companies operating in markets of far less gravity should not have to encumber a speedy and effective operation with an unnecessary human element.

GDPR’s opacity can breed skepticism and uncertainty, even for companies that do not appear to be affected, but the wording can also work to companies’ advantage. Consider a shoe company that uses personal data to advertise directly to consumers; it does not seem like the target of Article 22(1). Yet a shopaholic might go on a shoe-shopping spree when the company’s direct-marketing advertisements appear, ultimately mortgaging his house to pay for it. The company would thus have significantly affected a data subject through automated decision making. By this line of reasoning, virtually every automated profiling process can significantly affect data subjects. Surely the European Union did not implement GDPR only for companies to find it impossible to follow. GDPR is meant to give data subjects more control over how their data is processed, not to prohibit companies from offering their services to consumers. While GDPR does rein in some practices of institutions such as lenders and insurers, it most likely does not affect the automated profiling practices of other companies, such as those that price-match.

Integrity can help your company to review its practices in order to fully comply with GDPR regulations.