The BA Fine – The Data Protection Landscape is Changing

How will the BA Fine affect the data protection landscape? After over a year of delay, the Information Commissioner’s Office (ICO) finally issued its much-anticipated Penalty Notice against British Airways on 16th October 2020.

There have been headlines and debate around the size of the penalty. But the real focus should be on what the Notice tells us about the ICO’s approach to assessing whether an organisation is following the GDPR with regards to information security.

“When organisations take poor decisions around people’s personal data, that can have a real impact on people’s lives. The law now gives us the tools to encourage businesses to make better decisions about data, including investing in up-to-date security.” – ICO press release.

We have distilled the 114-page Notice into three insights containing the key lessons to learn:

  1. The data protection landscape is changing – act now 
  2. How British Airways security flaws let data theft unfold
  3. The boardroom’s responsibilities – what are your business risks?

Join our webinar on THU, NOV 5, 2020 10:00 AM – 10:45 AM GMT where we will discuss how this fine will affect data protection.


The importance of assessing risk


The ICO’s messaging to Data Controllers frequently highlights that the law requires ‘appropriate’ measures:

“Not every instance of unauthorised processing or breach of security will amount to a breach…The obligation under Article 5…is to ensure appropriate security; the obligation under Article 32 is to implement appropriate technical and organisational measures to ensure an appropriate level of security…” (6.5, p29)

The challenge the Controller faces is to have in place a process for assessing what is “appropriate”, especially for IT security, where there are many methods and means of protecting data, each with different costs and impacts on the organisation.

The ICO noted that Articles 5 and 32 outline what is required. First, they require you to look at the organisation and the data:

“the nature, scope, context and purposes of processing…” (6.5, p29)

In the BA breach case, the ICO considered the size and profile of the organisation and the nature of its business. The ICO concluded BA should have recognised that the delivery of its services required it to process large volumes of personal data and that it was likely to be targeted by attackers.

All organisations, therefore, need to consider how their operational model (locations; service delivery model), the sector they are in (e.g. healthcare; education; charity) and their own history (rapid expansion; changes in approach) bring heightened expectations and/or risks.

Next, the Articles ask us to bring people into our assessment. We need to consider:

“…the risk to the rights of data subjects.” (6.5, p29)

The GDPR outlines, in recital 75, examples of potential risks to individuals. These range from being prevented from exercising control over their personal data, through identity theft or fraud, financial loss and damage to reputation, to physical harm. The ICO will look at the degree of damage or harm suffered (which may include distress and/or embarrassment) when identifying the circumstances in which it considers it appropriate to issue a Penalty Notice. (2.37, p16)

The ICO concluded that BA should have recognised the risk that a breach may have had significant consequences for its customers.

How did BA not recognise this? This points to our next lesson.



The importance of valuing all personal data


No special category personal data was affected by the breach.

But BA’s attempt to argue that the ICO had “severely overstated” the sensitivity of the data affected was knocked back; the ICO considers that:

“the loss of control by BA of personal data such as names, addresses and unencrypted payment card data to be particularly serious, allowing as they do the opportunity for identity theft.” (7.32, p69)

This highlights that assessing the value and sensitivity of data, and its importance to people if it were mishandled, is never as simple as applying just three categories (i.e. non-personal data, personal data or special category personal data).

In this case, the ICO noted that where full financial data is disclosed (i.e. the card details and the CVV), and where there is a high volume of data disclosed, that data takes on greater value and significance.

And we have seen the ICO take action regarding easyJet in July, asking it to make its loss of flight details public, because such details could be used to produce convincing phishing messages at a time when more people are likely to react to emails about refunds (due to the impact of Covid).

So there is a need for a more nuanced assessment with the individual’s rights and welfare at the heart of it. For example, consideration should be given to:

  1. The category of people whose data is being processed.
  2. The volume of records and volume of data per record.
  3. The data “stickiness” (how difficult or inconvenient would it be for someone to change the data if it were exposed or misused?)
  4. The data “criticality” (how much does the individual and the people they interact with rely on the data being accurate and available, and what is the degree of effect if it is not?)
  5. The data “reach” (how far outside of their direct control will the data travel for the processing?)

Adopting a “person-centred” approach to assessing the value and sensitivity of personal data will help you assess the potential risk to the rights of data subjects. Next we need to look at the final aspect for assessing what is appropriate: technology and cost.
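To make the idea concrete, the five factors above could be recorded and compared in a simple assessment tool. The following is a minimal sketch only; the 1–5 scale, the additive score and the example values are illustrative assumptions, not an ICO-mandated model.

```python
# A minimal sketch of a "person-centred" sensitivity assessment.
# The 1-5 scales and the additive scoring are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DataAssetAssessment:
    name: str
    subject_category: int  # 1-5: how vulnerable are the people involved?
    volume: int            # 1-5: volume of records and data per record
    stickiness: int        # 1-5: how hard is the data to change if exposed?
    criticality: int       # 1-5: how much do people rely on it being accurate?
    reach: int             # 1-5: how far beyond direct control does it travel?

    def risk_score(self) -> int:
        """Simple additive score; a higher score means more care is needed."""
        return (self.subject_category + self.volume + self.stickiness
                + self.criticality + self.reach)

# Example values for illustration only
card_data = DataAssetAssessment(
    name="payment card details (PAN + CVV)",
    subject_category=3, volume=5, stickiness=4, criticality=5, reach=4,
)
print(card_data.name, card_data.risk_score())  # prints the name and a score of 21
```

Even a rough scoring exercise like this forces the conversation about which data assets matter most to the people behind them, and gives IT a ranked list to protect.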



A strong relationship between Data Protection and IT experts is key


The GDPR recognises that one size fits no one. Article 32 enables a cost-benefit analysis when deciding which technical and organisational measures are appropriate for your organisation. You can consider:

  • The “state of the art” – i.e. what the current industry-standard solutions are, given the current threats and risks.
  • The “costs of implementation” – i.e. you can weigh the costs of implementing a particular approach.

The BA case highlights the importance of data protection experts working with IT experts, so that their different but related skills and knowledge can be brought together.

  • Data protection experts are responsible for prompting the organisation to define the value and significance of the different data being processed. As we saw in lesson one, this should be based on an assessment of the nature, scope, context and purposes of the processing and risks to individuals. It should also consider the operational, commercial, regulatory and ethical risks the organisation faces when handling data.
  • IT should provide expertise about the threats and risks the organisation faces, the technical options available to mitigate those risks, and the costs of implementing those options.

Working together can ensure consistent assessments of risk, so that the available options, with their benefits, costs and risks, can be presented to decision-makers and a risk-based decision made.

As well as being important for the selection of appropriate security measures, a strong working relationship is also an important factor in ensuring IT projects do not result in systems that fail to comply.

BA was found to be accidentally logging payment card details (including, in many cases, CVV numbers), the majority of which were held unencrypted, in plain text. BA stated that this logging was implemented for testing purposes; it was only intended to operate when the system was not live, but was left active by mistake. The attacker accessed 108,000 records that BA should not have been recording in the first place.

“The logging and storing of these card details…was not an intended design feature of BA’s systems and was not required for any particular business purpose…This error meant that the system had been unnecessarily logging payment card details since December 2015.” (3.22, p22)

“The fact that BA did not identify that the credit card logging feature remained active after its system went live in 2015…demonstrates a failure to adopt appropriate technical and organisational measures…and compliance with the data protection principles, including data minimisation.” (6.89-6.91, pp53-54)
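The Notice does not prescribe a particular fix, but one common technical measure against exactly this kind of accident is to redact card data before it ever reaches the logs. A minimal sketch follows; the regex patterns, logger name and redaction markers are illustrative assumptions.

```python
# A minimal sketch of redacting payment card data before it reaches log
# files. The patterns and names below are illustrative assumptions, not
# anything taken from BA's systems or the Notice.
import logging
import re

PAN_RE = re.compile(r"\b\d{13,19}\b")           # card numbers: 13-19 digits
CVV_RE = re.compile(r"(?i)\bcvv\D{0,3}\d{3,4}")  # "cvv" followed by 3-4 digits

class CardDataRedactor(logging.Filter):
    """Masks card numbers and CVVs in every record passed to the logger."""
    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        msg = PAN_RE.sub("[PAN REDACTED]", msg)
        msg = CVV_RE.sub("[CVV REDACTED]", msg)
        record.msg, record.args = msg, None
        return True  # keep the record, just with sensitive data masked

logger = logging.getLogger("payments")
logger.addFilter(CardDataRedactor())
logger.warning("test payment: card 4111111111111111 cvv 123")
```

A filter like this applied at the logger level works even when a debugging feature is accidentally left switched on, which is precisely the failure mode the ICO described.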

If Data Protection and IT experts work together from the outset of new projects, they can add their expertise, alongside input from the business, to Data Protection Impact Assessments; consider how the system can be built on the principle of Data Protection by Design and Default; and articulate the impacts (in cost, time and functionality) and benefits of potential solutions. In this way, the project can deliver business outcomes and ensure GDPR obligations are met.

In this case, the ICO noted that:

“None of the…measures [the ICO had highlighted] would have entailed excessive cost or technical barriers. They are all readily available measures available through the Microsoft Operating System used by BA” (6.72, p48)

Once this approach is up and running, there is a need to consider the next lesson…



Document decision-making and ensure paperwork matches reality


When BA tried to defend its approach, it was often unable to provide evidence of its decision making.

For example, BA tried to say that it was following guidance when not requiring Multi-Factor Authentication (MFA) for certain remote network access. But when pressed for evidence of the decision making behind this approach, the ICO flagged that:

“Given…that no copy [of the risk assessment] can now be located, it is not possible to say that BA took into consideration the risk, the state of the art, the cost, or the available technical measures when deciding what security was appropriate.” (6.26, p34).

Similarly, when the ICO asked why other measures had not been considered or implemented, BA’s arguments failed to convince:

“BA argued that it was untenable to suggest that whitelisting was an alternative in practice…However: (i) there is no evidence that BA considered what alternative measures could be put in place as an alternative to MFA…and (ii) even if BA is correct that this solution would not have proven viable, it does not obviate the need to consider appropriate measures or explain why other appropriate measures were not in place” (6.27b, p35)

You, therefore, need the paperwork to evidence and support your decision-making, especially if the decision is to accept a degree of risk.

The ICO also noted how often BA’s policies and statements did not reflect the reality of what they found happening in practice.

At a policy level, BA policy stated that Multi-Factor Authentication (MFA) would be used for all remote network access. When BA was challenged to explain why 13 of its 243 applications were not actually protected by MFA, the ICO concluded that:

“BA has not provided a satisfactory explanation as to why…it was deemed unnecessary for certain applications…to comply with the policy requiring MFA.” (6.21, p32-33).

At a strategic level, BA highlighted its “extensive commitment to information security.”  The ICO accepted BA had demonstrated commitment to certain aspects of information security and had put in place a programme to prepare its systems for the introduction of the GDPR.

Yet the ICO had to conclude that the programme had failed to identify and address the deficiencies in BA’s security that were highlighted by the attack, so BA was negligent in failing to ensure that it had taken all appropriate measures to secure personal data. (7.21 and 7.23, pp66-67).

Paperwork must therefore reflect reality. The GDPR’s accountability principle requires organisations to maintain evidence of their compliance and of the effectiveness of their measures. Records of decisions around IT security should therefore be a reliable, true and honest reflection of your approach, and of any decisions to change, amend or deviate from agreed policies and procedures.



Timely breach reporting procedures are key


The need to report all breaches to the ICO, unless you consider that the breach is unlikely to result in a risk to people’s rights and freedoms, was a major change in data protection law.

BA did act swiftly in this regard. Within two hours they had made changes to stop the breach. Within a day they had notified the ICO, acquirer banks and payment schemes, and 496,636 affected customers about the incident. (3.26-3.27, pp23-24)

The ICO took into account how swiftly BA acted, and that BA also issued a press release to 5,000 journalists and commentators and was active on television, on social media and in the press about the attack (7.42, p72).

Having a clear process by which staff can report actual or suspected breaches, which is well publicised via training and awareness materials, has never been more important.

However, the issue in this case was not the response to being made aware of the breach, but who made BA aware of it. It took a third party to inform BA that data was being sent from its website to the attacker’s site:

“The failures are especially serious in circumstances where it is unclear whether or when BA itself would ever have detected the breach. BA was only alerted to the [redirection] of personal data from its website [to the Attacker’s site] by a third-party. In the absence of that notification, the number of affected data subjects and any financial harm to them could have been even more significant” (7.10, p62)

The decision of whether to deploy software and tools to monitor and detect unusual or unauthorised activity within your systems will come down to the assessment of the risks and whether such measures are appropriate for your organisation.
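By way of illustration only (this specific measure is not named in the Notice), one such detection measure is to audit the script sources on your own public pages against an agreed allow-list, since the BA breach involved personal data being sent from BA’s website to an attacker’s site. The sketch below uses only the Python standard library; the allow-listed domains and the example page are assumptions.

```python
# An illustrative sketch: flag <script> tags on a page that load from
# domains outside an agreed allow-list. The domains and sample page are
# assumptions for demonstration, not real BA infrastructure.
from html.parser import HTMLParser
from urllib.parse import urlparse

ALLOWED_DOMAINS = {"www.example.com", "cdn.example.com"}  # assumed allow-list

class ScriptSourceAuditor(HTMLParser):
    """Collects script URLs whose host is not on the allow-list."""
    def __init__(self):
        super().__init__()
        self.unexpected = []

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        src = dict(attrs).get("src")
        if src:
            host = urlparse(src).netloc
            if host and host not in ALLOWED_DOMAINS:
                self.unexpected.append(src)

# A hypothetical page with one legitimate and one injected script
page = ('<script src="https://www.example.com/app.js"></script>'
        '<script src="https://evil.example.net/skim.js"></script>')
auditor = ScriptSourceAuditor()
auditor.feed(page)
print(auditor.unexpected)  # lists only the script from the unknown domain
```

Run regularly against live pages, a check like this gives an organisation a chance of spotting an injected script itself, rather than waiting for a third party to raise the alarm.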
