
GDPR – Some data recovery implications (part 3)

by Phil Grainger

Why Do I Need to Repeat Recovery Testing for GDPR?

As we saw in my last blog on recovery and GDPR, the relevant part of the regulation says a company must have:

“…. a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.”

When I first read this, I was somewhat perplexed. It seemed strange that once you had proved both recoverability and recovery timeliness, you would be asked to repeat that proof on request.

But then I realized that our IT world is in a constant state of change. Over time, an application might see:

  • An increase in transaction volume
  • An increase in data volume
  • A change in the way customers interact with the application, introducing new “high priority” items
  • A change in the technology supporting the application
  • A new version of the database management system, the overall operating system, or the tools being used to manage the application and its data
  • All of the above

Any one of these could compromise an organization’s ability to recover their data and maintain compliance with GDPR. It’s gratifying to see that the people who drafted the legislation realized that these changes are happening all the time, resulting in the need for repeated testing.

If you are using your disaster recovery (DR) strategy to support recovery compliance, it may be possible to re-run your GDPR testing at the same time as your next DR test, but that is likely to be time consuming and expensive to set up and execute. And don’t forget, the primary purpose of a DR test is to prove recoverability in case of a disaster – piggy-backing compliance testing onto a DR test is risky, to say the least.

This is why BMC Software introduced ESTIMATION and SIMULATION to our software recovery portfolio.

ESTIMATION uses algorithms based on the way Db2 and IMS perform data recoveries, and on knowledge of the way data has been backed up to calculate an ESTIMATE of recovery time for a set of Db2 tables or IMS databases. Being a calculation, though, it’s really a best-case estimate of recovery time; it can’t account for many of the reasons for delays during recovery. However, estimation is quick to set up and equally quick to execute. It also doesn’t cost as much to run as a full recovery, so can be repeated as often as necessary.

It is also possible to do simplistic “what-if” calculations to model data growth and answer “would we still be compliant with 15% more data?”.
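A simplistic "what-if" calculation of this kind can be sketched in a few lines. To be clear, this is not BMC's estimation algorithm – it is a hypothetical illustration that assumes recovery time grows roughly linearly with data volume, and the function names, baseline figure, and recovery time objective (RTO) are invented for the example:

```python
# Hypothetical "what-if" recovery-time model (NOT BMC's actual
# estimation algorithm). Assumes restore and log-apply times scale
# roughly linearly with data volume.

def estimate_recovery_time(baseline_minutes: float, growth_pct: float) -> float:
    """Scale a baseline recovery estimate by projected data growth."""
    return baseline_minutes * (1 + growth_pct / 100)

def still_compliant(baseline_minutes: float, growth_pct: float,
                    rto_minutes: float) -> bool:
    """Would the estimated recovery still meet the recovery time objective?"""
    return estimate_recovery_time(baseline_minutes, growth_pct) <= rto_minutes

# Example: a 40-minute baseline recovery, 15% more data, 60-minute RTO
print(still_compliant(40, 15, 60))  # -> True (roughly 46 minutes)
```

Even a crude model like this answers the headline question – "would we still be compliant with 15% more data?" – quickly enough to be repeated whenever the environment changes.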

Estimation becomes a handy tool to get an understanding of recovery and recovery compliance, but it cannot provide the proof that GDPR requires.

Historically, the only way to prove recoverability and get an accurate measure of recovery times was to actually perform the recovery, which either meant costly disaster recovery-type tests, or making a production system unavailable for a while. BMC has a better way – recovery SIMULATION.

A simulated recovery does everything a real recovery would do, but does NOT overwrite the target data being recovered. So a successful recovery simulation proves that every object (Db2 table/index or IMS database) is recoverable, AND it delivers an accurate assessment of the time taken to do the recovery. Because a simulation is simple to set up and run, you can re-run it as often and as regularly as needed.

Lastly, it is possible to analyse the output of a recovery simulation looking for bottlenecks where time could be saved.
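The bottleneck analysis amounts to ranking the per-object timings that a simulation produces. The sketch below illustrates the idea with invented object names and timings – a real simulation report's format would differ:

```python
# Hypothetical sketch of mining simulated-recovery output for
# bottlenecks. The per-object timing records are invented for
# illustration; a real simulation report's format would differ.

from operator import itemgetter

simulation_results = [  # (object name, simulated recovery minutes)
    ("CUSTOMER_TABLE", 22.0),
    ("ORDERS_TABLE", 35.0),
    ("AUDIT_LOG_DB", 48.0),
    ("LOOKUP_TABLE", 3.0),
]

# Rank objects by simulated recovery time; the slowest are the
# candidates for tuning (e.g. more frequent backups, parallel restore).
bottlenecks = sorted(simulation_results, key=itemgetter(1), reverse=True)[:2]
for name, minutes in bottlenecks:
    print(f"{name}: {minutes} min")
```

The objects at the top of such a ranking are where effort spent on tuning – more frequent image copies, parallel restore streams, and so on – buys the most recovery time back.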

If you would like to learn more, please visit our GDPR page on the BMC website at www.bmc.com/info/mainframe-gdpr.html


These postings are my own and do not necessarily represent BMC's position, strategies, or opinion.

About the author

Phil Grainger

Phil has 30 years’ experience of DB2, starting in 1987 with DB2 Version 1.2. Since then he has worked with every version, up to and including DB2 12.

From his beginnings as a DB2 DBA for one of the largest UK users of DB2 at that time, through his time at PLATINUM technology and almost 10 years as Senior Principal Product Manager at CA, to his current position with BMC Software, Phil has always been a keen supporter of user groups and is a regular speaker at both vendor-sponsored and independent events. His work with IDUG includes past membership of the European IDUG Planning Committee, induction into the IDUG Volunteer Hall of Fame, and his current role as Board Liaison for BMC Software.

Phil has been honoured by IBM as an Analytics Champion from 2009 to 2017.

Phil is now Lead Product Manager at BMC Software, working in support of their DB2 tools portfolio.

In addition, Phil is a regular contributor to the IDUG-sponsored DB2-L discussion list.