Santa Cruz Tech Beat


Why centralized data access is key for your organization becoming ‘GDPR ready’

By Daniel Mintz
Chief Data Evangelist at Looker

May 31, 2018 — Santa Cruz, CA

After the better part of two years of preparation, debate, and conjecture across the technology industry, the General Data Protection Regulation (GDPR) is finally upon us.

In the past, the impact of this type of regulatory change would have been confined to the IT and data teams. Nowadays, however, nearly everyone handles data. From customer communications to employee records and beyond, much of this information will qualify as personal data. This means, according to the GDPR, that data must be controlled, used based on published commitments, secured, and ‘deletable’.

Yet, for many companies, allowing access to data has typically meant copying, exporting, and extracting it – which leaves a trail of personal data across any number of laptops, servers, and systems, both inside the company and with third parties.

Tackling data sprawl

Once data is disconnected from the central source, people begin to rely on improvised, decentralized storage “systems” like those mentioned above. “Oh, I have that list of email addresses on my laptop.” Organizations are then left with disparate data ‘swamps’ that are impossible to search and even harder to manage and protect.

From the perspective of IT, it’s one thing to control one highly guarded fortress. It’s another challenge entirely when you don’t know how many fortresses exist, what data is inside them, or how it’s used – and when you have no record of how many keys have been copied or who holds them. This is the challenge Chief Privacy and Data Protection Officers are being presented with. It’s a problem we need to tackle as an industry – or many companies will fall victim to the GDPR’s potentially severe penalties, or to a loss of customer trust.

This is an issue that requires a long-term solution – it cannot be solved by a one-time, CIO-led data swamp cleanup. If data analysis tools encourage “data sprawl” – extracting data and moving it into ‘data workbooks’ for analysis – the problem will simply recur. Even after CIOs and IT teams have transformed their data swamps into clean, organized data lakes, their analysis tools start the problem all over again, creating a never-ending spiral of pain.

Why you need a single access point for your data

That’s why any long-term solution has to address the root of the problem.
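To make the idea concrete, here is a minimal sketch of what a single access point can buy you: every read passes through one audited choke point, and a right-to-erasure request is a single deletion because the data lives in only one place. This is an illustrative, in-memory example – the `DataGateway` class and its methods are hypothetical names, not part of Looker or any specific product.

```python
from datetime import datetime, timezone

class DataGateway:
    """Single access point: every read is audited, and erasure happens in one place."""

    def __init__(self):
        self._records = {}    # subject_id -> personal data
        self._audit_log = []  # who accessed whose data, and when

    def put(self, subject_id, data):
        self._records[subject_id] = data

    def read(self, subject_id, requester):
        # Central choke point: no data is served without an audit entry,
        # so "who has access to what" is always answerable.
        self._audit_log.append({
            "requester": requester,
            "subject": subject_id,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return self._records.get(subject_id)

    def erase(self, subject_id):
        # Right to erasure: one deletion removes the data everywhere,
        # because there is only one place it lives.
        return self._records.pop(subject_id, None) is not None

gw = DataGateway()
gw.put("u42", {"email": "jane@example.com"})
print(gw.read("u42", requester="analyst-1"))   # served, and audited
print(gw.erase("u42"))                          # True: data gone centrally
print(gw.read("u42", requester="analyst-2"))   # None: nothing left to leak
```

Contrast this with the export-and-copy pattern: once the email list is on someone’s laptop, no central `erase` call can reach it.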

Continue reading article here:

