Historically, Apache Hadoop has provided limited security capabilities, and most Hadoop implementations to date have been deployed with little thought given to security. To protect sensitive data being stored and analyzed in Hadoop, security architects should use a structured approach to security, as recommended in a recent Gartner research report.
Take a Life Cycle Approach to Securing Data in Hadoop
Data has a life cycle within an organization. It is typically created within enterprise systems, and during its life cycle it may move into and out of Hadoop and other big data repositories, as well as into backup and archival systems. Viewed this way, a Hadoop protection strategy cannot be limited to protecting only the data that resides within Hadoop itself.
A best practice is to take a data life-cycle-centric approach to the security within Hadoop that considers these issues:
- How and when is data loaded into Hadoop? Which users, groups and systems have rights to do this? If the data loads are programmatic, what controls (such as certificate-based authentication) protect programmatic access? What steps are taken to ensure the integrity and authenticity of the data being loaded?
- If sensitive data is loaded, has it been anonymized where necessary?
- Are end-user or automated extracts from Hadoop prevented? Or, if they are allowed, how is the extracted data protected? At a minimum, these activities should be logged using the DAP solutions described in this research.
- Is data backed up from Hadoop? If so, are these backups encrypted? Who has access to the backups?
- Is data archived from Hadoop? If so, how are these archives protected?
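Two of the checks above, verifying the integrity of data being loaded and anonymizing sensitive fields before ingestion, can be sketched in a few lines of Python. This is a minimal illustration, not a prescription from the Gartner report: the field names, the hashing scheme, and the CSV format are assumptions chosen for the example.

```python
import csv
import hashlib
import io

def sha256_digest(data: bytes) -> str:
    """Checksum recorded at load time so the file's integrity can be
    re-verified after transfer into the cluster."""
    return hashlib.sha256(data).hexdigest()

def anonymize_rows(rows, sensitive_fields):
    """Replace sensitive column values with a truncated one-way hash
    before the data ever lands in Hadoop (illustrative scheme only)."""
    for row in rows:
        for field in sensitive_fields:
            if row.get(field):
                row[field] = hashlib.sha256(row[field].encode()).hexdigest()[:16]
        yield row

# Hypothetical input file: 'ssn' is the sensitive field in this example.
raw = b"ssn,amount\n123-45-6789,100\n"
checksum = sha256_digest(raw)  # stored alongside the file for later verification

reader = csv.DictReader(io.StringIO(raw.decode()))
cleaned = list(anonymize_rows(reader, sensitive_fields={"ssn"}))
```

In practice the checksum would be compared against one computed by the loading job, and the anonymization policy (which fields, hashing vs. tokenization vs. masking) would come from the organization's data classification rules rather than being hard-coded.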
Learn more about what’s new with Big Data, and visit Bodhtree’s Resource Center for more information.