Implementing Policy-Based Access Control in a Hybrid Cloud Data Lake
This presentation covers the evolution of access management for a complex, distributed set of cloud resources while building IBM’s internal data platform. Our key challenge was managing access across a large variety of disparate systems. We will discuss the advantages and disadvantages we encountered along the way, the best-practice patterns we have found, and our recommendation on how to proceed.
The IBM Cognitive Enterprise Data Platform (CEDP) provides access to enterprise data from across IBM for use in discovering insights and supporting AI in a hybrid cloud environment. CEDP is a data lake that spans Public Cloud, Private Cloud, and on-premises infrastructure. It is built upon various storage solutions along with data movement, transformation, indexing, and compute services. Access to these data “systems” is granted to “users”, which may be tools, applications, or individuals. The data can often include extremely sensitive financial information, personal information, and the like. Who may access the data, and how, is regulated globally, especially for personal information, and the rules can include geographic restrictions, blackout-period restrictions, and user-nationality restrictions.
Our initial approach to access control was based on defining a set of privileges for users and granting those privileges directly against resources. Once granted, a user always had access, regardless of external constraints such as the user’s physical location. The solution included the use of Bluegroups, IAM, AccessHub, and a distributed policy management system unique to each resource. We are moving to a policy-based model of control that takes into account the attributes of the user, the data, and the platform. The new system leverages policy-based access control and can dynamically evaluate the access rules against those attributes. It also provides a centralized policy management approach that can be governed in real time.
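To make the shift concrete, the following is a minimal sketch of how attribute-based policy evaluation can work. All names, attributes, and rules here are illustrative assumptions for exposition only, not CEDP’s actual policy model or APIs; a deny-overrides combining strategy is assumed.

```python
from dataclasses import dataclass

# Illustrative sketch: attribute names and rules are hypothetical,
# not CEDP's actual implementation.

@dataclass
class AccessRequest:
    user: dict      # e.g. {"id": "alice", "location": "CH"}
    resource: dict  # e.g. {"classification": "personal", "geo_restriction": {"EU", "CH"}}
    context: dict   # e.g. {"blackout": False}

def geo_rule(req: AccessRequest) -> bool:
    # Permit only if the resource has no geographic restriction,
    # or the user's current location is within the allowed set.
    allowed = req.resource.get("geo_restriction")
    return allowed is None or req.user.get("location") in allowed

def blackout_rule(req: AccessRequest) -> bool:
    # Permit only outside a blackout period.
    return not req.context.get("blackout", False)

POLICIES = [geo_rule, blackout_rule]

def evaluate(req: AccessRequest) -> bool:
    # Deny-overrides: every rule must permit the request.
    return all(rule(req) for rule in POLICIES)

req = AccessRequest(
    user={"id": "alice", "location": "CH"},
    resource={"classification": "personal", "geo_restriction": {"EU", "CH"}},
    context={"blackout": False},
)
print(evaluate(req))  # True: location is allowed and no blackout is in effect
```

Because access is decided at request time from current attributes, a change in the user’s location or the onset of a blackout period changes the decision immediately, with no re-granting of privileges, which is the key contrast with the direct-grant model described above.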
BIO – Dr. Andreas Wespi is a Research Staff Member at IBM Research – Zurich. His current research focuses on security analytics applied to hybrid multi-cloud environments. For many years he managed the Security and Privacy Research team at IBM Research – Zurich and led projects on intrusion detection, data security, cloud security, security policy management, and privacy. At the beginning of his IBM career, he was a member of IBM’s Global Security Analysis Lab (GSAL). The GSAL made substantial contributions to IBM’s security product and service offerings. Among other things, it developed the technology behind IBM Tivoli Risk Manager, the first commercial Security Information and Event Management (SIEM) product.
BIO – Chris Giblin – I am a software engineer who has, over the course of many years, worked on a wide variety of projects, from customer engagements, to learning services, through to the ever-inspiring Zurich Research Lab where I am based. My areas of specialization are authorization policy, compliance, software architecture, and middleware programming. In recent years I have had the privilege, with many outstanding collaborators, to focus on building and operating data-intensive systems. This has included developing middleware for sales and marketing applications, IBM’s CoRE recommendation engine, and most recently serving as security technical lead for the Cognitive Enterprise Data Platform (CEDP), IBM’s internal AI data platform. Currently I am busy extending security features for IBM Cloud Event Streams and developing approaches to automating compliance.
BIO – Grant Miller – I am an enterprise architect and a software engineer who has worked in several groups in IBM over my career including Software Group, Systems Group, Supply Chain, CIO, and currently the IBM Global Chief Data Office where I am the Chief Architect for technology enablement.
BIO – Ilya Hardzeenka – I am a solution architect and software engineer who has worked on several projects at IBM and Rockwell Automation. Initially at IBM I was involved in integrating Kenexa with other IBM HR systems after it was acquired. I then moved to Rockwell Automation as a Big Data Architect to lead the team and develop a Global Data Lake on top of Hadoop. After two years I returned to the IBM Global Chief Data Office (GCDO) as Solution Architect for CEDP Co-Creation projects on Public Cloud. Now I am working as Category Architect and Technical Delivery Lead in the Platform Security area. I am focused on designing and developing security services in CEDP as well as ongoing maintenance of existing services and infrastructure.
Research Staff Member, IBM Research - Zurich (IBM)
Senior Software Engineer, IBM Research - Zurich (IBM)
Senior Technical Staff Member, Chief Architect, Global Chief Data Office
Category Architect and Technical Delivery Lead
Breakout session times: 6:00 am, 8:00 am, 8:04 am, 12:00 pm, 1:00 pm, 4:04 pm, and 8:00 pm UTC