My medium.com tagline states that I operate at the intersection of API Management, Integration and Identity. This past week I was at Identiverse — a gathering of people working in the field of identity — so it was a natural place for me to be. Identiverse describes itself as:
“People, applications and devices are gravitating toward a digital world where they all recognize and interact with one another. This digital world is being built on a foundation of identity security designed by a community of people with a shared vision, from technologists and practitioners to thought leaders and end users. This is the identiverse.”
As is typically the case, I had the opportunity to catch up with many friends, clients and past colleagues — Derek, Karen, Andrew, Tyler, Tom, it was great seeing or meeting all of you. I admit, I spent a lot of time on the API and Microservices tracks — I’m a creature of habit. I was recently given an opportunity to write a couple of blog posts for the Ping Identity [Elastic Beam] blog. To that end, I’ve tried to weave the general themes I saw at Identiverse into a brief narrative that links back to some of the other posts I’ve published on Medium.com and elsewhere.
Of course, the big news coming out of Identiverse 2018 is the strategic acquisition of Elastic Beam by Ping Identity. This acquisition combines the API threat detection capabilities of Elastic Beam (now Ping Intelligence) and the industry-leading identity stack offered by Ping Identity. The combination of technology stacks makes sense and interesting things will be possible with it going forward.
Several recurring themes stood out to me across the sessions:

OAuth2 isn’t perfect, but it is an evolutionary step forward from earlier identity protocols.
We’re moving to a passwordless world dominated by public key cryptography (PKC)-based solutions.
The FIDO2 standards are the path forward to achieving that passwordless world.
A large outstanding problem in identity is how to achieve centralized fine-grained authorization policy decision points (PDPs) with decentralized policy enforcement points (PEPs) in a way that is standards based and integrates seamlessly with the existing IdP paradigms.
Use a standards-based approach to security—this has been one of my personal favorites for a decade.
Whenever possible, I have provided links to the slide decks that were used during the presentations. The 2017 slides weren’t uploaded to slideshare.net until a bit after that conference ended, so I will update this post once this year’s slides become available. Some individual authors uploaded slides to various sites — thank you for that.
Identity and Access Management at NIST The first session I went to was “Identity and Access Management at NIST” by Kevin Stine. This presented an overview of some of the recent work that NIST has done in the identity standards space. It stayed high level and didn’t get into the weeds. Overall, the session reinforced the recurring theme in my own writing about the importance of standards and using a standards-based approach to application security models. NIST has published a number of identity-related standards — Digital Identity Guidelines. This includes SP 800–63–3, which got everyone excited in 2016 when it deprecated the use of SMS text messages as an acceptable form of two-factor authentication (2FA) — NIST explains this further here. It also includes SP 800–63b, which provides new guidelines for passwords and what constitutes a secure password — a summary of these new guidelines can be found here.
Next, I went to the “Beyond API Authorization” session presented by Jared Hansen. Jared is exploring how to apply the existing OAuth2 and OpenID Connect families of specifications to fine-grained authorization (FGA). His examples centered around the use of something like Google Docs and how the audience and scope fields in a JSON Web Token (JWT) acting as an OAuth2 access token could be used to implement fine-grained access control decisions at the resource server. This is a complex problem that has several issues that must be addressed.
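To make the idea concrete, here is a minimal sketch of a resource server checking the audience and scope claims of a JWT access token before allowing access. All names are hypothetical, and an HMAC-signed token stands in for a token issued by a real authorization server:

```python
import base64, hashlib, hmac, json

SECRET = b"shared-demo-secret"  # hypothetical key shared with the authorization server


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_jwt(claims: dict) -> str:
    """Build a minimal HS256 JWT (header.payload.signature)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def authorize(token: str, expected_aud: str, required_scope: str) -> bool:
    """Resource-server check: valid signature, matching audience, required scope."""
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    return claims.get("aud") == expected_aud and required_scope in claims.get("scope", "").split()


token = make_jwt({"sub": "alice", "aud": "docs-api", "scope": "docs.read docs.share"})
print(authorize(token, "docs-api", "docs.read"))    # True
print(authorize(token, "docs-api", "docs.delete"))  # False
```

The interesting design question, as Jared framed it, is how far checks like `required_scope` can be pushed before the authorization server needs to know too much about the application.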
Jared was focused on how to move the FGA decision into the authorization server. I asked some questions about how he planned to capture the authorization policy that needed to be evaluated; in particular, I asked about XACML and ALFA. His talk was primarily focused on defining the problem, not the solution, which is fair. I’ve had to deal with this very issue at multiple client sites where an API Gateway was being deployed. There is something clean and elegant about having all the authorization policies defined, managed, evaluated and enforced at one point; however, that requires presenting all the information in a request and its run-time environment to the IdP on potentially every API call. This can quickly become inefficient and also requires more (potentially sensitive) information to be presented to the authorization server than is desired.
At some point, the application-specific knowledge becomes so great that it makes more sense to put the FGA logic in the resource server (API Gateway or API Provider) rather than the authorization server. There is a gray area here. In practice, most clients I have worked with have had some type of coarse-grained authorization (CGA) decision made at token issuance time (on the authorization server) and additional CGA decisions and FGA decisions made at the resource server.
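A minimal sketch of that split, with all names hypothetical: the authorization server makes the coarse-grained scope decision at token issuance, and the resource server layers an application-specific fine-grained check on top:

```python
# CGA lives with the authorization server; FGA lives with the resource server.
ALLOWED_SCOPES = {"docs.read", "docs.write"}          # known to the authorization server
DOCUMENT_OWNERS = {"doc-1": "alice", "doc-2": "bob"}  # known only to the resource server


def issue_token(subject: str, requested_scopes: set) -> dict:
    """Coarse-grained decision: grant only the scopes this server allows at all."""
    return {"sub": subject, "scope": requested_scopes & ALLOWED_SCOPES}


def can_write(token: dict, doc_id: str) -> bool:
    """Resource-server decision: coarse scope check plus fine-grained ownership check."""
    if "docs.write" not in token["scope"]:
        return False
    return DOCUMENT_OWNERS.get(doc_id) == token["sub"]


token = issue_token("alice", {"docs.write", "admin"})  # "admin" is silently dropped
```

The ownership table is exactly the kind of application-specific knowledge that would be painful to ship to a central PDP on every call.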
Microservices Security Landscape Next, I went to the “Microservices Security Landscape” by Prabath Siriwardena. This session explored the challenges of securing microservices. It covered a lot of territory. Some interesting points I wrote down were:
Security services change the security story.
An API Gateway solves the multiple entry point problem, but there is more that can be done in the security space.
The session assumed the use of self-contained tokens (specifically, JWTs) as OAuth2 access tokens; most of the presentations I sat in assumed this.
Use OAuth2 Token Exchange to obtain the downstream token needed for the next hop.
Use Embedded PDPs for access control policies in each microservice container. The policies can still be centrally managed.
To facilitate immutable containers while still being able to dynamically update authorization policies, store such policies in memory.
Use sidecar pattern to implement common security features like PDP/PEP.
Use the service mesh pattern to deploy all the security related components (including introspection service, PDP/PEP, UserInfo) together with the microservice in a single pod.
Called out that using JWT for OAuth2 access tokens requires that a token revocation mechanism be created.
The benefit of JWT rather than mutual-authentication SSL/TLS (MASSL) for securing server-to-server communication is the ability to set custom claims.
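To illustrate the token exchange point above: under the OAuth2 Token Exchange specification, a service holding an incoming token POSTs a form-encoded request to the token endpoint to obtain a token for the next hop. A sketch of building that request body (the token value and audience are hypothetical):

```python
from urllib.parse import urlencode

# Hypothetical access token received by the first microservice in the chain.
incoming_token = "eyJhbGciOi.example.incoming"

# Token exchange request the service would POST to the authorization server's
# token endpoint to obtain a downstream token scoped to the next service.
params = {
    "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
    "subject_token": incoming_token,
    "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
    "audience": "https://service-b.example.com",  # hypothetical downstream audience
}
body = urlencode(params)
```

The response would carry a new access token whose audience is the downstream service, preserving the original subject.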
Microservices architecture & security. How (not) to?
Next, I went to the “Microservices architecture & security. How (not) to?” session presented by Bertrand Carlier. Several options were reviewed for securing services (authentication and authorization), whether from an API Gateway to back-end services or between front-end and back-end services. These included passing identity information in HTTP headers, transmitting the same token end-to-end, use of OAuth scopes, and the use of token exchange. The third and fourth options start to look like what I have been writing about and used on recent API-related projects. The pros and cons of each of these were presented. There are many other ways of securing APIs, of course, but the same difficulties remain with all options: managing the keys used to authenticate services and sign tokens, and defining, maintaining and centralizing fine-grained access policies. The rest of the presentation described several case studies from Wavestone customers.
Finally, a few guidelines were presented to balance API security design:
Different contexts will result in different architectures (security requirements, automation capabilities, gateways vs agents vs micro-gateways).
Token transmission and scope management will fit most security requirements (secure enough for most cases, relatively easy to implement).
Consider other options to cover additional security requirements (service-to-service authentication, token exchange or nested self-issued JWTs).
There are many security building blocks; the main difficulty is assembling them without mistakes.
Reuse existing components within an organization. This can be challenging; it isn’t a technology problem. In my experience, it is a failure of leadership.
One final concept that was described in this talk is the use of nested self-issued JWTs to describe the component identities that a request has traversed for auditing purposes. I haven’t explored this in blogs or used it at client sites. We’ll look at that in a future post.
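Since I haven’t written about nested self-issued JWTs before, here is a rough sketch of the idea. HMAC signatures and the issuer-in-header layout are simplifications, and all service names are hypothetical; each hop issues its own JWT that nests the token from the previous hop, so the full path can later be unwound for auditing:

```python
import base64, hashlib, hmac, json

# Hypothetical per-service signing keys; in practice each service would hold
# its own asymmetric key pair and issue JWS-signed tokens.
KEYS = {"gateway": b"k1", "orders": b"k2", "billing": b"k3"}


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def self_issued_jwt(issuer: str, claims: dict) -> str:
    """Each service signs its own token with its own key."""
    header = b64url(json.dumps({"alg": "HS256", "iss": issuer}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = b64url(hmac.new(KEYS[issuer], f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


# Each hop nests the previous hop's token, recording the path of the request.
t1 = self_issued_jwt("gateway", {"sub": "alice"})
t2 = self_issued_jwt("orders", {"sub": "alice", "nested": t1})
t3 = self_issued_jwt("billing", {"sub": "alice", "nested": t2})


def audit_chain(token: str) -> list:
    """Unwind the nested tokens to recover the components the request traversed."""
    chain = []
    while token:
        header_b64, payload_b64, _ = token.split(".")
        header = json.loads(base64.urlsafe_b64decode(header_b64 + "=" * (-len(header_b64) % 4)))
        payload = json.loads(base64.urlsafe_b64decode(payload_b64 + "=" * (-len(payload_b64) % 4)))
        chain.append(header["iss"])
        token = payload.get("nested")
    return chain


print(audit_chain(t3))  # ['billing', 'orders', 'gateway']
```

An auditor holding the final token (and the verification keys) can reconstruct and verify every hop.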
Threat Tolerant IAM Micro Services Architectures Next, I went to the “Threat Tolerant IAM Micro Services Architectures” session presented by Rakesh Radhakrishnan. This session described a paper KPMG published last year that recasts IAM as a set of microservices. They see all the major IdP vendors moving toward this model. Based on a POC that KPMG did, one could split the identity stack into 12–20 different microservices. This is a fusion of model-driven architecture, domain-driven architecture and IAM microservices. Threat-centric IAM takes into account threats specific to IAM tools. There were several interesting ideas presented in this session that I don’t have enough room to go into here. The slide deck is worth a read.
Moving Beyond the Password: The State of FIDO Standards Adoption Next, I went to the “Moving Beyond the Password: The State of FIDO Standards Adoption” session by Brett McDowell. The general theme of this session (and the FIDO Alliance) is that the industry needs to move beyond passwords. The FIDO (Fast Identity Online) Alliance was formed in the summer of 2012 by several companies; it was launched publicly in February 2013. The rest of its history can be found here. The original idea was to use biometrics for identification instead of a password and to do it in a way that was based on open standards. The idea has been evolving since then. From the history, the early “discussions led to the basic design insight that a device biometric should be used to unlock a cryptographic key housed on the device. This key would be registered with a server and subsequent authentications would require exhibiting a signature based on the key — thus enabling a passwordless login backed purely by local authentication.”
The presentation opened with some stats to get your attention:
In 2016, 81% of hacking incidents used weak, default or stolen passwords.
1 in 14 phishing attacks was successful in 2016.
In 2017, there were 1,579 breaches, a 45% increase over 2016.
So that’s all bad and we need to do better (my paraphrasing of what the presenter had to say). FIDO replaces passwords with something that is more secure and gives a better user experience. That better user experience comes in the form of something that is faster than password entry and easier to use.
The FIDO Alliance and the W3C maintain the FIDO2 standards. The FIDO v1.0 specs were released in December 2014; the latest versions of these specs can be found here. The FIDO v1.0 specs include the UAF (Universal Authentication Framework) and U2F (Universal 2nd Factor) specifications. The original FIDO2 specs can be found at the same location (scroll down). The major components are WebAuthn (now a W3C specification) and CTAP (Client To Authenticator Protocol). CTAP supports USB, NFC and Bluetooth.
Several companies have already begun using FIDO2 standards — you may be using them and not even know it.
The goal is to take user vulnerability out of the equation. The OTP (one-time password) MFA (multi-factor authentication) and shared secrets (passwords) are replaced with a FIDO key that is protected by biometrics or another allowed method. FIDO standardizes the interaction between the client device and the FIDO authenticator that holds the key (CTAP). It also standardizes the interaction between the device and the service (WebAuthn). FIDO verifies that the user was present during the identification process by requiring a user gesture before the private key can be accessed. FIDO does not mandate the modality of user interaction — passwords, keys, etc. FIDO provides “high-assurance strong authentication” via the use of two or more authentication factors, where at least one leverages public key cryptography, and is not susceptible to phishing, man-in-the-middle or other attacks.
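To illustrate part of what WebAuthn specifies, here is a simplified sketch of the relying-party checks on the authenticator data returned during an assertion: the rpIdHash must match the relying party’s ID and the user-presence (UP) flag must be set. Signature verification over the authenticator data is omitted, and the helper that fabricates authenticator output is purely hypothetical:

```python
import hashlib, struct

def make_authenticator_data(rp_id: str, user_present: bool, sign_count: int) -> bytes:
    """Fabricate authenticator output: rpIdHash (32 bytes) || flags (1) || signCount (4)."""
    flags = 0x01 if user_present else 0x00  # bit 0 is the user-presence (UP) flag
    return hashlib.sha256(rp_id.encode()).digest() + bytes([flags]) + struct.pack(">I", sign_count)


def check_assertion(auth_data: bytes, expected_rp_id: str) -> bool:
    """Relying-party checks: correct rpIdHash and user presence confirmed.

    A real implementation would also verify the authenticator's signature over
    auth_data || SHA-256(clientDataJSON) with the registered public key.
    """
    rp_id_hash, flags = auth_data[:32], auth_data[32]
    return rp_id_hash == hashlib.sha256(expected_rp_id.encode()).digest() and bool(flags & 0x01)
```

The user-presence bit is how the standard encodes the “user gesture” requirement described above.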
The major browsers support or will be supporting FIDO2 soon.
When Standards Don’t Suffice To mix things up a bit, the next session I went to was entitled “When Standards Don’t Suffice” by George Fletcher. A standards-based approach to security is a good thing, but it isn’t always the most straightforward goal to achieve. Sometimes the specs don’t cover your exact use case. This can be frustrating for the uninitiated and often drives developers toward simply creating a custom solution, which tends to lead to insecure results. The speaker described the experiences of merging the identity stacks of Yahoo and AOL at the new Oath parent company.
There are many specs in the OAuth2 and OIDC families of specs. Understanding these specs and what situations they apply to can be daunting if you don’t use them every day. That, combined with a situation where they don’t exactly match your use case, can create a complex obstacle to overcome.
To help address the situation, the presenter provided the following principles:
Principle #1: Look for what exists.
Principle #2: Study other extensions. Find the spec or flow that most closely fits what you are trying to do. Then, extend that.
Principle #3: Follow patterns. Research how other people have extended that flow or grant. Understand security implications of previous work.
Principle #4: Contribute back. If it is useful, make it a standard.
He also mentioned:
Don’t succumb to NIH (not invented here) syndrome.
Don’t implement something proprietary. There are usually better solutions and inherent security benefits derived from the work others have done.
What is Wrong with OAuth? Next, I went to the “What is wrong with OAuth?” presentation by Justin Richer. Justin is one of the authors of “OAuth 2 in Action.” According to Justin, the following problems exist with OAuth2:
OAuth2 became a family of standards.
The Implicit Grant and Resource Owner Password Credentials Grant are used for far more than was ever intended.
Not enough focus on mobile in the original OAuth2 RFC.
For the first one, this has happened to every standard in the identity space. Just look at Kerberos and SAML.
Justin had one or two other items in there, but I didn’t write them down — battery died about that time. All the slides from Identiverse will be posted on Slideshare, so I will update this post when Justin’s slides are published.
Token Binding Of all the sessions I went to, this next one is probably the most relevant to my day-to-day activities — I know I’m a nerd. The “Token Binding” session presented by Brian Campbell is of particular interest to anyone who has API consumers making API calls to an API Provider or API Gateway. That OAuth2 access token is a type of bearer token. As such, anyone (including the bad guys) who happens to get their hands on it can use it to access a Resource Server (API Gateway or API Provider in my imagined use case).
You can see an earlier post I wrote (originally for Apigee), regarding the use of Bearer Tokens with APIs (using OAuth2 and OIDC) for more information about a typical use case. I describe what a bearer token is in this post.
Because of its nature, a bearer token (in any form) can be used by any third party that happens to get hold of it. This is far from ideal and makes securing the token at every hop along the path it flows through critically important. A decade ago, if we were talking about SAML Assertions (bearer token subject confirmation method) embedded in the WS-Security header of a SOAP request, we could bind the token to a request and ensure that it isn’t reused by digitally signing the WS-Security header (and probably other parts of the message) with a private key belonging to the message sender. A significant amount of management overhead and complexity went along with doing it, but there was a standards-defined way of doing it. Until now, there wasn’t a standards-defined way of addressing the same problem with OAuth2 access tokens and API requests.
Now there is. Enter Token Binding (that is the 2017 Identiverse presentation). This is a set of several IETF proposals that work together to give the end result I described in the last paragraph. Token Binding ties a bearer token to a particular user/browser session. It addresses a fundamental problem of bearer token use. It does this by binding the token to a key pair that the client proves possession of on the underlying TLS connection. Since only the legitimate client holds the private key, it will not be possible to use a leaked bearer token in other contexts (on other TLS connections).
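Token Binding itself operates down at the TLS layer, but the related proof-of-possession idea can be sketched at the token level: the authorization server embeds a thumbprint of the client’s public key in the token’s cnf (confirmation) claim (RFC 7800), computed as a JWK thumbprint (RFC 7638), and the resource server then demands proof of possession of the matching private key. A minimal sketch, with hypothetical key values:

```python
import base64, hashlib, json

def jwk_thumbprint(jwk: dict, required: tuple) -> str:
    """RFC 7638: hash the required JWK members, lexicographically ordered,
    serialized with no whitespace, using SHA-256."""
    canonical = json.dumps({k: jwk[k] for k in sorted(required)}, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode()).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()


# Hypothetical EC public key held by the client (coordinate values are dummies).
client_jwk = {"kty": "EC", "crv": "P-256", "x": "x-demo-value", "y": "y-demo-value"}

# The authorization server embeds the thumbprint in the token; a resource
# server that receives this token can refuse it unless the caller proves
# possession of the key matching the thumbprint.
access_token_claims = {
    "sub": "alice",
    "cnf": {"jkt": jwk_thumbprint(client_jwk, ("crv", "kty", "x", "y"))},
}
```

A stolen token is then useless to an attacker who does not also hold the bound private key.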
The presentation focused on the use case of a browser to a web server (or other HTTP-enabled endpoint). It will be interesting to see what kind of support emerges for server-to-server communication where end-to-end identity propagation is being used. I look forward to seeing this capability becoming mainstream.
Zero-Knowledge Proofs The last session I wanted to mention is Clare Nelson’s “Zero-Knowledge Proofs” presentation. A zero-knowledge proof is a mathematical technique that lets one party prove it knows certain pieces of information without having to expose that information to another party. That piece of information could be a password or another type of credential used as part of an authentication step. This is still a somewhat abstract idea that isn’t being incorporated into real-world security solutions yet, but it gives you an idea of where the future may go. This post has a good write-up on the topic.
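The classic teaching example is a Schnorr-style identification protocol: the prover convinces the verifier that it knows the discrete log x of y = g^x without revealing x. A toy sketch (the group parameters are deliberately tiny and for illustration only; real systems use large, standardized groups):

```python
import secrets

# g generates a subgroup of prime order q in Z_p* (2 has order 11 mod 23).
p, q, g = 23, 11, 2


def prove(x: int):
    """Prover: commit to a random r, then answer the verifier's challenge."""
    r = secrets.randbelow(q)
    t = pow(g, r, p)                # commitment sent to the verifier
    def respond(c: int) -> int:
        return (r + c * x) % q      # response reveals nothing about x on its own
    return t, respond


def verify(y: int, t: int, c: int, s: int) -> bool:
    """Verifier: accept iff g^s == t * y^c (mod p)."""
    return pow(g, s, p) == (t * pow(y, c, p)) % p


x = 7                  # prover's secret
y = pow(g, x, p)       # public value; we prove knowledge of x without revealing it
t, respond = prove(x)
c = secrets.randbelow(q)            # verifier's random challenge
print(verify(y, t, c, respond(c)))  # True
```

The verifier learns that the prover knows x, but the transcript (t, c, s) can be simulated without x, which is what makes the proof zero-knowledge.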
Closing Thoughts This was my first time at Identiverse. I had an opportunity to go a few years ago, but the life of an independent consultant dictates taking the job that actually pays when it presents itself. It was exciting to be around a large group of people interested in the same topics that I am. There were many sessions I wasn’t able to attend. If you’re interested in topics not on the tracks I attended or looking for more information about sessions from this year’s conference or previous years, check out the Identiverse Slideshare channel.