Archive

Posts Tagged ‘saml’

SAML joins the IT zombie legions?

I’ve had the privilege to witness many IT funerals. By my reckoning, mainframes, CORBA, PKI, AS400, NIS+, and countless others are all dead according to the experts. Of course, that means nearly every customer I talk with is overrun with zombies, because these technologies are still very much alive, or at least undead, in their infrastructures. They are spending tons of money on them. They are maintaining specialized staff to deal with them. And, most importantly of all, they are still running revenue-generating platforms on them. Now some of the venerable folks speaking at CIS2012 want to count SAML among the undead. It’s a sign of the ever-increasing pace of IT. SAML, if it’s dead, will be leaving a very handsome corpse. But I think it’s safe to say SAML will be with us for a very long time to come. This meme feels like another flashpoint in the tension between thought leaders like the folks discussing this on Twitter (myself included) and the practitioners who have to answer to all the folks in suits who just want to see their needs met. I try to split the difference. It seems to me that the only thing that makes something dead is when people are actively trying to get away from it because they are losing money on it. SAML is nowhere near that. But if dead is defined as not being a destination but rather a landmark in a receding landscape, then maybe it has died. But it’s chasing after us, hungry for our budgets and offering imperviousness to pain in trade for that funding, which sounds like some kind of zombie to me. Using SAML will make you impervious to the pain of being so far ahead of the curve that there is no good vendor support, impervious to the pain of having so few people with talent in your platform that you can’t get things done – or have to pay so much to get things done you may as well not do them – and impervious to the pain of being unable to get what you need done because there aren’t enough working examples of how to do it. Based on what I hear from practitioners, they may like being impervious to all those pains. So the IT zombie legions grow…


What I learned at @kuppingercole’s #EIC11: #identity #IAM #privacy and secrets

I must admit to being very selfish at this year’s EIC. Instead of going to the sessions that would likely have been most useful to Quest, I went to those that spoke most strongly to my own curiosities. The first thing I did was explore how vendors, users, and analysts feel about standards. It seems like it’s a chicken-and-egg subject – still. Users wait for vendors to adopt standards. Vendors wait for users and analysts to put force behind them. And in the meantime, only obvious success (SAML) and obvious need (XACML) seem to get standards investment and attention. The most interesting moment of this leg of the journey was when @OASISopen’s Dr. Laurent Liscia asked from the keynote stage how many people in the audience were vendors, to “make sure we’re not talking to ourselves.” Apparently we weren’t, but it was an interesting glimpse into how the whole notion is perceived even by those most dedicated to the cause of standards.

I also went to an absolutely fascinating deep dive into EU privacy and data protection law, which was hosted by Dr. Jörg Hladjk of Hunton & Williams LLP. Perhaps the most interesting thing I walked away with was a new sense of how fragile these protections really are. I think people in the US tend to think of these laws as very intimidating and forceful. But that likely comes from the vastly complicated contracts, audits, and procedures (paperwork) needed to deal with the laws. However, two shocking things became clear over the course of the day. First, almost any reasonable legal basis can be used to get at the data. A person can sign away all the protection in a single stroke – as anyone who agreed to the terms to get an iPhone in the EU has done in some part. And, because the framework is so much more comprehensive, something like a EULA, which is routinely cast aside in US cases since it’s seen as so flimsy, is much more forceful in the EU, since the user is deemed to be so much better informed and protected by the framework. Second, there are cases where protections in the US are stronger than in the EU. A good example is breach notification, where a data steward is forced to notify you of some event that may have compromised your PII. It seems that between NSTIC, efforts at the state level (like California’s proposed “social media” law), and other things in the works, the US may actually come out ahead of the game in a practical sense within the decade.

The last lesson was a pleasant surprise: nearly all identity-minded people are closet philosophers. Anyone reading this is likely to know my undergraduate (and only) degree is in philosophy, and perhaps also that I still indulge that impulse as often as I can. Dr. Emilio Mordini, CEO of the Centre for Science, Society and Citizenship (CSSC), gave a keynote on the nature of secrets that was a HUGE hit. Not to say everyone agreed with all his views. In fact, @NishantK and @ggebel both took shots at his ideas from the same keynote stage later. The idea that drew the most criticism was Dr. Mordini’s very unpopular conclusion that one shouldn’t worry about securing data, but rather about tracking data. He feels it’s less important to keep a secret than to keep track of who knows the secret. All of this flows from his central thesis that all secrets are Pulcinella secrets – not really secrets but rather, like a secret in a small town, something everyone knows but no one says, so long as the parties involved have the power to motivate everyone not to say it in the town square. Tim Cole goes into all the details of the Pulcinella story on his own blog. The truth of all of it is left as an exercise for the reader. But the thing that made me happy was the abstract conversations in the hallways and bars for the rest of the conference as everyone digested the deeply interesting issues that were raised and what they meant in our shared context of identity, access, privacy, and technology.

Policy Translation – The Art of Access Control Transcends RBAC, ABAC, etc.

After some holidays, lots of internal meetings, and some insane travel schedules, things are settling back down this week just in time for me to head to TEC, so I can get back to spending time with Quest’s customers and partners and having great discussions with people. In the last week, I had three excellent conversations: one with a panel of folks moderated by Martin Kuppinger from Kuppinger & Cole, set up by ETM [link to podcast site]; another with Don Jones and an audience of folks asking questions, set up by redmondmag.com [link to webcast]; and the third just today with Randy Franklin Smith [link to webinar site]. All these discussions revolved around managing identity (of course); they focused on the business’s view of IAM, wrapping proper security controls around Active Directory, and controlling privileged user access, respectively. Even though the subjects seemed quite far apart, a common question emerged: how do you translate the policy the business has in mind (or the auditor has in mind) into something actionable that can be enforced through a technical control? Put another way, the problem is how to take wishes expressed in business terms and make them come true with technology. To me, this is the central question in the IAM world. We have many ways to enforce controls, many ways to create compound rules, many ways to record and manage policies. But the jump from a policy to a rule is the tricky bit.

Let’s take an example and see what we can do with it. Everyone in the US and many around the world know SOX, and most who know it are familiar with section 404. There is a great Wikipedia article about SOX section 404 if you want to brush up. Section 404 states that it is “the responsibility of management for establishing and maintaining an adequate internal control structure and procedures for financial reporting.” While this makes sense, it’s hardly actionable. And businesses in the US have relied on many layers of committees and associations to distill this. What is that process? It’s lawyers and similarly minded folks figuring out what executives can be charged for if they don’t do things correctly in the face of vague statements like the one above. So they come up with less and less vague statements until they have something they feel is actionable. Of course, what they feel is actionable and what some specific IT department sees as actionable may be quite different.

From that filtering at the high levels, above any single business, you get a statement like “Understand the flow of transactions, including IT aspects, sufficient enough to identify points at which a misstatement could arise,” which comes from the work done by the SEC and PCAOB to interpret SOX section 404. That approaches something IT can dig into, but it’s hardly actionable as is. But now a business can take that, bring it inside the organization, and have its executive management and IT work out what it means to them. Of course, there are scads of consultancies, vendors, and others who would love to assist there. Your results may vary when it comes to those folks, or your own folks, being able to make these statements more or less actionable. With this specific statement about the “flow” of data and not allowing a “misstatement” to arise, there is general agreement that having IT staff with administrative powers that could, in theory, alter financial data is a risk that needs a control. And from that general agreement has arisen an entire market of privileged access management products that keep people who need administrative rights for operational tasks in IT infrastructure from using those rights to somehow change data that would be used in any kind of financial reporting (or from using that access to do any number of other things covered by other sections of SOX or other regulations like PCI, etc.).
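
To make that last jump from agreement to control concrete, here is a minimal, hypothetical sketch of the kind of rule a privileged access management layer ends up enforcing: administrators keep the operational rights they need but are denied writes to anything that feeds financial reporting. The resource names, actions, and the rule itself are my own illustrative assumptions, not any product’s policy model.

    # Hypothetical distillation of the SOX 404 "flow of transactions" concern:
    # IT staff with administrative rights must not be able to alter data used
    # in financial reporting. All names below are illustrative assumptions.

    FINANCIAL_RESOURCES = {"general_ledger", "quarterly_report_data"}
    OPERATIONAL_ACTIONS = {"restart_service", "apply_patch", "read_logs"}
    WRITE_ACTIONS = {"write", "update", "delete"}

    def admin_request_permitted(action: str, resource: str) -> bool:
        """Decide whether a privileged (admin) request passes the distilled control."""
        # Deny any modification of resources that feed financial reporting.
        if resource in FINANCIAL_RESOURCES and action in WRITE_ACTIONS:
            return False
        # The operational tasks the admin rights exist for are still allowed.
        return action in OPERATIONAL_ACTIONS or action == "read"

    # An administrator patching a server is fine; altering the ledger is not.
    assert admin_request_permitted("apply_patch", "erp_app_server")
    assert not admin_request_permitted("update", "general_ledger")

In practice a rule like this lives in a PAM product or a gateway in front of the infrastructure rather than in application code, but the translation exercise is the same.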

What should be apparent is that things like RBAC, ABAC, and rule-based approaches to access control are all simple and straightforward when compared to taking policy and making it actionable. Putting an RBAC system into place is taking action. But, as anyone who has been through an RBAC rollout will tell you, the hardest bit is figuring out the roles. And figuring out the roles is all about interpreting policies. So what is the answer for all those folks on these webcasts who wanted to know how to master this art? The short answer is like the old joke about how you get to Carnegie Hall: practice. The medium-length answer is to find a consultancy and a vendor that you trust, that have had the right amount of practice, and make them do it for you. The long answer is to follow the path I took above in explaining the question. You need to analyze the requirements, break them down, and keep doing that until you start getting statements that look slightly actionable. Of course, that takes a huge amount of resources, as evidenced by all the money that’s been spent on SOX alone in the US (that same Wikipedia article quotes one study putting the cost at 1.7 trillion USD). And the final trick is to take your actions and breakdowns back to the top – to your auditor or CISO or whoever started the chain – and validate them. That’s a step that gets skipped all too often. And then you see million-dollar projects fail with one stroke of an auditor’s pen.
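
And for a flavor of what “actionable” looks like at the end of that breakdown, here is a hypothetical sketch of roles distilled from the statements above, along with the validation step back against the original policy; every role name and entitlement is an assumption for illustration only.

    # Hypothetical end-product of the breakdown: roles as named bundles of
    # entitlements, plus a check back against the distilled policy statement.
    # Role names, entitlements, and the rule itself are illustrative assumptions.

    ROLES = {
        "it_operations": {"restart_service", "apply_patch", "read_logs"},
        "financial_analyst": {"read_ledger", "write_ledger_adjustments"},
    }

    # The distilled policy from the SOX 404 exercise: operational IT roles must
    # not carry entitlements that can alter financial reporting data.
    FORBIDDEN_FOR_IT = {"write_ledger_adjustments"}

    def validate_roles(roles: dict) -> list:
        """Return role names that violate the distilled policy (the 'take it back to the top' step)."""
        return [
            name for name, entitlements in roles.items()
            if name.startswith("it_") and entitlements & FORBIDDEN_FOR_IT
        ]

    # An empty list means the role model still matches what the auditor signed off on.
    assert validate_roles(ROLES) == []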

Identity Myth: SSO is Hard; Truth: Old Apps Suck

December 1, 2010

I sat down with a very smart group of folks and they were saying how they think SSO is very, very hard. If your world is all Active Directory (AD), it’s easy. But that is true in a tiny percentage of the world. Everywhere there is some oddball application, and in most places there are just as many applications not using AD as there are using it (even if they buy Quest solutions, sadly). The cloud, something everyone is forced to mention in every tech blog post, also complicates this. How do you do SSO when the identities aren’t under your control? Or, reverse that: how do you get SSO from your cloud vendor when your on-premises applications aren’t under their control? But every time I have the SSO conversation at length with people, the conclusion is always the same. If all you have are applications from the last 10 years and some cloud stuff, there are approaches, including Quest’s, that can fully solve that problem. You can integrate into your commodity AD authentication, put up SSO portals, or use widely adopted standards like SAML – or all of the above in a clever combination. Even thick-client GUI applications can be tamed with enterprise SSO (ESSO) solutions at the desktop. The things that always end up falling through the cracks are older applications. Things that are often the crown jewels of the business. Applications that are so old because they are so critical that no one can touch them without huge impact to the business. But the older technologies resist almost every attempt to bring them under control. Even ESSO, which is the catch-all for so many other laggards, can’t tame many of the odd green screens, complex multi-field authentications, or other odd things that some of these applications demand at the login event. When I’ve spoken to our SSO customers, they always seem happy with 70-80% adoption on their SSO projects. They know they will never get that last group until the applications change. But there doesn’t seem to be any compelling event for those applications to change. So SSO continues to seem hard, but we all know that’s not exactly true.
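
To picture why that last 20-30% stays stubborn, here is a toy sketch of the triage an SSO project effectively performs; the application attributes and category labels are my own illustrative assumptions, not any vendor’s taxonomy.

    # Toy triage of applications by the SSO approach that can plausibly cover them.
    # The attribute names and categories are illustrative assumptions.

    def sso_approach(app: dict) -> str:
        if app.get("auth") == "kerberos":
            return "native AD integration"
        if app.get("supports_saml"):
            return "federation (SAML / SSO portal)"
        if app.get("login_ui") == "standard form":
            return "enterprise SSO (credential replay at the desktop)"
        # Green screens, multi-field prompts, and other odd login events fall through.
        return "uncovered legacy application"

    apps = [
        {"name": "intranet portal", "auth": "kerberos"},
        {"name": "SaaS CRM", "supports_saml": True},
        {"name": "thick-client HR app", "login_ui": "standard form"},
        {"name": "mainframe order entry", "login_ui": "green screen, multi-field"},
    ]
    for app in apps:
        print(f"{app['name']} -> {sso_approach(app)}")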

a new SPML? a provisioning problem.

Mark Diodati of Gartner (that was a bit hard to type right the first time) has published the results of the SPML SIG held at #cat10. I think it captures the feeling of those present very well. At about the same time, the minutes of the first meeting of the SPML PSTC in a long while were published. It seems there’s a much different split there than there was at the SIG. The split is basically between folks who want to see a “clean start” with a version 3 and those who want to see version 2 revved so it’s more realistic. I’m on the latter side, and so are the folks at Quest that I’ve spoken to. In fact, both at Quest and at customers, everyone I’ve spoken to about this outside a tight circle of “identity gurus” has agreed that SPML would best serve the larger community as a means to have systems communicate. Anything beyond that is overkill, at least for now. If all the different solutions had a standard way to do CRUD operations between one another, that would go a long way toward solving many practical issues in heterogeneous IT environments.
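
To be concrete about the “systems communicating” scope I have in mind, here is a minimal sketch of the CRUD provisioning surface such a pared-down standard could define; the interface is my own illustration (method names borrowed loosely from SPML’s add/lookup/modify/delete operations), not the actual SPML schema.

    # Minimal sketch of a provisioning CRUD surface between systems.
    # Method names are loosely modeled on SPML's add/lookup/modify/delete
    # operations; the shapes here are illustrative, not the actual schema.
    from abc import ABC, abstractmethod

    class ProvisioningTarget(ABC):
        """What one system would expose so other systems can provision into it."""

        @abstractmethod
        def add(self, attributes: dict) -> str:
            """Create an account or object; return the target's identifier for it."""

        @abstractmethod
        def lookup(self, target_id: str) -> dict:
            """Read the object's current attributes."""

        @abstractmethod
        def modify(self, target_id: str, changes: dict) -> None:
            """Apply attribute changes to an existing object."""

        @abstractmethod
        def delete(self, target_id: str) -> None:
            """Remove the object from the target system."""

    # A joiner/mover/leaver flow then reduces to calls like:
    #   uid = target.add({"givenName": "Pat", "sn": "Jones", "dept": "Finance"})
    #   target.modify(uid, {"dept": "Accounting"})
    #   target.delete(uid)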

I’d like to get more involved and I’m working with Quest to see if that can happen. This is something I’d like to see done from start to end.


#eic10 part 2: lacking policy, lagging XACML, authZ not so externalized

I’m not sure why, but the theme for me at EIC10 was policy. It wasn’t that the sessions or discussions were intent on going there. If anything, it was quite the opposite. I sat in on one of the “pre-conference” sessions, titled “Moving beyond the Perimeter: Identity & Access Management for a Networked World”. That was what set the tone. I went in expecting a lot of discussion about how organizations could, should, and have been able to overcome the tricky policy barriers to open themselves up and manage access. The reality was that we spent a lot of the time discussing how to get over the challenges of making IAM work inside the perimeter so they could start thinking about the outside. For the few that had some established outside presence, with identities accessing other resources or outside identities accessing their own, my questions about policy and my challenges to explain the legal implications of those access points set them back on their heels. Later on, in a session titled “It has been Quiet around Federation. Is this a good Sign or a bad one?”, when we were asked what challenges our organizations had faced when trying to federate, I answered that we (Quest) had faced numerous legal challenges to getting federation done. Each time it has been a meeting with lawyers, and lawyers meeting with lawyers, and so on. The shocked looks from the general audience didn’t quite drown out the few nodding heads that clearly knew exactly what I meant. It shouldn’t surprise me that technology outstrips policy and that technologists don’t see the policy lagging behind until it’s too late, but somehow it always does.

Of course, technology is still my preoccupation, so I was equally into the technology of policy that seemed to pervade EIC10. XACML was everywhere. Or maybe it only seemed that way because I attended so many of Felix Gaehtgens’s sessions. However, there were a few stark contrasts that struck me. First, there were no fewer than five vendors on the floor offering XACML-based or XACML-compliant solutions for externalized authorization. Despite that, I didn’t see one keynote mention it, nor did any customer story talk about having it built into their architecture. Even the big vendors, when directly questioned about it, immediately submerged it into an acronym soup of SAML, claims, and other federation-related stuff. It seems like many are now using “federated” interchangeably with “externalized”, which is sensible on some level but loses some of the important distinctions between the two (e.g. trust is explicit with federation and implicit with externalization). By far my favorite externalized authorization moment was in a panel titled “How to make your Software Security Architecture Future-Proof”, when Felix asked Kim Cameron, who had just made his interstellar announcement, the following: “if the application has to have internal logic to handle claims, then the authorization has not been externalized, right?” Kim made no real answer. But I think Felix said what a lot of people were thinking. Claims are the bee’s knees, but WIF still embeds all the authorization logic right in the application itself.
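
Felix’s point is easier to see in a toy contrast. Below is a hedged sketch of the difference between authorization logic embedded in the application and authorization that has actually been externalized to a policy decision point; the PDP endpoint, the JSON request shape, and the attribute names are my own illustrative assumptions (a real XACML PDP would speak the XACML request/response format), not any product’s API.

    # Toy contrast between embedded and externalized authorization.
    # The PDP endpoint, request shape, and attribute names are illustrative assumptions.
    import json
    from urllib import request as urlrequest

    # Embedded: the application inspects claims itself and hard-codes the policy.
    def can_approve_embedded(claims: dict) -> bool:
        return "manager" in claims.get("roles", []) and claims.get("dept") == "finance"

    # Externalized: the application acts only as a policy enforcement point (PEP).
    # It sends subject/action/resource attributes to an external policy decision
    # point (PDP) and simply obeys the answer.
    def can_approve_externalized(claims: dict,
                                 pdp_url: str = "https://pdp.example.local/authz") -> bool:
        decision_request = {
            "subject": claims,
            "action": "approve",
            "resource": "purchase_order",
        }
        req = urlrequest.Request(
            pdp_url,
            data=json.dumps(decision_request).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urlrequest.urlopen(req) as resp:
            return json.load(resp).get("decision") == "Permit"

The “managers in finance can approve” policy then lives and changes at the PDP, outside the application’s release cycle, which is the distinction that kept getting blurred into the SAML-and-claims soup.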

This will be the last post on the conference. It was a real blast, and I got to meet in person some of the folks who have haunted my mind via Twitter for a long time. Good stuff.

SAML vs LDAP to the death?

April 8, 2010

…with tag team partners STS for SAML and the VDS (Virtual Directory Server) for LDAP?

So I’ve taken Jackson’s advice and have been reading Microsoft’s “Guide to Claims-Based Identity and Access Control”. While most of it has been things I’ve heard before, the formulation of the ideas the way Microsoft wants to present them to their favorite audience, developers, is very interesting.

The thing that caught my eye, sparked a whole lot of conversation and a few lightbulbs for me, and inspired this post was a quote very early on:

“ADFS has a rule engine that makes it easy to extract LDAP attributes from the user’s record in Active Directory and its cousin, Lightweight Directory Services. ADFS also allows you to add rules that include arbitrary SQL statements so that you can extract user data out of your own custom SQL database. You can extend ADFS to add other stores. This is useful because, in many companies, a user’s identity is often fragmented. ADFS hides this fragmentation. Your claims-based applications won’t break if you decide to move data around between stores.” (from page 6)

Described like this, the STS sounds a heck of a lot like a VDS. So I asked many of the Quest big brains what they thought of the quote and of the idea it sparked for me. I was quickly told that this was silly since the models for an STS and a VDS are so different. Some of their points were:

  • The STS is a push model where users show up at the application with claims ready, while the VDS is a pull model where the application needs to go get the information
  • The VDS approach is about applications using data from multiple sources without modifying the application, while the ADFS + WIF approach is about teaching the application to consume claims natively by modifying it
  • The STS and SAML approaches wrap the claims, the identity data, into the authentication operation, while the VDS approach simply exposes a service for the application to pull from during its own operations (a toy sketch contrasting the two models follows this list)
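
Here is that toy sketch of the push/pull difference, with invented names throughout (nothing here is the actual WIF or VDS API, and the pull example assumes the third-party ldap3 package plus an illustrative server and DN layout):

    # Toy contrast of the push and pull models. Names, server, and DN layout are
    # illustrative; the pull example assumes the third-party ldap3 package.
    import ldap3

    # Push (STS-style): the identity data arrives with the user, inside the issued token.
    def department_from_token(token_claims: dict) -> str:
        # The application trusts the token issuer and reads the claim it was handed.
        return token_claims["department"]

    # Pull (VDS/LDAP-style): the application, or a layer beneath it, fetches the
    # attribute from the (virtual) directory at the moment it needs it.
    def department_from_directory(username: str) -> str:
        server = ldap3.Server("ldaps://vds.example.local")
        with ldap3.Connection(server, auto_bind=True) as conn:
            conn.search(
                "ou=people,dc=example,dc=local",
                f"(uid={username})",
                attributes=["departmentNumber"],
            )
            return str(conn.entries[0].departmentNumber)

In the push model the application never talks to a directory at all; in the pull model the directory (or the VDS in front of several of them) is consulted live at the moment of use.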

Somewhere in the midst of this discussion, a big gear clicked into place. I saw something I bet many, many others have seen before – but it was new to me. Microsoft and Oracle were really going head to head in identity for applications. Yes, I know it’s hard to believe that Microsoft and Oracle would compete. But that does seem to be what’s happening. You see, the VDS had always been in this spot on my mental whiteboard between the applications and the multiple sources of identity data, as an abstraction layer. The STS was somewhere on that mental whiteboard, but it wasn’t there. Now I’d been clearly shown that it could be moved in front of the VDS, or even be moved to replace the VDS. Of course, much depends on the use cases. The STS can’t really do everything the VDS does and vice versa. But I think it’s fair to say that Oracle is betting on people like me, who see with an application architect’s eye and try to make the current generation of revenue-generating applications do their work better and faster. Microsoft is betting on its excellent developer community and credibility to propel the next generation of all applications into a claims-based, STS-dependent world.

That battle would seem to pit SAML and LDAP against each other, each with one of the largest tech giants in its corner. In reality, I doubt it will be anything so dramatic. But before this conversation, I didn’t even see the potential for that battle. It’s amazing how many latent hostilities to some approaches seem clear to me now. I don’t even think some of the people who were hostile realized why. But there are deep mechanisms at work in the respective communities involved that are forming opinions that will likely solidify into “Linux vs Windows Server” style opinion wars soon enough. Here I thought all this goodwill about interoperability in identity could last forever. Silly me.
