Friday, February 7, 2014

We’ve moved...

http://blog.ctp.com/category/identity-and-security/

Dear Readers and Followers,

We'll start this year with something unspectacular but nevertheless important - a new blog location.

Please redirect your feeds to the new and official Cambridge Technology Partners Identity and Security blog at http://blog.ctp.com/category/identity-and-security/ - some new posts are already waiting to be read...

We will leave all our old posts here, as some other sites link to them, but new posts will only appear at the new location.

Monday, July 1, 2013

SailPoint Partnership

Cambridge recently partnered with SailPoint. Cambridge has long been a leader in providing Identity and Access Management services and has maintained partnerships with NetIQ, Oracle and Microsoft for years. We give the highest importance to providing quality services to our clients, using the vendor products we’re partnered with. Therefore, the decision to start a partnership with a new vendor is taken very seriously and involves strategic planning and analysis. This ensures that the vendor is able to provide our clients a high standard of quality.
In the case of SailPoint, starting a partnership was an easy decision – they have a leading product in the Access Governance market. Established in 2005, the company has taken a completely fresh approach to Identity Management – through Access Governance. This is why it has managed to set itself apart from the rest so quickly: it has moved the focus of traditional Identity Management from an IT solution to a Business solution.
Oracle, NetIQ and Microsoft all have strong provisioning engines and provide a comprehensive set of functionalities. However, in all three cases, the Access Governance functionality is missing from their Identity Manager products. Oracle provides this by way of Oracle Identity Analytics, while NetIQ is a reseller of SailPoint IdentityIQ. In either case, it is necessary to integrate additional products into the existing environment to get similar functionality. SailPoint IdentityIQ, on the other hand, is a single product that provides Governance and Provisioning from a single interface.
What further sets SailPoint apart from its competitors is the intuitive user interface it provides, alongside a rich set of features and functionality. From the very little hard disk space and memory it needs to the ease of installation and patching, it delivers powerful results with minimal effort. Therefore, whether it is a large client with millions of identities (their largest client has over 1.7 million) or a small company with fewer than 1,000 users, the effort and costs associated with the implementation are low.
SailPoint's recommended strategy for any kind of Identity and Access Management project is to start with business-user-driven Identity Lifecycle Control, built on a model- and policy-based approach. That is, understand the current state and clean the data first, before moving on to any kind of provisioning functionality. To support this approach, IdentityIQ provides a risk-based model that can be used to assign a risk score to all roles and entitlements. Users with a high risk score can be flagged and their entitlements reviewed. If the review shows that some entitlements are not needed, they can be removed, bringing down the user's risk score. This functionality helps define a strong foundation for any Identity Management solution – clean data leads to easier management of users, an easier certification process, and easier maintenance and proof of compliance. Traditionally, Identity Management projects have been, to a certain degree, focused on provisioning without giving appropriate importance to the data cleansing process. In some implementations, this ends up leading to a state of Garbage In, Garbage Out.
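To make the idea a bit more tangible, here is a minimal sketch of how such a risk-based model could work in principle. This is not SailPoint code; all names and weights are invented for illustration only: each entitlement carries a risk weight, a user's score is the sum of the weights of his or her entitlements, and removing an unneeded entitlement lowers the score.

import java.util.*;

// Hypothetical illustration of a risk-based model: each entitlement carries a
// risk weight, and a user's risk score is the sum of the weights of the
// entitlements currently assigned to him or her.
public class RiskModelSketch {

    record Entitlement(String name, int riskWeight) {}

    static int riskScore(Collection<Entitlement> entitlements) {
        return entitlements.stream().mapToInt(Entitlement::riskWeight).sum();
    }

    public static void main(String[] args) {
        List<Entitlement> userEntitlements = new ArrayList<>(List.of(
                new Entitlement("AD: Domain Admins", 80),
                new Entitlement("SAP: FI posting", 40),
                new Entitlement("Intranet: reader", 5)));

        System.out.println("Initial risk score: " + riskScore(userEntitlements)); // 125

        // An access review finds "Domain Admins" is not needed; removing it lowers the score.
        userEntitlements.removeIf(e -> e.name().equals("AD: Domain Admins"));
        System.out.println("Risk score after review: " + riskScore(userEntitlements)); // 45
    }
}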
Whether you are completely new to this field and looking for a place to start, or already have a centralized provisioning system in place and are looking for a way to integrate Access Governance, SailPoint's approach helps make this process easier. The technical team responsible for the creation of IBM Tivoli and Sun Identity Manager also created SailPoint IdentityIQ, so the product has been built by experienced people, keeping in mind the issues and challenges faced in previous generations of Identity Management products.
In summary, they’ve built a single product that solves most requirements around Identity Management and Access Governance. It has been designed with a business user in mind, provides flexible provisioning services as well as a dynamic risk model that allows companies to get compliant, stay compliant and prove compliance. Given all the above-mentioned features, Cambridge has rightly partnered with SailPoint - our team will be happy to help you assess if SailPoint is right for you!

Tuesday, April 9, 2013

Challenges of an Identity Management Deployment


The concept of Identity Management has sounded logical, practical and useful from the start. What's not to appreciate? Users get one interface for self-service. Approvers get one interface to approve or decline all kinds of requests. The support team uses one interface to manage all user accounts on various systems, whether to unlock accounts, reset passwords or provide additional services. The whole user lifecycle can be managed from one system, roles assigned, permissions revoked. The time, energy and effort saved are huge. The list of good things Identity Management brings is probably endless.

Then why do most companies struggle to implement a viable Identity Management solution? Is it the lack of technology? Are the current vendors unable to provide a framework that addresses consumer needs? Is it difficult to gather the information and find common understanding across all departments and all application owners? Is it that converting business needs into a technical implementation is just too difficult? Or is it all of the above?

Let's take an example. Company XYZ wants to implement workflows to request access to applications. In theory, it sounds quite simple. Anyone who has tried to create the business rules that support all use cases knows: it's never that simple. How are users managed? Who approves on the first level? Does Department ABC follow the same rules when granting access as Department PQR? If not, do we set up different sets of rules for every department? How long will that implementation take? Or should we instead define the rules to be followed from now on, as part of this Identity Management implementation? If so, will end users like the change? Let's face it, although the advent of technology has made us more accepting of change than before, we still try to avoid it as much as possible. Things should stay as they were. Many end users would prefer to call the helpdesk and request rights instead of figuring out where to go, which link to click, which application to select from a list, and which specific rights and roles to choose and submit. Isn't picking up the phone and asking for the same thing easier? On the other hand, will application owners be OK with giving up control so that the Identity Management solution automatically creates accounts for users once a request is approved? Or would they prefer to know who is getting what access and decide when it is granted? Will they feel less essential to the functioning of the company if certain tasks can be automated?

Let's take another example. If Company XYZ has 50 departments with over 500 different job titles and/or job codes, how many Business Roles should be created to cover at least 80% of the employees' rights and permissions across the 100 applications they use most frequently? Of course, all vendors now provide features where, at the click of a button, these roles are generated automatically and the data is mined in the most practical way. The tool will even suggest new roles if new rights and permissions are detected. But how do you get accurate data from all the applications, and create these roles, when many definitions change regularly, the data is not of good quality and there is no pre-defined set of rules or logic that decides when and why a user should be given a set of permissions?
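Just to illustrate the basic idea behind such role mining (real products are of course far more sophisticated; all data and the 80% threshold below are invented for this toy example), one could group users by job code and propose, per group, a role containing every permission held by at least 80% of its members:

import java.util.*;
import java.util.stream.*;

// Toy role-mining sketch: propose, per job code, a role containing every
// permission held by at least 80% of the users with that job code.
public class RoleMiningSketch {

    record User(String id, String jobCode, Set<String> permissions) {}

    static Map<String, Set<String>> mineRoles(List<User> users, double threshold) {
        Map<String, List<User>> byJobCode =
                users.stream().collect(Collectors.groupingBy(User::jobCode));

        Map<String, Set<String>> proposedRoles = new HashMap<>();
        byJobCode.forEach((jobCode, group) -> {
            // Count how many users in this group hold each permission.
            Map<String, Long> counts = group.stream()
                    .flatMap(u -> u.permissions().stream())
                    .collect(Collectors.groupingBy(p -> p, Collectors.counting()));
            // Keep only the permissions held by at least `threshold` of the group.
            Set<String> rolePermissions = counts.entrySet().stream()
                    .filter(e -> e.getValue() >= Math.ceil(threshold * group.size()))
                    .map(Map.Entry::getKey)
                    .collect(Collectors.toSet());
            proposedRoles.put(jobCode, rolePermissions);
        });
        return proposedRoles;
    }

    public static void main(String[] args) {
        List<User> users = List.of(
                new User("u1", "HR-01", Set.of("HR-App", "Intranet")),
                new User("u2", "HR-01", Set.of("HR-App", "Intranet", "SAP-HR")),
                new User("u3", "HR-01", Set.of("HR-App", "Intranet")),
                new User("u4", "FIN-02", Set.of("SAP-FI", "Intranet")));

        // Proposes e.g. {HR-01=[HR-App, Intranet], FIN-02=[SAP-FI, Intranet]}
        System.out.println(mineRoles(users, 0.8));
    }
}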

Of course, there are hundreds of successful IdM implementations; there is no denying that. But can those implementations truly be called successful? More importantly, do they continue to be successful? Is manual intervention still needed at several points during a user's lifecycle? Are they able to keep up with growing company needs?

Since there is a lot of talk about the cloud these days, it's very likely that many companies will want to use IDaaS (Identity as a Service). However, moving to the cloud still does not solve the basic problems faced by any Identity Management implementation. What are your thoughts? Is implementing an Identity Management solution challenging for you?

Wednesday, November 14, 2012

Identities, the cloud, business needs and a modern approach to handle them

Microsoft, Amazon, Google - everyone is talking about the cloud and its fancy new features.
And even though this opens up new opportunities and possibilities, it is actually not new at all.
In the past, enterprises hosted services in their DMZ, offering them to partners, customers and employees.
Your partners never worried about how you ran the service; to them it was all in “the cloud”. We simply didn't call it that back then, and this “cloud” was part of your extranet.
Today we have more sophisticated solutions, massive computing power, larger storage and faster internet connections, but one fundamental problem stays the same: who is accessing the service and what is he or she allowed to do?

A few years ago the answer was simple: every user that needed access to a specific service got his or her own user account, usually in an Active Directory.
This approach seems quite charming: you always know who is accessing your services and data, you know the accounts, you maintain them.
But on the other hand, most of us have also experienced the downside of this concept: large numbers of external accounts consuming a considerable amount of IT resources and creating measurable overhead in the communication between third parties, your business, the IT department, and so on.

This was already a less than ideal solution. And now we add the whole cloud scenario on top.
You want to use cloud services like Office 365 in your company: how can you give your internal users seamless access to these applications, without having to tell them that they now need a second account, a LiveID, to access these services?
You develop your own solution in the cloud: how can you handle all those incoming identities?
Or you simply want to keep offering your service from your own DMZ, but with a modern approach to handling identities?

What if we used all the identities that already exist out there? Why not open your solution to every user that has a LiveID, Google, LinkedIn or Facebook account?
Or, in a more business-critical area, why not use the identity records of your partners? They most likely already operate an Active Directory with all their user accounts. Wouldn't it be great to use it and give those partner users a Single Sign-On experience on top?

This is where the whole topic of Federation comes into play. Federation is what brings services and identities together.
Instead of maintaining your own user database, your service relies on a Federation Server to provide it with the necessary user information.

So how would this look? Let's assume you operate your SharePoint collaboration platform within your DMZ. Now you want to open it to your partners, so they can work together with you on various projects.
Instead of telling your SharePoint to refer to your own AD for user information, we would implement a Federation service. This Federation service accepts the identity request from SharePoint and forwards it to a trusted user database for authentication. Usually this is the existing Active Directory. As soon as the user is authenticated, his or her credentials may be enriched with additional information, so-called claims, which the Federation service sends back to SharePoint.

Now we add a second Federation service. This one, however, is operated by our partner, redirecting user requests to whatever user database they might use.
So instead of maintaining external partner identities, and therefore requiring our partners to remember another set of username and password, SharePoint sends the request back to the partner's own AD. By using technologies like Kerberos, this even creates a Single Sign-On experience for the user, who is seamlessly authenticated by his or her own AD.

But this would mean we have to constantly change our SharePoint configuration to add or remove Federation services. To avoid this, you can use a so-called Federation Proxy.
Your application simply needs to trust this Federation Proxy. The proxy then routes incoming requests to the specific Federation services behind it.
In more advanced scenarios you can also implement what is called claim transformation: taking incoming claims and modifying them so you always have the right user information for your application.
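As a rough illustration of what claim transformation means (the claim names and rules below are invented for this example; real federation products such as ADFS use their own rule languages), the proxy takes the incoming claims and rewrites them into whatever the application behind it expects:

import java.util.*;

// Simplified illustration of claim transformation at a federation proxy:
// incoming claims from different identity providers are rewritten into the
// claim types the application behind the proxy expects.
public class ClaimTransformationSketch {

    static Map<String, String> transform(Map<String, String> incoming) {
        Map<String, String> outgoing = new HashMap<>();

        // Normalize the identifier claim: partner IdPs send "upn",
        // social IdPs send "emailaddress"; the application only knows "nameid".
        outgoing.put("nameid",
                incoming.getOrDefault("upn", incoming.get("emailaddress")));

        // Map the partner's group claim to the application's role claim.
        if ("ProjectX-Members".equals(incoming.get("group"))) {
            outgoing.put("role", "SharePoint-Contributor");
        } else {
            outgoing.put("role", "SharePoint-Visitor");
        }
        return outgoing;
    }

    public static void main(String[] args) {
        Map<String, String> partnerClaims =
                Map.of("upn", "alice@partner.example", "group", "ProjectX-Members");
        // Prints {nameid=alice@partner.example, role=SharePoint-Contributor}
        System.out.println(transform(partnerClaims));
    }
}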

What about the rest of your visitors who don't belong to any of your partners? Well, today almost everyone has a Facebook, Googlemail or Live account. So why don't we use them?
This is possible by replacing the typical Federation Proxy with Microsoft's cloud service called Windows Azure Access Control Service, or simply ACS.
What is ACS? It is essentially a Federation Proxy in the cloud that allows your service to use multiple identity providers. You might use ACS to redirect users to your very own on-premise ADFS 2.0 service, or to Facebook or Googlemail.

As this service is in the cloud and doesn't store any sensitive personal information, you can use it from anywhere without worrying, whether your application is hosted on Amazon's cloud, or you want to attach your external SharePoint or even internal applications.

You don't want to host your own services? No problem, federation supports you even when you only want to consume cloud services.
Take Office 365, for example. With Windows 8, Office 2013, SharePoint 2013, etc. on the horizon, you might want to benefit from their cloud sharing features.
But as you might know, they require MS LiveIDs for authentication. Instead of giving your employees another set of credentials they have to remember, and making it necessary to switch between them, use Federation. It will seamlessly authenticate your users against your AD and pass the claims on to Office 365, giving your users a perfect Single Sign-On experience.

These are just a few examples of how you can take advantage of these new technologies and services without sacrificing either security or user experience.

And since we at Cambridge are not only talking about it, we created our very own "federation playground". This solution is hosted 100% in the cloud and features domain controllers, federation services, a collaboration platform, identity self-service, and more.

The picture below will give you a rough overview of this extensive setup:
 
Do you have a Googlemail account or an MS LiveID? Then why don't you try it yourself?

Go to http://try.solutions-for-clouds.ch and have a look at our cloud-hosted collaboration platform featuring authentication with public identities.

If you are interested in hearing more about this, or if you already have some specific questions, please don't hesitate to contact us!

Friday, August 24, 2012

Conferences and Events

...spread the word! Here are some conferences and events we have participated in, or will participate in, over the upcoming weeks, presenting two of our recent identity and security management projects...

Swiss eHealth Summit 2012: Identity Management @ Inselspital Bern

Monday, July 23, 2012

Oracle Identity Manager & Analytics Integration


It was so cool: I logged into OIA, clicked Administration -> Configuration -> Import/Export and scheduled a job of type Export Roles. A few minutes later they were all there: the roles we created in OIA now existed in OIM!
Granted, both OIM and OIA are products that are still “work in progress” and need enhancements to provide complete functionality. Regardless, it was still good to complete that last step of the integration process and test a full user lifecycle:
Users are created or updated in OIM on a regular basis; these users are then imported to OIA. Based on pre-defined rules created in OIA, the users get specific roles assigned to them. This new assignment is then read by OIM which acts as a provisioning server and grants/revokes access based on the policies associated with these roles.
When it comes to using OIA-OIM as a tool to manage users via Roles, not simply for auditing and compliance purposes, three main issues stood out to me. The first is role hierarchy. When you export roles from OIA into OIM, the parent-child relationship between roles is lost. In other words, when we look at all the roles in OIM, they exist as a flat structure, and assigning a role does not provide any automatic inheritance.
The second issue that makes the integration less clean is that in OIA, a Policy can only have a single resource associated with it. In OIM we can associate as many resources as we please with a single access policy. Thus, during my 'Export Roles', I ended up with 3 separate policies for the same role if it was associated with 3 different resources. Had this policy been created in OIM directly, we could have had a single policy.
The third issue did have a simple workaround. On the same page where I clicked ‘Export Roles’, the link below says ‘Export Policies’.

Policies, of course, define a role. Thus, in OIA we need to create Policies separately and then associate them with a Role. This 'Export Policies' feature, however, does not work. Clicking the link gives the message below. This message is incorrect, since provisioning servers are available.


The only way to export policies associated with a role is to 'Export Roles'. Along with the roles, all associated policies are also transferred to OIM.
Overall, from what I understand, future releases of OIA and OIM will share the same database for Roles, so explicit integration may not even be necessary.
For our current OIA-OIM integration, the technical documentation was quite detailed and helpful. (http://docs.oracle.com/cd/E24179_01/doc.1111/e23377/toc.htm)
The one issue we had was integrating resources of type 'Generic Technology Connector' (GTC) in OIM with OIA. The documentation clearly says that the 'AccountName' and 'ITResource' properties should be set to true on the parent form. The problem was that with GTCs, we didn't have a field for which the 'ITResource' property could be set, as in the case of Active Directory, eDirectory, etc. What we had to do was create a new field/column of type 'ITResourceLookupField' and set its value to the name of the GTC. Now the 'ITResource' property could be set to true for this field/column. Once this new form with the additional column was made Active, the integration for GTCs between OIA and OIM worked as expected.
One issue for which we still don't have a clear resolution is the OIA rules used to automatically assign users to roles. Manually creating over 100 rules (if 100 roles exist) is quite tedious, not to mention that managing such a large set of rules is practically impossible. Even if the logic is quite simple (for example, if the field DepartmentName has the value 'XYZ', then put the user in Role 'XYZ'), there is simply no way to create one single rule that can dynamically check values and assign roles. Hopefully future releases will address this problem.
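To show the kind of single dynamic rule we were missing, here is a small sketch. This is explicitly not OIA functionality and not OIM API code, just an illustration of the logic "role name = value of the DepartmentName attribute", with all names hypothetical:

import java.util.*;

// Illustration of the single dynamic rule we were missing: instead of one
// static rule per role, derive the role name directly from a user attribute.
public class DynamicRoleRuleSketch {

    // One rule instead of 100: the role name equals the DepartmentName attribute value.
    static Optional<String> roleFor(Map<String, String> userAttributes, Set<String> existingRoles) {
        String department = userAttributes.get("DepartmentName");
        return (department != null && existingRoles.contains(department))
                ? Optional.of(department)
                : Optional.empty();
    }

    public static void main(String[] args) {
        Set<String> roles = Set.of("Finance", "HR", "Engineering");
        Map<String, String> user = Map.of("UserLogin", "jdoe", "DepartmentName", "Finance");

        // Prints: Assign role 'Finance' to jdoe
        roleFor(user, roles).ifPresent(r ->
                System.out.println("Assign role '" + r + "' to " + user.get("UserLogin")));
    }
}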

Wednesday, December 21, 2011

Oracle Identity Manager LDAP and Smart Card authentication against Active Directory

Just recently we had the requirement to configure Oracle Identity Manager (OIM) 11g for LDAP and Smart Card authentication against Active Directory.

In this article I will share the configuration steps to get this up and running.

Step 1: Configure LDAP Authentication against Active Directory
Step 2: Configure SmartCard Authentication, Certs are stored in Active Directory

We take the following as given:
  • OIM 11g with Weblogic 10.3.x configured and running
  • Working LDAP Server (here Active Directory) available
  • SSL for OIM / Weblogic set up
  • Users from AD exist in OIM (e.g. through trusted source reconciliation)
  • Users in OIM have scrambled passwords
Configure Weblogic (and OIM) LDAP authentication against Active Directory

Requirements (on AD side)
  • LDAP connection user with the necessary rights in AD to do subtree searches on your users and groups containers, i.e. within the scope we configure below
  • For LDAP authentication in OIM to work, you need an AD group called "oimusers", of which all users who should be able to log in to OIM must be members. The group needs to be named exactly "oimusers" (a small sketch to verify this over LDAP follows this list).
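Before touching Weblogic, it can be worth verifying these AD prerequisites directly over LDAP. The following is a minimal sketch using plain JNDI; host names, credentials, DNs and the test account are placeholders you need to adapt. It binds with the LDAP connection user and checks whether a given user is a member of "oimusers":

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

// Minimal JNDI sketch: bind with the LDAP connection user and check that a
// given account is a member of the "oimusers" group. All values are placeholders.
public class OimUsersCheck {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ad.example.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "CN=ldap-connect,OU=Service,DC=example,DC=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");

        InitialDirContext ctx = new InitialDirContext(env);

        // Search for the user and ask for its memberOf attribute.
        SearchControls sc = new SearchControls();
        sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
        sc.setReturningAttributes(new String[] {"memberOf"});

        NamingEnumeration<SearchResult> results = ctx.search(
                "OU=Users,DC=example,DC=com",
                "(sAMAccountName={0})", new Object[] {"jdoe"}, sc);

        boolean inOimUsers = false;
        while (results.hasMore()) {
            SearchResult r = results.next();
            javax.naming.directory.Attribute memberOf = r.getAttributes().get("memberOf");
            for (int i = 0; memberOf != null && i < memberOf.size(); i++) {
                if (memberOf.get(i).toString().toLowerCase().contains("cn=oimusers")) {
                    inOimUsers = true;
                }
            }
        }
        System.out.println("member of oimusers: " + inOimUsers);
        ctx.close();
    }
}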
Add an additional Authentication Provider
After a standard OIM / Weblogic Installation you should have something like this


Now we add an additional Authentication Provider
Name: ADAuthenticationProvider
Type: ActiveDirectoryAuthenticator
Control Flag: SUFFICIENT



 
 
Configure Provider Specific Options
LDAP Connection Information
(In a production environment you would use SSL enabled LDAP, but this configuration is not part of this article) 

Principal: Your LDAP connection user 




User scope configuration
User Base DN: Container where your users are found
Rest of the parameters stay default  


Group scope configuration
Group Base DN: Container where your groups are found
Your "oimusers" group must be found in this container or in the subtree
Rest of the parameters stay default



Optionally, you can also set the Weblogic Server debugging options for troubleshooting.
If needed, do the same for oim_server1 as well.


Restart your AdminServer.
To confirm the debug option is properly set, you should see <Debug> <SecurityAtn> entries in your logfile.

The logfile can be found in your AdminServer log directory; in our Windows installation this is:
C:\Oracle\Middleware\user_projects\domains\OIM_DEV\servers\AdminServer\logs\AdminServer.log

Log in to your Weblogic console, navigate to "myrealm" and check the Users and Groups tabs. You should now see all your users from Active Directory within the subtree of the configured scope.



Check the Groups tab and find "oimusers" (this is a regular AD group; OIM will only accept authentication for users who are members of it).
Notice that the embedded LDAP (DefaultAuthenticator) also has an "oimusers" group.



With that configuration step complete, you should already be able to log in to OIM with one of your Active Directory users.

Please note:
In our environment, Active Directory is configured as a trusted source for OIM, so all the users we log in with already exist as accounts in OIM, but with unknown random passwords.
For simple LDAP login tests you can just manually create a corresponding account in OIM and give it some password (preferably not the same as the LDAP password, otherwise you cannot properly test).



Configure Weblogic (and OIM) Smart Card authentication against Active Directory

Add and configure an additional Authentication Provider.
Name: LDAPX509IdentityAsserter
Type: LDAPX509IdentityAsserter
Active Types Chosen: X.509 (already the default)

Provider Specific Parameters
Host: Your LDAP Host, same as used above
Principal: Your LDAP connection user, same as used above
User Filter Attributes: The attribute in Active Directory you want to map the Smart Card attribute to. In our case we map the Smart Card Subject CN to the userPrincipalName attribute in AD (see the sketch after this parameter list).
Certificate Attribute: userCertificate. Please note that the default value was userCertificate;binary. This didn't work, although our certs are stored in binary form in AD.

Certificate Mapping: The container where your AD users are located; if in doubt, use the same one as in the LDAP configuration above.
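To illustrate the mapping the identity asserter performs with these settings, here is a rough standalone sketch (this is not Weblogic internals; all hosts, DNs and file names are placeholders): it extracts the CN from the Smart Card certificate's subject and looks the user up in AD by userPrincipalName.

import java.io.FileInputStream;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import javax.naming.ldap.LdapName;
import javax.naming.ldap.Rdn;

// Rough sketch of the mapping the identity asserter performs: take the CN from
// the Smart Card certificate's subject and find the matching AD account via
// userPrincipalName. Hosts, DNs and file names are placeholders.
public class CertToAdUserSketch {
    public static void main(String[] args) throws Exception {
        // Load the client certificate (in Weblogic this comes from the SSL handshake).
        X509Certificate cert = (X509Certificate) CertificateFactory.getInstance("X.509")
                .generateCertificate(new FileInputStream("smartcard-cert.cer"));

        // Extract the CN from the subject DN, e.g. "CN=jdoe@example.com, O=Example".
        String cn = null;
        for (Rdn rdn : new LdapName(cert.getSubjectX500Principal().getName()).getRdns()) {
            if (rdn.getType().equalsIgnoreCase("CN")) {
                cn = rdn.getValue().toString();
            }
        }

        // Look the user up in AD by userPrincipalName.
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ad.example.com:389");
        env.put(Context.SECURITY_PRINCIPAL, "CN=ldap-connect,OU=Service,DC=example,DC=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        InitialDirContext ctx = new InitialDirContext(env);

        SearchControls sc = new SearchControls();
        sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
        NamingEnumeration<SearchResult> results = ctx.search(
                "OU=Users,DC=example,DC=com",
                "(userPrincipalName={0})", new Object[] {cn}, sc);

        System.out.println(results.hasMore()
                ? "Mapped certificate CN '" + cn + "' to " + results.next().getNameInNamespace()
                : "No AD user found for CN '" + cn + "'");
        ctx.close();
    }
}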

Now that we have configured the new Authentication Providers, we need to put them in the right order.


For Weblogic to request the client certificate, in our use case from the Smart Card, the SSL advanced option "Two Way Client Cert Behavior" needs to be modified for each managed server on which you want to be able to use your Smart Card.
You can set it to either "... Requested But Not Enforced" or "... Requested And Enforced".
See here for more details.


Summary
  • We created one WLS Authentication Provider for LDAP authentication
  • We created one WLS Authentication Provider to act as an Identity Asserter for our Smart Card certificates
  • We configured both providers to "talk" to Active Directory using LDAP to find the corresponding users and to do the authentication based on certificates from the Smart Card
  • Additional steps would be to put a proper authorization setup for OIM and Weblogic in place
Please give feedback if you find this useful, or if something is missing or incorrect.