Tuesday, April 28, 2009

NIST guidelines on Enterprise Password Management

GCN writer William Jackson reports on NIST Special Publication 800-118, which offers guidelines for password management in the enterprise. The article can be accessed here.

Let's take a peek at the draft of the special publication.

The publication defines password management as “the process of defining, implementing, and maintaining password policies throughout an enterprise.”

NIST makes four broad recommendations:
1. Create a password policy that specifies all of the organization’s password management-related requirements, including FISMA and other regulatory requirements. “An organization’s password policy should be flexible enough to accommodate the differing password capabilities provided by various operating systems and applications.”
2. Protect passwords from attacks that capture passwords. “Users should be made aware of threats against their knowledge and behavior, such as phishing attacks, keystroke loggers and shoulder surfing, and how they should respond when they suspect an attack may be occurring. Organizations also need to ensure that they verify the identity of users who are attempting to recover a forgotten password or reset a password, so that a password is not inadvertently provided to an attacker.”
3. Configure password mechanisms to reduce the likelihood of successful password guessing and cracking. “Password guessing attacks can be mitigated rather easily by ensuring that passwords are sufficiently complex and by limiting the frequency of authentication attempts, such as having a brief delay after each failed authentication attempt or locking out an account after many consecutive failed attempts. Password-cracking attacks can be mitigated by using strong passwords, choosing strong cryptographic algorithms and implementations for password hashing, and protecting the confidentiality of password hashes. Changing passwords periodically also slightly reduces the risk posed by cracking.” (A sketch of the hashing part in code follows this list.)
4. Determine requirements for password expiration based on balancing security needs and usability. Regularly changing passwords “is beneficial in some cases but ineffective in others, such as when the attacker can compromise the new password through the same keylogger that was used to capture the old password. Password expiration is also a source of frustration to users, who are often required to create and remember new passwords every few months for dozens of accounts, and thus tend to choose weak passwords and use the same few passwords for many accounts.”
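
To make the hashing recommendation in item 3 concrete, here is a minimal Java sketch of salted, iterated password hashing using the JDK's built-in PBKDF2 support. The class name, iteration count, and key length are illustrative choices of mine, not prescriptions from the NIST draft.

import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class PasswordHasher {
    private static final int ITERATIONS = 10000; // a high iteration count slows offline cracking
    private static final int KEY_LENGTH = 256;   // bits
    private static final SecureRandom RANDOM = new SecureRandom();

    // Returns "salt:hash", both Base64-encoded, for storage in the password store.
    static String hash(char[] password) throws Exception {
        byte[] salt = new byte[16];
        RANDOM.nextBytes(salt); // a per-user random salt defeats precomputed (rainbow table) attacks
        SecretKeyFactory skf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        byte[] hash = skf.generateSecret(
                new PBEKeySpec(password, salt, ITERATIONS, KEY_LENGTH)).getEncoded();
        return Base64.getEncoder().encodeToString(salt) + ":"
                + Base64.getEncoder().encodeToString(hash);
    }
}

Verification would recompute the hash with the stored salt and compare the two values in constant time (for example, with MessageDigest.isEqual), while the login path adds the delay or lockout the draft describes.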

These are practical guidelines that will help enterprises deal with issues surrounding passwords. Alternative mechanisms, such as smart cards augmented by knowledge-based authentication, probably need to be explored; there is no substitute for strong PKI. The publication airs the same sentiment: "Therefore, organizations should make long-term plans for replacing password-based authentication with stronger forms of authentication for resources with higher security needs."

Wednesday, April 22, 2009

Red Hat CTO Brian Stevens on Cloud Computing

The Laurel Group has released an excellent white paper on cloud computing that includes a Q&A session with Brian Stevens, Red Hat CTO. The paper can be accessed here. It also features Q&A sessions with visionaries and thought leaders from companies such as IBM, Cisco, and Citrix, and is well worth a read.

Brian talks about Cloud 2.0, and I agree with him: research and standards around security are needed to usher in an era of Cloud 2.0. Brian also makes the important point that SLA language needs to be updated to cover the value of data rather than just the value of the service (uptime, response times, etc.). In my opinion, the SLA should also incorporate encryption policies for data, as data security is going to be of the utmost importance in public/outsourced cloud environments.

Currently, the cloud is most relevant as private clouds and as outsourced IT for SMEs, which do not have the capital to host their own compute infrastructure.

Rather than dismissing the cloud as mere hype, I think it is prudent to watch the developments and adopt it over time. Incorporating private clouds is certainly going to benefit companies, irrespective of size.

Let's see how long it takes to reach ubiquitous 'IT as a service'; it is probably only a few years away. Use cases such as Eli Lilly's use of AWS to get results for $89 (a private data centre would have cost a billion) cannot be ignored. :)

Let's look at the current state of security in the cloud. Amazon CTO Werner Vogels talks about three-tiered security in AWS (physical, operational, and programmatic). Only a select set of Amazon employees have access to the data centres and infrastructure.

Do not forget to take a peek at the Red Hat Open Source Cloud Computing Forum.

Sunday, April 19, 2009

Whitehurst talks about Open Source Government

Jim Whitehurst's opinion on an Open Source Government that is transparent:
Red Hat is excited that the Obama administration recognizes the value of open source beyond software. Open source principles are changing how we learn, how we share information, how developers create, and how companies do business. Now it has the opportunity to change our government.


Read Jim's post here.

Thursday, April 16, 2009

Is PCI-DSS the panacea for Data Protection woes?

Looking at the battering that PCI-DSS took at a recent US congressional hearing, one might assume that PCI-DSS is just not sufficient for protecting customer data. The congressional hearing is discussed here.

The question should not be WHETHER it is sufficient for protecting customer data; the real issue is whether there are any other efforts in the industry to define something along the lines of PCI.

PCI is the first standard drawn up by a council that includes the banks and the credit card companies, and it is a strict requirement for any entity processing credit card transactions at large scale. The standard lays down a set of rules and requires the expertise of security auditors to evaluate an entity's compliance.

Again, the quality of the auditors is critical to the success of the standard. The standard needs further work to close loopholes and find opportunities for improvement, based on the real-world experience from the credit card breaches that have happened since it was introduced.

There is no second chance for any vendor who loses customer data. It is not just reputation that is at stake; it costs MONEY. :(

"Advances in Browser Security" Presentation

As an elected member of the Oasis IDTrust Steering Committee, I had an opportunity to participate in the NIST IDTrust 2009 symposium held at NIST.

I moderated a special session on "Browser Security" where the speakers included Prof. Chadwick from the University of Kent.

The complete program is here.

My presentation slides are here.

Enjoy.

Presentation Notes

* The CA goes through an extensive review process before issuing an EV certificate, and has to go through an audit process annually.
* Firefox 2 and Opera display a yellow bar for HTTPS. Firefox 3 dropped the yellow bar; the user has to use the Tools menu and Page Info to get information about the certificate.
* Google Chrome uses a secure architecture that separates the web domain from the user domain. This separation of domains addresses about 70% of web vulnerabilities; the remaining 30% are not under the browser's control.
* Private browsing has long been one of the features most sought after by users. Apple Safari has offered private browsing for a long time.
* Many plugins operate with root privileges, so it is important to use trustworthy plugins.

Tuesday, April 14, 2009

InternetEvolution: Google losing money on YouTube

An interesting analysis on the Internet Evolution site shows how Google is losing money on YouTube on a daily basis. The information is here.

This analysis looks similar to eBay's acquisition of Skype gone wrong.

Of course, we all use YouTube, Skype, Facebook, etc.

But sometimes the companies that you are planning to acquire may be overvalued. :)

Read here.

Friday, April 10, 2009

Bruce Schneier on Cloud Security

In this interview, Bruce Schneier terms the cloud marketing hype and cautions companies to watch out.

http://www.schneier.com/news-083.html


Anil's opinion is:

The cloud has some interesting uses, mainly for data-intensive verticals such as the drug industry. The following article (requires free registration) in Information Security Magazine talks of one such use case, where a drug researcher at Eli Lilly got his research done for a sum of $89 using Amazon EC2; his own private data centre would have cost a billion and taken months to set up.


http://searchsecurity.techtarget.com/magazineFeature/0,296894,sid14_gci1349671,00.html


In my opinion, the industry needs to address security in the cloud (via groups such as the Cloud Security Alliance). We cannot just write off the cloud as hype given useful use cases such as the Eli Lilly one.

Since sensitive data is going to flow around unknown corners of the cloud, it is imperative that encryption (and hence key management) become a hot topic in the cloud. Apart from that, establishing trust models is necessary to gain confidence. You may trust the prominent cloud vendors, but what about the entities those vendors rely on to provide their services?
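
As a concrete illustration of what client-side protection could look like, here is a minimal Java sketch that encrypts data with AES-GCM before it ever leaves for the cloud. This is my own simplified example, not any vendor's API; in practice the key would be generated and held in an enterprise key management system.

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class ClientSideEncryption {
    public static void main(String[] args) throws Exception {
        // Hypothetical setup: the key is created and kept on the customer side;
        // a real deployment would obtain it from a key management server.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();

        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv); // a fresh IV for every message

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("sensitive research data".getBytes("UTF-8"));

        // Only the IV and ciphertext travel to the cloud; the plaintext and
        // the key never leave the client.
        System.out.println(iv.length + "-byte IV, " + ciphertext.length + "-byte ciphertext");
    }
}

Under this model, the cloud provider stores only ciphertext, so the trust question shifts from the provider's storage to the customer's key management.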



Sunil Madhu, Chief Security Architect, Cisco Policy Unit has the following response to Bruce's interview:
"I agree with Bruce that the word "cloud" is a buzzword. Essentially the cloud is nothing more than a virtual, dynamic -- what I have termed, "commoditized" -- data-center. However let's not confuse the cloud with the traditional data-center. The cloud has aspects of the traditional data-center and dare I say it -- the mainframe computing environment.

Elastic compute/storage capacity, dynamic machine on-boarding/off-boarding, and template-based machine composition are all features specific to the cloud environment and not the traditional data-center. With the elastic compute/storage capacity come additional savings through new subscription/licensing models and pay-as-you-go computing.

Think about this: if you run a service in the traditional data-center, you are forced to buy/lease sufficient infrastructure to meet your peak load scenarios -- such as during the start of the day, intra-day or during a fail-over scenario. You would have to plan for this capacity in advance and spend $$$ on redundant infrastructure accordingly. With the cloud, your compute/storage capacity is elastic -- so you don't have to buy/lease the infrastructure you don't need since the environment itself will expand and contract to meet your load demands. This is a feature that the mainframe-folks will be quite familiar with. Your compute resources go to the parts of your applications and services that need that capacity, on-demand, but at a lower TCO than the mainframe environment.

As for the whole debate about security -- common sense should dictate that just because you are moving from one type of data-center to the next does not mean that you should abandon the security best-practices developed over the last decade. The cloud does pose some new challenges, but then innovation always comes to the rescue. Good architecture and design teaches you to look at attack-vectors as a way of constraining aspects of the design. I have heard the argument that "...if you use a proprietary program or somebody else's web server, you're defenseless..." all too often before. In the past 10 years, how many operating systems did you write yourself? How many IT/IS solutions did you opt to buy COTS vs. build yourself? If your own developers were to build your very own OS and your web-server, does that make your software more trust-worthy than something off the shelf? Ever heard of the term "back-door"? This is fear-mongering and I don't at all agree with it. A key take-away from the open-source community is that transparent software can be made secure and trustworthy more easily than opaque, proprietary software.

Not all applications are suited to the dynamics of the cloud, and not all cloud providers in themselves are suitable to help you meet a 5-nines SLA, although most cloud providers promise at least 4-nines of availability. As you re-engineer/tweak your applications and services to migrate them to the cloud entirely or opt to utilize a hybrid model, make sure that you are following the same security best-practices you would normally. There are plenty of reputable service providers for the cloud, some with innovative solutions. Indeed, the security vendors of old are moving their wares to support the cloud, albeit slowly.

It is a brave new world, but one that will emerge out of the hype-cycle as the operational model of choice for today's economy and tomorrow's services, IMO.
"

Yes, it is a brave new world.

Wednesday, April 8, 2009

JBossXACML v2.0.3 Released

Since we had a very successful interoperability experience as part of the Oasis-HITSP Technology Demonstration at HIMSS2009 in Chicago, we are ready to roll out the next version of JBossXACML.

JBossXACML v2.0.3 should be available from here.


Release Notes:
==========================
Release Notes for JBoss Security and Identity Management
Includes versions: JBossXACML_2.0.3.alpha JBossXACML_2.0.3.CR1 JBossXACML_2.0.3.CR2 JBossXACML_2.0.3.CR3 JBossXACML_2.0.3.CR4 JBossXACML_2.0.3 JBossXACML_2.0.3.CR5

** Sub-task
* [ SECURITY-390 ] JBossXACML: ResourceLocator
* [ SECURITY-396 ] Rule:: NPE if description of a rule is empty
* [ SECURITY-400 ] XACML Conformance Tests: Mandatory - attribute references, functions, combination algos
* [ SECURITY-401 ] XACML Conformance Tests: Mandatory - schema components

** Feature Request
* [ SECURITY-257 ] AttributeDesignator should throw RuntimeException when a particular attribute that was expected is missing
* [ SECURITY-275 ] JBossSAMLRequest: buildRequest method
* [ SECURITY-382 ] JBossPDP ctr to take Configuration Metadata also
* [ SECURITY-388 ] JBossXACML: AttributeLocator
* [ SECURITY-407 ] XACML: Configuration should allow specifying just the locators (and no policies)

** Bug
* [ SECURITY-206 ] Attribute type not set in constructor
* [ SECURITY-295 ] NPE in JBossPDP -> createValidatingUnMarshaller when schema is missing
* [ SECURITY-351 ] JBossResponseContext->getResult has missing values
* [ SECURITY-391 ] JBossXACML: PDP construction should be one time
* [ SECURITY-394 ] FunctionBase: bag-size throws an IllegalArgumentException
* [ SECURITY-395 ] AbstractPolicy: Empty Description element throws NPE
* [ SECURITY-397 ] XACML: HigherOrderFunction checkInputs needs to relax type checking on evaluations
* [ SECURITY-399 ] XACML: Apply->evaluate method tries to encode an attributeValue that can be a bag
* [ SECURITY-403 ] XACML: Resource can have multiple attributes with resource-id
* [ SECURITY-405 ] XACML:: TimeAttribute computes GMT miliseconds incorrectly when the date is 1 day after Jan 1, 1970

** Task
* [ SECURITY-335 ] Sync up sunxacml bug fixes
* [ SECURITY-337 ] Validate the Oasis XACMLv2 conformance tests
* [ SECURITY-359 ] Retire jboss-xacml-saml module
* [ SECURITY-360 ] Assembly for jbossxacml
* [ SECURITY-409 ] Release JBossXACML 2.0.3
==============================================================

What is new?
* Some performance improvements in the PDP evaluation. Previously we were creating a PDP per evaluation; now we instantiate a PDP once and reuse it for each evaluation. (Call this an oversight. We are human!) A usage sketch follows this list.
* You can specify just the locators (policy, attribute, or resource) in the configuration file, without specifying the policies/policy sets. This is useful when you write a locator that fetches a policy or attribute from a different location.
* The Oasis v2 Conformance Tests are now part of the JBossXACML test suite. So every release will ensure that we have conformance.
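
For those upgrading, here is a rough sketch of the reuse pattern from the first bullet: build the PDP once from the configuration and share it across evaluations. I am writing the class and method names (JBossPDP, RequestContext, ResponseContext) and the config file name from memory, so treat them as assumptions and check the user guide for the exact API.

import org.jboss.security.xacml.core.JBossPDP;
import org.jboss.security.xacml.interfaces.RequestContext;
import org.jboss.security.xacml.interfaces.ResponseContext;

public class PdpHolder {
    // Construct the PDP once from the XACML configuration; re-parsing the
    // policies on every request is exactly the overhead 2.0.3 removes.
    private static final JBossPDP PDP = new JBossPDP(
            PdpHolder.class.getClassLoader()
                    .getResourceAsStream("jbossxacml-config.xml")); // assumed config file name

    // Reuse the same PDP instance for every evaluation.
    public static ResponseContext evaluate(RequestContext request) throws Exception {
        return PDP.evaluate(request);
    }
}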

User Guide: http://www.jboss.org/auth/jbosssecurity/docs/jbossxacml/html/jbossxacml.html