Wednesday, December 21, 2011

The Magellan Project for High Performance Cloud Computing (HPCC)

So here we go: the Magellan project was officially selected by the French government on December 16 as part of the Investissements d'Avenir program, under the FSN's Call for Projects #1 on cloud computing (see the Journal du Net article on the selection results). My early days at Bull as architect and consortium leader have been intense while putting this project together, and they promise to be even more so in 2012 with its effective start. The other consortium partners include the companies ATEME and HPC-Project, Institut Telecom / Telecom Sud-Paris, CEA-List, Inria-Reso, EISTI, and OW2. The goal is to build the prototype of a cloud-based high-performance computing platform on bullx Extreme Computing hardware, providing, among other features, advanced remote interactive visualization.

Sunday, March 27, 2011

OpenAM Book Review

The OpenAM book by Indira Thangasamy is a new entrant in a series of books that deal with open source single sign-on and web access management. It covers the fundamental concepts, properties and core capabilities of the product well, through the eyes of a well-qualified practitioner. In that respect, the book is special and worth reading for anybody who intends to use OpenAM/OpenSSO in a production environment. You will learn in great detail how to deploy, configure and manage the product in a highly available and secure environment, using the administration console and the sometimes cryptic, yet powerful, ssoadm command line interface. The book is easy to read and well organized. I don't think there are many other books on the market that go into as many real-world examples and nitty-gritty details about customizing the user experience through the console and the schema of the configuration and identity stores. A dedicated chapter deals with how to federate identities with Google Apps and Salesforce.com services using SAML. This is of particular interest to organizations that would like to externalize some of their business functions to the cloud without losing control over their IT service identities and access rights.

I wish the author had covered more of the Federation services, as well as some of the new capabilities introduced in version 9, including the Entitlement Service and the Web Services and Secure Token Service. That's definitely a prospect for a new edition to be written soon!

Friday, January 28, 2011

The First OpenAM Book

Packt Publishing kindly offered me a free copy of the first OpenAM book by Indira Thangasamy, a former colleague at Sun Microsystems.
If you read my posts, you may have noticed that Identity and Access Management (IAM) and OpenSSO (the product name has changed from OpenSSO to OpenAM) are among my favorite topics. I am hoping the book tackles some of the nitty-gritty details of the product's features, which can be tricky at times and are not always well documented. In short, I can't wait to review the book and share my impressions in these very columns.

Thursday, December 16, 2010

Automated Delivery of an Identity-enabled Drupal in the AWS Cloud (Part 2)

This is the second part of a two-part post that examines how to automatically create an identity-enabled business application in the AWS cloud.

The first part discussed the rationale behind the idea that open standards, free and open source software (FOSS), and automated configuration management bring business agility and better integration results than most one-size-fits-all identity integration solutions when it comes to extending the reach of an organization's authentication and access control policies outside of its administrative domain.

This second part shows how to effectively achieve this kind of integration with Drupal, the well-known open source content management system (CMS), and OpenAM, the market-leading open source authentication, authorization, entitlement and federation product, and how the resulting identity integration can be automatically delivered into Amazon EC2 using the Opscode Chef tools and platform.

Let's try to do some work now...

Install and Configure Chef

Your best bet to get Chef running on your workstation is to follow Opscode's How To Get Started tutorial. You only need to complete Step 1 and Step 2. Creating a Chef client (Step 3) is not necessary at this point, as we will do it "automagically" with the knife bootstrap command.

After completing Step 2 you should have a Chef repository that you will use to manage your cookbooks.

Import the required cookbooks
You need to download all the cookbooks that are in the dependency chain of the simplesamlphp cookbook, which I created for this post. The simplesamlphp cookbook has a direct dependency on the drupal cookbook, which in turn directly depends on several other cookbooks, including the php and apache2 cookbooks; in other words, all the cookbooks required to build a full all-in-one LAMP stack. A sketch of how such a dependency is declared is shown below.
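For reference, a cookbook declares a direct dependency in its metadata.rb file. Here is a minimal sketch; the description and version are illustrative assumptions, only the depends line matters:

# cookbooks/simplesamlphp/metadata.rb (sketch; description and version are placeholders)
description "Installs simpleSAMLphp as a SAML v2 SP for Drupal"
version     "0.0.1"

# Pulling in the drupal cookbook transitively brings php, apache2, and friends.
depends "drupal"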

You can download all these cookbooks separately from Opscode's Cookbooks Repository or use knife. The best practice when working with the Opscode Platform is to always keep local copies of the cookbooks you use in your chef-repo.

To import a cookbook with knife:
$ cd ~/chef-repo
$ knife cookbook site vendor cookbook-name

This command downloads cookbook-name and creates a 'vendor branch' for you, which lets you make your own custom changes to the cookbook while keeping track of the differences between your copy and the upstream. You'll find that there is no simplesamlphp cookbook in Opscode's Cookbooks Repository; that's normal, as I didn't publish it there since this cookbook is just a proof of concept for this post. You need to download it individually from GitHub, along with a slightly modified version of the drupal cookbook, as sketched below.
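For example, the download could look like this; the repository locations are hypothetical placeholders, substitute the actual GitHub URLs:

$ cd ~/chef-repo/cookbooks
$ git clone git://github.com/<github-user>/simplesamlphp.git
$ git clone git://github.com/<github-user>/drupal.git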

Modify simplesamlphp cookbook
If you intend to run the same setup against your own OpenAM server instance, you'll need to modify the simplesamlphp cookbook in order to add the SAML v2 metadata descriptor of your own identity provider. To do so, edit the file saml20-idp-remote.php under files, and change the values of the idpname and idpid attributes within the drupalsaml role accordingly. A sketch of what the role might look like follows.
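As an illustration, here is a minimal sketch of a drupalsaml role written in Chef's Ruby role DSL; the run list and attribute values are assumptions to adapt to your own environment:

# roles/drupalsaml.rb (sketch; all values below are placeholders)
name "drupalsaml"
description "Identity-enabled Drupal server federated with OpenAM"
run_list "recipe[simplesamlphp]"
default_attributes(
  "simplesamlphp" => {
    "idpname" => "openam.example.com",                 # your IdP's name (assumption)
    "idpid"   => "https://openam.example.com/openam"   # your IdP's SAML entity ID (assumption)
  }
)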


Upload all the cookbooks and roles
Whether you have modified the cookbooks or not, you need to upload them to your organization on the Opscode Platform:

$ cd ~/chef-repo
$ knife cookbook upload --all
$ knife role from file roles/drupalsaml

Bootstrap the Drupal Server Instance in EC2

Chef's knife bootstrap command lets you literally bootstrap a fully functional Drupal server in EC2 out of a vanilla Linux distribution. In this demo I use an EBS-backed Ubuntu 10.04 LTS (Lucid Lynx) Server 32-bit distribution with AMI id 'ami-f4340180'.
Don't forget to download the AWS key pair for the AWS region you want to use somewhere in your home directory (e.g. ~/.ssh/eu-west-1-keypair.pem).

Launch a new EC2 instance from the AWS Management Console. Once the instance is running, copy its Public DNS name from the dashboard and paste it into the knife bootstrap command as shown below. Note the parameter -r 'role[drupalsaml]', which tells Chef to configure the instance as per the recipes and attributes defined in the drupalsaml role. The bootstrap output is displayed on the instance's console, starting with the installation of Chef itself, followed by the automated installation and configuration of all the required components (i.e. Apache2, PHP, OpenSSL, MySQL, Drupal, simpleSAMLphp, memcached, and sendmail).

# knife bootstrap ec2-46-51-163-241.eu-west-1.compute.amazonaws.com \
> -r 'role[drupalsaml]' -i ../.ssh/eu-west-keypair.pem -x ubuntu --sudo
INFO: Bootstrapping Chef on ec2-46-51-163-241.eu-west-1.compute.amazonaws.com
Get: 1 http://security.ubuntu.com lucid-security Release.gpg [198B]
Ign http://security.ubuntu.com/ubuntu/ lucid-security/main Translation-en_GB
Ign http://security.ubuntu.com/ubuntu/ lucid-security/universe Translation-en_GB
Get: 2 http://security.ubuntu.com lucid-security Release [38.5kB]
Hit http://eu-west-1.ec2.archive.ubuntu.com lucid Release.gpg

[.....]

ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed mixlib-authentication-1.1.4
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed mime-types-1.16
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed rest-client-1.6.1
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed bunny-0.6.0
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed abstract-1.0.0
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed erubis-2.6.6
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed moneta-0.6.0
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed highline-1.6.1
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed uuidtools-2.1.1
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com Successfully installed chef-0.9.12
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com 17 gems installed
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 12:55:20 +0000] INFO: Client key /etc/chef/client.pem is not present - registering
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 12:55:24 +0000] WARN: HTTP Request Returned 404 Not Found: Cannot load node ip-10-234-178-251.eu-west-1.compute.internal
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 12:55:27 +0000] INFO: Setting the run_list to ["role[drupalsaml]"] from JSON
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 12:55:29 +0000] INFO: Starting Chef Run (Version 0.9.12)

[.....]

ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 13:00:31 +0000] INFO: Navigate to 'http://ec2-46-51-163-241.eu-west-1.compute.amazonaws.com/install.php' to complete the drupal installation
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 13:00:43 +0000] INFO: Chef Run complete in 314.902713 seconds
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 13:00:43 +0000] INFO: cleaning the checksum cache
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 13:00:43 +0000] INFO: Running report handlers
ec2-46-51-163-241.eu-west-1.compute.amazonaws.com [Wed, 15 Dec 2010 13:00:43 +0000] INFO: Report handlers complete

At the end of the run, which in my case took about 5 minutes, you get a fully operational identity-enabled Drupal server instance federated with OpenAM. Note, however, the message at the end of the configuration sequence:

Navigate to 'http://ec2-46-51-163-241.eu-west-1.compute.amazonaws.com/install.php' to complete the drupal installation.

This is because Drupal 6.x requires some manual configuration at the end of the installation. According to a conversation I had with Marius Ducea, who authored the cookbook for Drupal, that limitation should be removed with Drupal 7.

It is assumed that an OpenAM 9.5 server is running somewhere in your data center or in the cloud, and that it is reachable from the newly created Drupal instance. In this demo, I have created an OpenAM server in EC2 that is attached to an Elastic IP (static IP address). Chef is also used on the OpenAM server to help automatically import the metadata descriptors of the SAML v2 service providers that are started in the same Opscode-managed organization. A single recipe for OpenAM is executed on a regular basis by chef-client running as a daemon; its only task is to search for any node assigned the drupalsaml role, retrieve the location of the service provider metadata by reading the simplesamlphp.metadata attribute, and finally execute ssoadm import-entity locally. The openam recipe is fairly straightforward.

# Collect every node that has been assigned the drupalsaml role.
sps = []
search(:node, "role:drupalsaml") do |n|
  sps << n
end

sps.each do |n|
  url = n['simplesamlphp']['metadata']   # where the SP publishes its SAML metadata
  hostname = n['hostname']

  if !url.nil? && url.match(/^http/)
    # Fetch the SP entity descriptor, unless it was already imported for this
    # host; the marker file under /var/log/entities guards against re-runs.
    # The downloaded file is expected to be named openam-idp (the URL's basename).
    bash "download-entity-descriptor" do
      user "root"
      cwd "/tmp"
      code <<-EOH
      wget #{url}
      EOH
      only_if "/usr/bin/test ! -f /var/log/entities/#{hostname}"
    end

    # Move the imported descriptor aside as the marker for this host.
    execute "move_entity_descriptor" do
      user "root"
      command "mv /tmp/openam-idp /var/log/entities/#{hostname}"
      action :nothing
    end

    # Import the SP entity descriptor into OpenAM, then trigger the move above.
    execute "ssoadm_import_entity" do
      user "root"
      command "/opt/openam/bin/ssoadm import-entity -u amadmin -f /opt/openam/password -m /tmp/openam-idp -t local"
      only_if "/usr/bin/test -f /tmp/openam-idp"
      notifies :run, resources("execute[move_entity_descriptor]")
    end
  end
end
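For completeness, here is a minimal sketch of how chef-client might be kept running as a daemon on the OpenAM server so that this recipe executes on a regular basis; the 30-minute interval is an assumption:

$ sudo chef-client -d -i 1800    # daemonize and re-run the recipe every 1800 seconds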

Now, click on the screencast below to see how the identity federation between Drupal and OpenAM works in practice.


[Screencast: identity federation between Drupal and OpenAM in action]

Friday, December 3, 2010

Automated Delivery of an Identity-enabled Drupal in the AWS Cloud (Part 1)

This post is composed of two parts that examine how to automatically create an identity-enabled business application in the AWS cloud.

The first part discusses the rationale behind the idea that open standards, free and open source software (FOSS), and automated configuration management bring business agility and better integration results than most one-size-fits-all identity integration solutions when it comes to extending the reach of an organization's authentication and access control policies outside its administrative domain.

To illustrate that claim, the second part will show how to effectively achieve an integration between Drupal, the well-known open source content management system (CMS), and OpenAM, the market-leading open source authentication, authorization, entitlement and federation product, and how the resulting identity integration can be dynamically delivered in Amazon EC2 using the Opscode Chef tools and platform.

The problem in a nutshell

One of the main issues with creating new services in the AWS cloud, as in any other public cloud, is that IT may lose control over who has access to what resources: the usual authentication and access control schemes that are commonly enforced within an organization's data center security perimeter may not be applicable in a multi-tenant environment. Yet IT surely needs to enforce authentication, single sign-on (SSO), and access controls across the organization's entire set of IT services, so that only users properly authorized by IT are granted access, regardless of whether an application runs in-house or in the cloud. While centralized access control policies are relatively simple to support behind the enterprise's firewall, they are more difficult to support "in the wild", because IT has varying degrees of control (or no control at all) over the software-as-a-service (SaaS) applications being provided. As a result, compliance with regulations that mandate accountability for data security, privacy and auditing becomes challenging to achieve.

Another challenge with moving an organization's business applications to a public cloud stems from the fact that IT doesn't want to duplicate identity and access management (IAM) information all over the place, as doing so may compromise security and the consistency of the organization's identity management system. In general, IT needs to leverage the identity information where it already resides, namely within the organization's data center. Further issues may arise if the integration defeats the elasticity benefits of a cloud-operated infrastructure by locking users and applications into a static compute environment, preventing features like auto-scaling from working properly.

The devil is always in the details

Identity-enabling a business application, whether it runs on-premise or in a public cloud, almost always requires a fair amount of customization to address the nitty-gritty details of an identity integration. I think this is because most business applications, regardless of how they are delivered, are almost never deployed out of the box, as-is. Take Drupal, for example: the application does not work out of the box in a SAML-based identity federation without significant customization. Note that I could just as well have taken the example of WordPress, Joomla, or MediaWiki; the same customization needs would apply.

Why is that? Simply because many of those applications pre-date the standardization and more pervasive use of federated identity technologies such as SAML. Those applications were designed with built-in user management support that is tightly coupled with the application's core functions. For example, Drupal creates, at install time, a SQL user database built on a proprietary schema. Drupal needs that database to compute and render customized content when a user logs in. This unfortunate fact of a fractured identity management landscape is responsible for the proliferation of the so-called identity silos that are "traditionally" addressed by expensive identity management software, which mitigates the mess through the deployment of synchronization connectors. With that understanding in mind, it is easy to see why data security and access control issues rank among the top cloud computing adoption concerns in customer surveys.

No one-size-fits-all solution for the identity integration problem

Some newly entering cloud security vendors promote an architectural approach that addresses the identity and access management divide by extending the reach of an organization's authentication and access control schemes to a cloud through a general-purpose HTTP traffic interposition artifact known as an HTTP Access Proxy Gateway. The role of an HTTP Access Proxy Gateway is to enforce user authentication and access control at the HTTP protocol level, somewhere in the cloud infrastructure, to protect access to resources. I would qualify this architectural approach as a one-size-fits-all solution which, in my opinion, can hardly address the identity integration challenge for business applications like Drupal, because it doesn't address the problem deeply enough to be truly usable. As we have seen above, the identity integration challenge needs to be addressed at the application level, as opposed to the network level, so that the core functions of the application can be maintained. Another issue with the proxy approach is that it creates a performance bottleneck (i.e. every HTTP request must be intercepted and checked against a valid session) and a single point of failure, which altogether doesn't play well with high availability and auto-scaling objectives.

A plug-in approach based on open standards and open source software

The starting point of the reflection builds upon the fact that Drupal, for example, can be easily extended. In open source software, modularity is more a rule than an exception. Thanks to its modular design, Drupal can incorporate plug-ins (called modules) that modify the behavior of a particular core function. For example, in Drupal 6.x, a module that implements the hook_user API can alter the login / logout logic so that, instead of getting a user's identity (i.e. email, first name, last name, and other attributes) from the database, the module executes an authentication redirect to a trusted Identity Provider (IdP), which, at the end of the browser-based redirection choreography, returns a SAML assertion in the HTTP request. The digitally signed assertion is verified using the IdP's public key to ensure that the user was properly authenticated against the organization's IAM authority. The SAML assertion contains the attributes of the user's identity, which can be mapped to create (or update) a Drupal user account if that account doesn't exist yet, and to derive which role the user belongs to, thus allowing Drupal to enforce any role-based access control policy that may be defined.

All this is possible using world-class FOSS from the simpleSAMLphp project, led by UNINETT, and the OpenAM project, led by ForgeRock and the OpenSSO community.

The simpleSAMLphp project provides a native PHP application that supports the functions of a SAML v2 Service Provider (SP). In addition, simpleSAMLphp also provides a Drupal 6.x module implementing the hook_user API that can redirect authentication requests to a SAML v2 compliant IdP like OpenAM. OpenAM is the Swiss Army knife of IAM in that it can perform authentication protocol conversions between SAML and many other enterprise authentication standards including, but not limited to, LDAP, Kerberos, X.509 certificates, and RADIUS.

Automated delivery of the identity-enabled application to Amazon EC2

In a series of foundational articles, "Cloud computing and the big rethink", James Urquhart argues that with cloud computing the very form of application delivery will change, in that cloud-operated infrastructures and software development will play a major role. This comes from the need for more efficient application delivery and operations to address the accelerated demand for new software functionality driven by end users. The most obvious place where this is happening is in the SaaS area. Cloud services that fall under this category are targeted at end users with specific business needs, such as content management systems (CMS) and customer relationship management (CRM). I think this concept is particularly relevant to our identity integration case because it helps address the challenge of delivering compliant services in the cloud. As such, the creation of identity-enabled applications calls for a high level of automated operations, so that the communicating parties can be properly and securely linked with no (or minimal) manual intervention.

This is where the Opscode Chef framework comes into play. Chef makes it possible to bootstrap a virtual machine instance in an infrastructure supported by the Opscode Platform and to dynamically install and configure all the software components required to run the identity-enabled application. The concept behind Opscode Chef is known as DevOps. It contrasts with the static image building process in that software installation and configuration are performed at runtime, by program. A software configuration can be tweaked incrementally until a machine's state matches the desired end state, without having to re-bundle a new image every time a tiny change is applied. Configuration management then becomes more like a software development project, breaking the divide between IT operations and software development.

With Opscode Chef, a configuration management task is called a recipe. It is written in a domain-specific language (DSL) based on Ruby. Recipes can execute arbitrary Ruby code, but are mainly designed to execute actions on resources, which are abstract representations of system resources like packages, directories and files. For example, to install Apache2 on Linux, one would just have to insert this block in a recipe or, even better, execute the default recipe of the apache2 cookbook.

package "apache2" do
    case node[:platform]
    when "centos","redhat","fedora","suse"
      package_name "httpd"
    when "debian","ubuntu"
      package_name "apache2"
   end
   action :install
end

Once the Apache2 package is installed on the target machine, further invocations of that recipe will not attempt to reinstall the package. This behavior is referred to as idempotent, meaning, in mathematics, unchanged when multiplied by itself. Opscode Chef also uses the concept of a node, which can be associated with one or several roles, which themselves encapsulate runtime configuration parameters and a list of recipes to execute on the target machine. Recipes are packaged in a cookbook whose successive versions are typically managed in a source code control system such as Git or Subversion.
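To make this declarative, idempotent style more concrete, here is a minimal sketch of another resource declaring a desired end state rather than imperative steps; the path and ownership values are assumptions for illustration:

# Declare the desired state; Chef only acts when the actual state differs.
directory "/var/www/drupal" do
  owner "www-data"
  group "www-data"
  mode "0755"
  action :create   # a no-op on subsequent runs once the directory matches
end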

Voila, that is all for today. After this rather lengthy, and hopefully not too boring, introduction, the second part of this post will discuss the concrete details of how to do the integration and run the automated delivery of the identity-enabled Drupal.


Thursday, July 8, 2010

Information Card Authentication Module on OpenAM

This had been on my todo list for a while, so this morning I took a couple of hours to verify that the OpenSSO Information-Card Module (a.k.a. Authnicrp) works properly on the latest version of OpenAM, 9.5. It turns out that after modifying the Information-Card-enabled login page (i.e. infocard.jsp) to better reflect OpenAM's logotype graphics, everything worked out of the box without a glitch, meaning that OpenAM is 100% compatible with OpenSSO for that extension. The modification will be committed to OpenAM's Subversion repository shortly. Below is a screenshot of what the login page looks like with that module.

Tuesday, March 30, 2010

Take-Away from Kuppinger Cole's Cloud Computing Security Foundations Virtual Conference (Part 2)

Last week, on March 25-26, 2010, Kuppinger Cole sponsored a virtual conference on the topic of the Security Foundations for Cloud Computing. The event was organized around six identity and security-related questions that were the subject of keynotes, panel discussions and analyst viewpoints. What follows are my "augmented" take-away notes on these viewpoints. Part 1 of this post focused on the question "Cloud Computing security standards: which ones are already there and which ones are missing?". Part 2 focuses on Martin Kuppinger's initial keynote around the question "Cloud Computing: is it really a risk?"

Let's start with the end of the talk, which concludes by debunking some Cloud Computing security myths.
  1. The Cloud is not inherently insecure. It mainly depends upon the provider's ability to do a proper job with the management of the security threats.
  2. Conversely, the Cloud is not more secure than internal IT. Again, it depends on both the Cloud provider and internal IT expertise to deal with security threats.
  3. A few Cloud security issues are new. Most already exist in internal IT and with outsourced service providers.
  4. Security is the problem of the Cloud provider only to the extent that it falls within the scope of its service delivery description. For example, an IaaS provider is not responsible for the security of the hosted operating system. A PaaS provider is not responsible for the security of the enterprise's data and applications. A SaaS provider is not responsible for the enforcement of the enterprise's governance and auditing policies.
  5. We can store data outside of the EU zone, but careful consideration must be given to the EU's data security and privacy regulations enforced by the member states. For example, the EU privacy laws have established regulations that prevent the disclosure of sensitive personal information to countries outside of the EU that do not honor equivalent privacy laws, without the explicit consent of the user. To cope with this problem, some providers are now offering features that enable sticky location of data in security zones and regions across their distributed data centers.
  6. SAML unfortunately doesn't solve all the IAM issues in the Cloud. It helps to solve secure authentication issues, but doesn't help much with the larger problem of authorization.
  7. It is somewhat true that security in the Cloud can't be measured. In particular, auditing and risk metric logs are missing, due in part to the lack of standardization, as described in Part 1 of this post, although the situation should improve over time.
CEOs and CIOs need to understand that Cloud Computing requires new policies and new controls, because it may give rise to new IT risks that can have an operational and even strategic impact on the enterprise's efficiency and effectiveness. Adopting Cloud Computing to externalize computing resources poses the question of weighing opportunities against operational and strategic risks.

When ascertaining Cloud security risks, we tend to think more about the technology side of the issue than about the information side. This way of thinking should be reversed: in IT, the technology is there to manage the information, not the other way around. Therefore, information should take precedence over technology when it comes to assessing the security and risks associated with Cloud Computing. For example, enterprises should start thinking about where the information is, and what it means from a security and risk management perspective, before moving to the Cloud. This should eventually help decide what kind of information can reside in a public Cloud versus what kind of information should stay on-premises. Answering this question, along with identifying which services or applications use this information, should also help figure out which technology and Cloud provider can fulfill the enterprise's data security and risk management requirements.

Another important consideration for CIOs, at the beginning and throughout the course of the service procurement process, is how well the service continuity and security requirements are met by the Cloud provider. The risk that the Cloud provider does not fulfill these requirements should be covered by the Service Level Agreement (SLA). Thus, a precise and standard description of the service level in the SLA should constitute the foundation of a risk mitigation strategy, by looking at important service procurement characteristics such as information location, security of transport, security of storage, authentication and authorization of users, auditing interfaces, and privileged user controls.

So yes, there are risks involved in using Cloud services. Some are new and some are old, although they can be exacerbated by the diversity and multiplicity of the actors. But most risks associated with Cloud Computing are well known and not really new or specific to Cloud Computing. Knowing these risks, and understanding how well they are covered (or not covered) through a detailed service description, allows drawing up risk mitigation plans. Depending on the quality and expertise of the Cloud provider, some internal security weaknesses can be reduced by externalizing the procurement of IT services. On the other hand, in doing so, new risks can arise. The crucial point of the decision-making process is to strike a fine balance between taking advantage of the Cloud's numerous benefits, in terms of cost cutting and improved business agility, and the new risks arising from externalizing IT services.