Friday, August 29, 2014

Weekly news roundup for Cloud and Networking: August 25 - 29

Cloud and Networking news for the week of Aug 25
  • ZDNet: The case for the hybrid cloud
  • Simon Wardley gets his cloud predictions in early with "Cloud, 2016 and onwards ..."
  • McKinsey Global Survey: IT as a revenue generator
  • TechCrunch: Amazon Opens Up Its Enterprise Cloud Storage Service Zocalo To All >> "[Amazon made] Zocalo, its secure document storage and sharing service designed for enterprise use, generally available. The news comes, not coincidentally, on a day when cloud storage competitor Dropbox announced lowered pricing and storage increases for its Pro customers."
  • IBM: 70% of Enterprises Are Using Big Data, Cloud, Mobile and Social

Cloud deployment is broadening, according to the McKinsey Survey

CohesiveFT in the news:
  • Senior Cloud Solution Architect Sam Mitchell in DataCenterPost - Why is UDP Multicast Disabled in the Public Cloud?

Thursday, August 28, 2014

Guest post: Public Key Infrastructure in the Cloud

Guest blog from Lohit Mehta of the Infosec Institute 

As adoption of cloud models (public, private, and hybrid) increases across industry verticals, the cloud buzzword is on a new high. However, customers still have doubts about security and raise a common question: “How can I trust the cloud?” The simplest answer is to “build trust around the cloud,” but how?

Well, we have the wonderful concept of Public Key Infrastructure (PKI), which, if planned and implemented properly, can be a good answer for building customers’ trust in the cloud. Before discussing the implementation and challenges of PKI in the cloud in detail, let’s learn or refresh some basics.

Every security process, layer, or piece of software must implement and cover the CIA triad. What is the CIA triad?
  • C-Confidentiality: ensuring that information sent between two parties can be viewed by those parties only and by no one else.
  • I-Integrity: ensuring that a message in transit maintains its integrity, i.e., its content is not changed.
  • A-Availability: ensuring that the systems fulfilling requests are accessible whenever they are needed.
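The integrity leg of the triad is typically checked with a cryptographic hash: the sender transmits a digest alongside the message, and the receiver recomputes it. A minimal sketch using Python's standard hashlib (the message contents are illustrative):

```python
import hashlib

def digest(message: bytes) -> str:
    """Return the SHA-256 fingerprint of a message."""
    return hashlib.sha256(message).hexdigest()

# Sender computes a digest and sends it along with the message.
message = b"transfer $100 to account 42"
sent_digest = digest(message)

# Receiver recomputes the digest; any mismatch reveals tampering.
tampered = b"transfer $900 to account 42"
print(digest(message) == sent_digest)   # True  - integrity intact
print(digest(tampered) == sent_digest)  # False - content was changed
```

Note that a bare hash only detects accidental corruption; an attacker who can alter the message can also recompute the hash, which is why authenticated constructions like HMACs or signatures are used in practice.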

Photo credit: Linus Bohman via Flickr

Along with these, some important parameters are described below:
  • Authentication: the process of confirming someone’s identity with the supplied parameters like username and password.
  • Authorization: the process of granting access to a resource to the confirmed identity based on their permissions.
  • Non-Repudiation: a process to prove that a message was sent by a specific endpoint, so that the sender cannot later deny having sent it.
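Authentication of messages between two parties that already share a secret can be sketched with an HMAC (Python stdlib; the key and messages are illustrative). Because both parties hold the same key, an HMAC cannot provide non-repudiation: either side could have produced the tag. That gap is exactly what the public/private key signatures of a PKI close.

```python
import hashlib
import hmac
import secrets

# Shared secret, established out of band between the two parties.
key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    """Authenticate a message with HMAC-SHA256."""
    return hmac.new(key, message, hashlib.sha256).digest()

msg = b"deploy build 7 to production"
mac = tag(msg)

# Receiver verifies with a constant-time comparison.
print(hmac.compare_digest(tag(msg), mac))                # True: authentic
print(hmac.compare_digest(tag(b"deploy build 8"), mac))  # False: forged or altered
```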

Public Key Infrastructure (PKI)
To provide security services like confidentiality, authentication, integrity, non-repudiation, etc., PKI is used. PKI is a framework consisting of security policies, communication protocols, procedures, etc. that enables secure and trusted communication between entities both within and outside the organization. PKI is built as a hybrid of symmetric and asymmetric encryption. Let’s discuss each in brief:
  • Symmetric Encryption: A single key is used to encrypt and decrypt the messages sent between two parties. Symmetric encryption is fast, but it is effective only when the key is kept absolutely secret between the two parties. Transmitting that secret key over an untrusted network, i.e. the Internet, is itself a problem, and this is where asymmetric encryption comes in.
  • Asymmetric Encryption: A pair of keys, termed the public and private keys, is used to encrypt and decrypt messages. The private key is kept secret by its owner, while the public key is visible to everyone. Here is how it works: suppose ‘A’ and ‘B’ want to communicate using asymmetric encryption. ‘A’ encrypts the message with B’s public key so that only ‘B’ can decrypt it with its private key. To reply, ‘B’ encrypts its message with A’s public key so that only ‘A’ can decrypt it using its own private key. Sounds like a perfect solution, doesn’t it? As far as secrecy is concerned it is, but in real-world scenarios asymmetric encryption is quite slow, as the keys involved are 1024, 2048, or more bits long, and this overhead would be incurred not just at the initial handshake but on every subsequent request. So what to do?

In comes the PKI approach, a hybrid of symmetric and asymmetric encryption. Here, the handshake happens with asymmetric encryption in order to exchange the secret key used for symmetric encryption. Once the secret key is exchanged, the rest of the communication happens over symmetric encryption. In this way, both security and performance are achieved. PKI is a hierarchical model comprising the components below:
  • Certificate Authority (CA): This entity issues certificates for the requests it receives. It can be in-house or a trusted third-party CA like Verisign, COMODO, Thawte, etc.
  • Registration Authority (RA): This entity vets the requests received from end-point entities, for example by checking their business operations, in order to avoid issuing a certificate to a bogus entity.
  • Certificate Revocation List (CRL): A published list of certificates that are no longer valid and should not be trusted.
  • End-point Entities: These entities make requests for the certificates in order to prove their identity and gain trust over the Internet.
  • Certificates Repository: The repository containing issued certificates, which end-point entities can retrieve in order to verify the corresponding server. For end users, this repository usually ships with the browser, e.g. Firefox, IE, Chrome, etc.
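The hybrid handshake and the trust components described above are exactly what TLS implements: asymmetric cryptography authenticates the server's CA-signed certificate and exchanges a session key, after which symmetric encryption carries the traffic. A small sketch of the client-side trust settings using Python's standard ssl module:

```python
import ssl

# create_default_context() loads the platform's trusted CA certificates
# (the certificates repository) and turns on strict verification.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: peer must present a CA-signed cert
print(ctx.check_hostname)                    # True: cert must match the server name

# Counts of loaded CA certificates and CRLs in the trust store.
print(ctx.cert_store_stats())
```

Wrapping a socket with this context would then refuse any server whose certificate chain does not terminate at one of those trusted CAs.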

As can be noted, the maintenance of these keys is of utmost importance; losing control over them renders the encryption of the data useless. Key management is both an important and a most challenging process, as any deviation in it could lead to data loss. The key management life cycle involves the following steps:
  1. Creation:  The first step is to create a key pair and apply access control around it. While creating a key, important factors such as key length, lifetime, and encryption algorithm need to be considered. The new key created for data is usually a symmetric key, and it is encrypted with the public key of a public-private key pair.
  2. Backup:  Before keys are distributed, they should be backed up to external media. Since the data key is normally a symmetric (shared) key encrypted with a public key, it is equally important to protect the other half of the key pair, i.e. the private key. The policies around backup media and vaults should be as rigorous as those designed for any critical business operation, so the organization can recover from any type of disruption.
  3. Deployment:  After a key is created and backed up, it is ready to be deployed in the encryption environment. It is advisable not to put keys directly into production; key operations should be analyzed first, and only if successful should the key be used to encrypt production data.
  4. Monitoring:  Monitoring of the crypto systems is very important to check for unauthorized administrative access and key operations such as creation, backup, restore, archival, and destruction.
  5. Rotation:  Keys should be rotated on a regular basis, replacing keys that are about to expire or that need to change following a business change. Retired keys should not be put back into the system.
  6. Expiration:  As per best practices dictated in compliance standards like PCI-DSS, even valid keys need to be changed after a span of time, not only once they expire. Rotation should take place before the expiration phase, re-encrypting the associated data with new keys.
  7. Archival:  Before keys are destroyed, expired and decommissioned keys should be archived if data that depends on them, such as data needed for recovery operations, still exists in the environment. This phase is very important from a business perspective; some appliances never perform the destruction phase, which leaves a lingering risk attached. Archived copies of keys should be properly secured.
  8. Destruction:  Once a key’s business use is over or its validity expires, secret and private keys should be destroyed thoroughly. All traces of the keys should be removed from the whole environment, including the removable media and vaults where keys are stored for backup.
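The lifecycle steps above can be sketched as a toy key object. The class and method names are illustrative, not a real KMS API; a production system would add access control, backup, monitoring, and archival around every step:

```python
import secrets
import time

class ManagedKey:
    """Toy sketch of the key lifecycle: create, expire, rotate, destroy."""

    def __init__(self, lifetime_seconds: float, length: int = 32):
        self.lifetime = lifetime_seconds
        self.length = length
        self._create()

    def _create(self):
        # Creation: fresh random material with a recorded creation time.
        self.material = bytearray(secrets.token_bytes(self.length))
        self.created = time.time()
        self.destroyed = False

    def expired(self) -> bool:
        # Expiration: validity is bounded by the configured lifetime.
        return time.time() - self.created > self.lifetime

    def rotate(self) -> bytes:
        # Rotation: replace the material; the caller re-encrypts data
        # under the new key and sends the old key to archival.
        old = bytes(self.material)
        self._create()
        return old

    def destroy(self):
        # Destruction: best-effort overwrite of the material in place.
        for i in range(len(self.material)):
            self.material[i] = 0
        self.destroyed = True

key = ManagedKey(lifetime_seconds=3600)
old = key.rotate()   # archive `old`, re-encrypt data under key.material
key.destroy()        # once the key's business use is over
```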

Photo credit: plenty.r. via Flickr

PKI Risks on Migrated Data
We cannot sit back and relax after implementing a PKI over business applications and data migrated to the cloud, because when data migrates to the cloud, various issues tend to arise:
  • Because of the way the cloud model is designed, direct control over data migrated to the cloud is lost.
  • If the key management server, which is responsible for storing and managing keys, is hosted in the cloud, then there is risk on the CSP (Cloud Service Provider) side: how can we be sure the CSP is keeping our keys secure, i.e. what access control mechanisms, SOD (segregation of duties), and policies has the CSP put in place?
  • If a third-party vendor solution is leveraged for the PKI deployment: where are the keys used; what key management model does the vendor use; and how does the vendor ensure that, even when deployed in the cloud, customer keys are secured against vendor remote access (SaaS APIs) and against Virtual Machine (VM) corruption events? For example, what happens if a snapshot of the VM is stolen: will the keys reside in the snapshot, and if so, for how long?
  • If the customer leverages a vendor’s PKI SaaS service, how can it be sure that even the vendor does not have access to the customer’s keys, and what measures has the vendor implemented to address multi-tenancy?
  • After systems in the cloud are decommissioned, how can we make sure data is completely removed from them?

The sections below describe some of the best practices and design principles that organizations should follow in order to reap the true benefits of PKI in the cloud.
  • The key management server should be hosted within the organization. Whenever data hosted in the cloud needs keys for decryption as part of an end-user request, the key management server provides them. The decryption key should never be stored in the cloud VMs and should be held in memory only for the brief moment it is needed.
  • With the above discussed model, all the data that is leaving and entering the organizations can be encrypted and decrypted respectively.
  • All the VMs that are hosted in the cloud must be encrypted to protect data loss when a VM snapshot is stolen.
  • When the data which is encrypted and put in a cloud is no longer needed, the organization must revoke the keys associated with it, so that even if some trail of data remains in the decommissioned VM, it cannot be decrypted.
  • A Hardware Security Module (HSM) should be used to store keys for cryptographic operations such as encryption, decryption, etc.
  • Use of old and insecure protocols like Data Encryption Standard (DES) must be avoided.
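The first practice above, keeping decryption keys out of cloud VMs and in memory only briefly, can be sketched as a context manager. This is illustrative only: `fetch_from_kms` is a hypothetical stand-in for a call to the in-house key management server, and pure Python cannot guarantee that no copies of the key linger elsewhere in memory.

```python
import secrets
from contextlib import contextmanager

@contextmanager
def transient_key(fetch):
    """Hold key material only for the duration of the block, then overwrite it."""
    key = bytearray(fetch())
    try:
        yield key
    finally:
        for i in range(len(key)):  # best-effort zeroization on exit
            key[i] = 0

def fetch_from_kms() -> bytes:
    # Hypothetical stand-in for the organization's key management server.
    return secrets.token_bytes(32)

with transient_key(fetch_from_kms) as key:
    assert len(key) == 32          # decrypt the requested data here

print(bytes(key) == b"\x00" * 32)  # True: material wiped once the block exits
```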

An environment with a poorly managed Public Key Infrastructure (PKI) is as good as an environment with no PKI. When organizations plan to migrate data to the cloud and decide to implement PKI on any cloud model, i.e. public or private, they should make sure that complete ownership of the keys stays in their hands.

About Lohit: 
Lohit is a Security Researcher for the InfoSec Institute as well as a Security Analyst with Oracle. He has experience working with RFPs/RFIs; security HLD and LLD design; network security elements like firewalls, IDS/IPS, DLP, reverse proxies, and WAFs; Public Key Infrastructure (PKI); application security testing for the OWASP Top 10; compliance standards like PCI-DSS 2.0/3.0, ISO 27001, and HIPAA; Cloud Service Providers such as AWS; Security Incident and Event Management (SIEM) with tools like Splunk; and vulnerability testing.


Thursday, August 21, 2014

Cloud & networking weekly news roundup: Aug 18 - 22

Cloud and Networking news for the week of Aug 18
  • From ZDNet: Software-defined networking expected to be worth $8B by 2018 >> research firm IDC predicts software-defined networking will "grow from a market worth roughly $960 million in 2014 to more than $8 billion by 2018. That translates to a compound annual growth rate of 89.4 percent."
  • From Randy Bias on Cloud Ave: Public Cloud Economies of (Web-)Scale Aren’t About Buying Power
  • "99 problems but the cloud ain't one: What are SaaS, PaaS and IaaS?" on IT Pro Portal
  • Via DataCenter Dynamics: CenturyLink Offers Global Private Cloud Service 
  • Cloud use moving from experimental to full on adoption, according to new research via Smart Brief >> 91% of companies claim some form of cloud computing, and 4 out of 10 channel firms expect their cloud-based revenue to grow by 15 percent or more in the coming year. 
    Image credit: IT Pro Portal


    Ask a Cloud Networking Expert: What is cloud VPN?

    Virtual private networks (VPNs) use the public internet to connect remote sites, offices, or users. VPNs use "virtual" connections routed through the Internet, but because all traffic inside the VPN is encrypted, the data stays secure.

    From a user’s point of view, a VPN connection is the same as a connection within a private network. For example, a remote worker can access a VPN from the road and have the same experience as she would while working in the office. Similarly, large enterprises with data centers, offices, and partner networks spread across the globe can use VPNs to connect their resources into one logical network.

    Why use VPNs in enterprise - security guarantee 
    VPNs help companies prove that they comply with security standards and that they “own” all the data inside their network. In cloud computing, enterprises face a hurdle: they want to use the public internet to connect to a cloud-based customer or cloud-based storage, but they cannot let their private data travel outside of their own network.
    Image credit: Wikimedia Commons

    VPNs are more secure because they use tunneling protocols and data encryption. Tunnels help enterprises ensure sender authentication so unauthorized users cannot access their VPN. VPNs can also guarantee the message was not tampered with during transmission.

    The data traveling through the VPN tunnel is also encrypted. Even if network traffic is sniffed at the packet level, attackers would only see encrypted data. See Wikipedia for more on network sniffers.
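    A toy demonstration of the point: with a one-time pad (a random key XORed with the data, shown here instead of the AES-style ciphers real VPNs use), a sniffer capturing the wire sees only random-looking bytes:

```python
import secrets

def xor(data: bytes, pad: bytes) -> bytes:
    """One-time-pad XOR: encrypting and decrypting are the same operation."""
    return bytes(a ^ b for a, b in zip(data, pad))

plaintext = b"GET /payroll HTTP/1.1"
pad = secrets.token_bytes(len(plaintext))  # secret shared by the two endpoints

ciphertext = xor(plaintext, pad)  # what a packet sniffer would capture
print(ciphertext)                 # random-looking bytes, nothing readable
print(xor(ciphertext, pad))       # -> b'GET /payroll HTTP/1.1' at the far end
```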

    Some secure VPN protocols include:

  • Internet Protocol Security (IPsec) 
  • Transport Layer Security (SSL/TLS)
  • Secure Shell (SSH) VPN

  • Also, check out some of our similar Ask a Cloud Networking Expert posts. We’ve covered more technical and specific topics, like encryption, IPsec security, and how UDP multicast can work in the cloud.

    Cloud VPNs
    Cloud VPNs (sometimes called virtual private clouds or VPCs) can be the answer to security and compliance concerns in public clouds. Cloud VPNs work just like a local network VPN, but instead of creating a private network on top of wifi or the public internet, a cloud VPN is a private network on top of a cloud provider’s network infrastructure that can bridge data centers and cloud geographies.

    Where does CohesiveFT's VNS3 fit in?
    Using VPN technology, VNS3 creates an overlay network on top of any hardware or cloud computing resource. VNS3 connects that overlay network via IPsec tunnels to any customer or partner networks. VNS3 lets you launch and configure a secure network through either REST APIs or a web-based interface.

    VNS3 separates network control from the hardware level, which gives you more control over your network security. So, essentially, VNS3 frees your application from any one cloud provider, specific hardware, or risky network sniffers.

    Expert Profile 

    Source: this guy
    Name: Ryan Koop
    Title: Director of Products and Marketing
    Favorite Snack: Cashews
    Credentials in the "Expert's" words
    Do I have a bunch of certifications?  Nope.  Sadly my Smart Cloud Advisor and Architect certs just expired.  My primary job function is marketing, let's see if I can self promote.

    I have over six years of experience designing our cloud network product.  I also moonlight as a member of our support team (does that make me a masochist?) and services team.  In those roles I have helped our customers design, deploy, and tune hundreds of cloud networks and troubleshoot thousands of IPsec tunnel negotiations.  In many cases, I end up configuring both sides of the connection and have become familiar with a number of network security and routing hardware appliances from all corners of the market: Cisco, Juniper, WatchGuard, Dell SONICWALL, Netgear, Fortinet, Barracuda Networks, Check Point, Zyxel USA, McAfee Retail, Citrix Systems, Hewlett Packard, D-Link, and Palo Alto Networks.

    Friday, August 15, 2014

    Cloud & networking weekly news roundup: Aug 11 - 15

    Cloud and Networking news for the week of Aug 11
    • From ODCA president Correy Voo: Why cloud spells the end of the IT department…as we know it on iCIO 
    • Network World:  Confused by SDN & NFV? These may be complementary options for the network of tomorrow >> "The goal with both SDN and NFV is to control the network logically, with software and minimize hands-on work with those network devices."
    • Wired: The Internet Has Grown Too Big for Its Aging Infrastructure
    • From GigaOm: Is Docker a threat to the Cloud ecosystem?
    • InfoSec Institute: The Cloud is Both More and Less Secure than you Think
    • Datacenter Knowledge: IBM Opens SoftLayer Data Center in Toronto, Canada 
    • From Gigaom Research - Stack-inception: Containers and how they relate to systems software with VMs.

