FOG COMPUTING FEATURE NEWS

10 Recommendations for Securing Big Data in the Cloud

By Special Guest
Prat Moghe, Founder and CEO, Cazena
April 07, 2016

As enterprises consider augmenting or moving their big data clusters and data warehousing to the cloud, security and compliance remain persistent concerns. According to Gartner, however, only a small percentage of cloud security incidents to date have been the cloud provider's fault, suggesting that this worry is largely misplaced.

This pattern isn’t expected to change. Gartner predicts that through 2020, 95 percent of cloud security failures will be the customer’s fault. Experts like U.S. CIO Tony Scott agree that public cloud offerings like Amazon Web Services (AWS) and Microsoft Azure are more secure than on-premises data centers. Even financial services companies, traditionally the most security-conscious, now utilize the public cloud.

Today, cloud providers at all levels (IaaS, PaaS, services, etc.) offer a rich array of security tools. But it is up to enterprises to fully utilize these capabilities. This presents challenges. It requires investments in resources and new skills to bridge the “uneven handshake” between the way on-premises enterprise IT systems work and the way the cloud works.

I recently sat down with Equifax CISO Susan Mauldin to discuss how to secure the cloud. She said, “CISOs are becoming more comfortable with cloud because there are controls available to us today that weren’t there years ago.” But cloud providers built these controls from a cloud-centric perspective, not an enterprise view. Enterprises need to ensure that the cloud services they select fit in with their existing systems, processes, management tools and security policies.

Here are 10 recommendations for ensuring security and compliance as enterprises migrate big data workloads to the cloud.

1. Lock down public cloud resources using gateway technology
Enterprises must avoid making cloud another silo with separate monitoring and security. To streamline the integration between on-premises systems and cloud services, consider installing a security gateway, a dedicated VPN tunnel between the enterprise and the cloud provider, to block all public access to your cloud except through the gateway. All access – whether it is data moving in or out, or analytic queries from tools – should go through this gateway.
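The gateway's enforcement rule can be sketched as a simple source-address check: any request that did not arrive through the VPN tunnel is rejected. This is an illustrative sketch only; the CIDR range and function names are hypothetical, and a real gateway would enforce this at the network layer, not in application code.

```python
import ipaddress

# Hypothetical VPN tunnel range for the security gateway (illustrative).
GATEWAY_CIDRS = [ipaddress.ip_network("10.8.0.0/24")]

def allowed_through_gateway(source_ip: str) -> bool:
    """Return True only if the request originated inside the gateway's
    VPN tunnel; all other public access to the cloud is blocked."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in GATEWAY_CIDRS)
```

Deny-by-default is the key design choice: anything outside the tunnel range, including otherwise legitimate corporate traffic, is refused unless it passes through the gateway.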

2. Utilize single tenancy
Enterprises need the trust and control offered by a single tenant model (i.e. not shared with other users). There is a popular myth that public cloud is all “shared” and hence cannot be trusted. In reality, while physical infrastructure on the public cloud may be shared, there is a strong separation between resources at the virtual level. This logical separation can provide a single tenant cloud service.

3. Encrypt data in motion
Data movement to and from the cloud differs quite a bit from data movement within the enterprise, and requires different skills. Ensure all data moving in and out of the cloud is encrypted. Consider using IPsec or TLS/SSL for encryption, depending on your specific firewall configuration.
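On the TLS side, Python's standard library shows what "encrypted and verified" looks like in practice: a client context that refuses legacy protocol versions and insists on certificate and hostname validation. A minimal sketch:

```python
import ssl

# Build a client-side TLS context for connections carrying data to the
# cloud: reject legacy protocol versions and keep verification enabled.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older protocols

# Certificate validation and hostname checking are the defaults here,
# and should never be switched off for production data movement.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True
```

A context like this would then be passed to whatever transfer client moves the data (e.g. an HTTPS upload), ensuring nothing leaves the enterprise in the clear.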

4. Encrypt data at rest with HSM-based key management
All data at rest within the cloud should be encrypted. Consider using industry standard AES 256 encryption implemented in a hierarchical model. For example, if departments at your firm have their own data warehouses or big data clusters, you can allocate each one a separate cloud with an individual customer key. Secure keys using an HSM (hardware security module), either cloud-based or on-premises. No unencrypted keys should persist outside the HSM.
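The hierarchical model can be sketched as key derivation: per-department data keys are derived from a single master key, so each warehouse encrypts under its own key while only one root secret needs HSM protection. This is a simulation only; in production the master key never leaves the HSM, and the bytes value below merely stands in for it.

```python
import hashlib
import hmac

# Stand-in for the master key, which in reality stays inside the HSM.
MASTER_KEY = b"stand-in-for-hsm-held-master-key"

def derive_department_key(department: str) -> bytes:
    """Derive a per-department 256-bit data-encryption key from the
    master key via HMAC-SHA256 (an HKDF-style derivation), so each
    department's warehouse gets its own key."""
    return hmac.new(MASTER_KEY, department.encode(), hashlib.sha256).digest()

finance_key = derive_department_key("finance")
marketing_key = derive_department_key("marketing")
```

Because derivation is deterministic, keys can be recreated on demand inside the HSM boundary instead of being stored; and compromising one department's data key does not expose another's.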

5. Segregate duties with role-based access control
Think of the cloud as a house with various rooms in which you want to restrict different users. For example, BI analysts can only query a cloud data warehouse, data managers can load and administer data, and IT can access system logs. A role-based access control system, similar to the access controls you have in the enterprise, must be layered on top of the cloud.
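The "rooms in a house" idea maps directly to a role-to-permissions table with deny-by-default checks. The role and action names below are hypothetical, chosen to mirror the example above:

```python
# Hypothetical role map: each role is granted only the actions it needs.
ROLE_PERMISSIONS = {
    "bi_analyst":   {"query"},
    "data_manager": {"query", "load", "administer"},
    "it_ops":       {"read_system_logs"},
}

def is_permitted(role: str, action: str) -> bool:
    """Check an action against the role's allowed set.
    Unknown roles and unlisted actions are denied by default."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

A real deployment would express the same table in the cloud provider's IAM policy language, but the principle is identical: grant each role the minimum set of actions, and deny everything else.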

6. Log and audit everything
All cloud service components at all layers should be logged. Log all infrastructure and data access, including data movement and supporting services. Further, these logs must be normalized across various data services. All logs must be aggregated into a centralized, indexed, searchable repository that is integrated or exported into enterprise log management systems.
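Normalization in practice means every service emits events in one common, structured shape so the central repository can index and search them. A minimal sketch using Python's standard logging module, with a JSON line format standing in for whatever schema your log management system expects:

```python
import io
import json
import logging

# Emit each event as a single JSON record with common fields, so logs
# from different data services normalize into one searchable shape.
stream = io.StringIO()  # stands in for a forwarder to the central store
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter(
    '{"service": "%(name)s", "level": "%(levelname)s", '
    '"event": "%(message)s"}'))

logger = logging.getLogger("cloud.dw")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("data_load_started")
record = json.loads(stream.getvalue())  # central side parses it back
```

Because every record shares the same fields, the aggregated repository can answer questions like "show all events for service X" regardless of which component produced them.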

7. Set up security monitoring and alerting
In addition to logging, all security events must be continually monitored for malicious or unauthorized activity. A separate risk dashboard and alerting function should keep your security group on top of the cloud platform. Additionally, all security events should be integrated within your existing SIEM environment.
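A monitoring rule at its simplest is a threshold over the event stream: the kind of check whose hits get surfaced on the risk dashboard and forwarded to the SIEM. A toy sketch (the event shape and threshold are illustrative):

```python
from collections import Counter

# Illustrative rule: alert on any source with repeated failed logins.
FAILED_LOGIN_THRESHOLD = 3

def sources_to_alert(events):
    """events: iterable of (source_ip, event_type) pairs.
    Returns the set of sources that crossed the failure threshold."""
    failures = Counter(ip for ip, kind in events if kind == "login_failed")
    return {ip for ip, count in failures.items()
            if count >= FAILED_LOGIN_THRESHOLD}

sample = [("203.0.113.9", "login_failed")] * 3 + \
         [("198.51.100.2", "login_ok")]
alerts = sources_to_alert(sample)
```

Real SIEM rules are richer (time windows, correlation across services), but the pattern is the same: continuous evaluation of normalized events against defined conditions.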

8. Extend patching and vulnerability assessments beyond core infrastructure
While cloud infrastructure is regularly patched and assessed, it is important that the overall posture of your cloud match your enterprise policies. Since your users will access this cloud, think of the cloud asset as another enterprise asset; it must be subject to similar patching, assessment and mitigation processes.
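Treating the cloud asset like any enterprise asset means running the same kind of vulnerability comparison against it: installed software versions checked against an advisory feed. The data below is entirely hypothetical, a sketch of the check rather than a real feed:

```python
# Hypothetical inventory of the cloud environment and advisory data.
installed = {"openssl": "1.0.1f", "nginx": "1.10.3"}
advisories = {"openssl": {"1.0.1f", "1.0.1g"}}  # versions flagged vulnerable

def vulnerable_packages(installed, advisories):
    """Return packages whose installed version appears in an advisory,
    i.e. the patching backlog for this cloud asset."""
    return sorted(pkg for pkg, ver in installed.items()
                  if ver in advisories.get(pkg, set()))
```

The output is the same patch/mitigate worklist your on-premises process would produce, which is the point: one assessment pipeline covering both environments.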

9. Check for compliance certifications
Consider top cloud providers that provide strong compliance and security controls and have a wide range of compliance certifications. Typical certifications to look for include PCI DSS Level 1, SOC 2, ISO 27001, HIPAA, FedRAMP, FISMA, DIACAP and NIST. As you integrate with this cloud infrastructure, be sure that any new components you add, such as a gateway, are also certified.

10. Look to partners and technology to supplement internal skills
It is easy to get overwhelmed by the sheer number of security measures offered by cloud providers. For example, Azure and AWS offer hundreds of security tools and features. Enterprise IT teams may struggle to configure many of these because the cloud operates on a software-driven paradigm that differs significantly from the hardware-oriented data centers they are accustomed to. Partners and managed services with cloud security expertise can configure and operate these controls while internal teams build up the new skills.

With its favorable cost model, scalability and agility, the cloud is clearly the future of data analytics. Salesforce, Workday and other enterprise SaaS apps prove that the cloud can be plenty secure for mission-critical applications. Now is the time for IT teams to tackle security of analytic environments – the next frontier in the cloud.


About the author: Prat Moghe is a successful big data entrepreneur with nearly 20 years of experience inventing next-generation products and building strong teams in the technology sector. As SVP of strategy, products, and marketing at Netezza, Prat led a 400-person team that launched the latest generation Netezza appliance which led the market in price and performance. Netezza was acquired by IBM for $1.7B in 2010.

Edited by Ken Briodagh

