7 Tips for Securely Moving Data to the Cloud
Sunday, June 18, 2017
A few years back, an unmistakable trend emerged: cloud computing was growing both in the percentage of organizations adopting cloud solutions and in the amount and type of data being placed in the cloud.
Earlier this year, I highlighted research that made it clear that trust and risks are both growing in government clouds. Since that time, many readers have asked for more specific guidance about moving more data to the cloud in the public and private sectors. I was asked: What are the right cloud questions?
Questions like: Where are we heading with our sensitive data? Will cloud computing continue to dominate the global landscape? These are key questions that surface on a regular basis.
The forecast for the computer industry is mostly cloudy. Here are some of the recent numbers:
- Cloud computing is projected to increase from $67B in 2015 to $162B in 2020, attaining a compound annual growth rate (CAGR) of 19 percent.
- Gartner predicts the worldwide public cloud services market will grow 18 percent in 2017 to $246.8B, up from $209.2B in 2016.
- 74 percent of tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017.
Back at the end of last year, The Motley Fool reported 10 Cloud Computing Stats That Will Blow You Away, and the last three listed are especially intriguing to me. Here they are:
- 71 percent of respondents use hybrid cloud platforms, compared to just 58 percent last year. Hybrid platforms keep more recent data onsite, while moving older data to the public cloud. That's an ideal setup for companies that aren't ready to move all their data offsite. Public cloud leaders like AWS integrate with third-party private clouds to become hybrid platforms, while private cloud leaders like VMware also integrate with other public clouds.
- The biggest challenge for the cloud market today is a lack of resources and expertise, according to RightScale. Thirty-two percent of respondents felt that their IT departments were poorly equipped to handle the growing workloads in the cloud, compared to 27 percent in 2015.
- The average company uses about 1,427 cloud-based services, according to Skyhigh Networks. Facebook is the most popular cloud-based social media service in the workplace, Office 365 is the top collaboration platform and Dropbox is the top file-sharing service. Those figures explain why Facebook is expanding into enterprise networking with Workplace, and why Microsoft is challenging Slack on the collaboration front with Skype Teams.
IoT, Other Trends and the Cloud
And while it is true that the Internet of Things (IoT) has taken over the mantle as the hottest trend in technology, the reality is that “The Internet of Things and digital transformation have driven the adoption of cloud computing technology in business organizations,” according to U.S.-based cloud infrastructure firm Nutanix.
This article from CxO Today lays out the case that the cloud remains the most disruptive force in the tech world today. Why?
“While premise-based IT software and tools have their own advantages, the global trend is for cloud based applications since they offer more connectivity and functionalities than legacy systems. Moreover, enterprises are naturally gravitating towards it as the technology is reasonably reliable, affordable, and provides them access to other new and emergent technologies as well as high end skills. The cloud boom is also propelled by the fact that enterprises are trying to improve performance and productivity over the long term. Looking at the tremendous response for cloud services, several IT companies are designing applications meant solely for pure cloud play.”
Other experts say that several overlapping trends are colliding as “The edge is eating the cloud.” These trends include:
- Cloud computing, centralizing IT for massive economies of scale and agile provisioning, volatility and growth
- The Internet of Things (IoT), where things are becoming connected and sending reams of data
- Machine learning, taking all of that data and improving processing and predictions
- Augmented and Mixed Reality (along with Virtual Reality), where people can interact with other people and things both in physical and virtual worlds
- Digital Business and the Digital World, where connections between things and people are pushing us to more and more real-time interactions and decisions.
Overcoming Fears in the Cloud
And yet, there are plenty of enterprises that continue to have significant concerns regarding cloud computing contracts. Kleiner Perkins’ Mary Meeker highlighted the fact that cloud buyers are kicking the tires of multiple vendors while becoming more concerned about vendor lock-in.
Also, technology leaders often move to the cloud to save money, but CFOs are now telling IT shops to cut costs in the cloud — fearing that resources are being wasted. For example:
- The public cloud IaaS market is $23 billion
- 12 percent of that IaaS market is Microsoft Azure, or $2.76 billion
- 44 percent of that is spent on nonproduction resources — about $1.21 billion
- Nonproduction resources are only needed for an average of 24 percent of the workweek, which means up to $900 million of this spend is completely wasted.
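The waste estimate above is simple arithmetic on the cited figures. A quick sketch (all amounts in billions of U.S. dollars, using the article's numbers):

```python
# Reproduce the cloud-waste arithmetic from the figures cited above.
iaas_market = 23.0      # total public cloud IaaS market, $B
azure_share = 0.12      # Azure's reported share of that market
nonprod_share = 0.44    # portion spent on nonproduction resources
needed_fraction = 0.24  # fraction of the workweek nonprod is actually needed

azure_iaas = iaas_market * azure_share          # ≈ $2.76B
nonprod_spend = azure_iaas * nonprod_share      # ≈ $1.21B
wasted = nonprod_spend * (1 - needed_fraction)  # ≈ $0.92B, i.e. "up to $900M"

print(f"Azure IaaS: ${azure_iaas:.2f}B")
print(f"Nonproduction spend: ${nonprod_spend:.2f}B")
print(f"Potential waste: ${wasted:.2f}B")
```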
Also, while overall trust in cloud infrastructure is higher, new concerns are rising about application security delivered through the cloud.
My 7 Tips for Moving Data into the Cloud
So what can technology and security leaders do to protect their data that is moving to the cloud?
Here are seven recommendations that can help you through the journey. Note that the first four items are largely best practices about your current data situation and options before your data moves.
1) Know your data. I mean, really know what is happening now — before you move the data. Think about the analogy of doing a house cleaning and organizing what you own before putting things in storage to sell your house.
If you don’t want to catalog everything (which is a mistake), at least know where the most important data is. Who is doing what regarding the cloud already? What data is sensitive? This is your “as is” data inventory situation with known protections of current data. And don’t forget “shadow IT.” There are plenty of vendor organizations that can help you through this process.
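To make the "as is" inventory concrete, here is a minimal sketch of a discovery pass: walk a directory tree and flag files whose contents match simple patterns for sensitive data. The patterns, file extensions and paths are hypothetical examples; real discovery tools use far richer classifiers.

```python
import re
from pathlib import Path

# Illustrative (hypothetical) patterns for sensitive data.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

def classify_file(text: str) -> list[str]:
    """Return the sensitive-data categories found in the text."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def inventory(root: Path) -> dict[str, list[str]]:
    """Map each text file under root to the sensitive categories it contains."""
    findings = {}
    for path in root.rglob("*.txt"):
        hits = classify_file(path.read_text(errors="ignore"))
        if hits:
            findings[str(path)] = hits
    return findings
```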
2) Have a defined and enforced data life cycle policy. You need to know what data is being collected by your business processes, where it goes, who is accountable (now) and what policies are in force.
Ask: Is there appropriate training happening now? Is it working? What policies are in place to govern the movement of your data? For example, my good friend and Delaware CSO Elayne Starkey does a great job in this policy area. You can visit this Web portal for examples: https://dti.delaware.gov/information/standards-policies.shtml
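One way to make a life cycle policy enforceable is to express it as data: each record class carries a retention period and an accountable owner, and a simple check reports which records are due for disposition. The class names, owners and retention periods below are illustrative assumptions, not a real policy.

```python
from datetime import date, timedelta

# Hypothetical retention policy: class -> accountable owner and retention.
RETENTION_POLICY = {
    "public_records": {"owner": "records-mgmt", "retain_days": 365 * 7},
    "hr_files":       {"owner": "human-resources", "retain_days": 365 * 3},
    "web_logs":       {"owner": "it-operations", "retain_days": 90},
}

def disposition_due(record_class: str, created: date, today: date) -> bool:
    """True when a record of this class has exceeded its retention period."""
    policy = RETENTION_POLICY[record_class]
    return today > created + timedelta(days=policy["retain_days"])
```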
3) Know your cloud options: Private, public, hybrid or community cloud? This simple step often gets confusing, in my experience, because some staff mix these terms up with the “public sector” and “private sector” definitions — wrongly thinking that a private cloud means private-sector-owned cloud.
Here are some basic cloud definitions to ponder with your architecture team:
Private Cloud: The organization has its own cloud, where the resource pooling is done by the organization itself (a single-organization cloud). It may or may not be on premises (in your own data centers).
Public Cloud: Multiple tenants share resource pooling on the same infrastructure.
— Pros: Easily consumable, and the consumer can provision resources on demand.
— Cons: The consumer will not get the same level of isolation as in a private cloud.
Community Cloud: A cloud shared by different organizations that are usually unified by a common community and share the underlying infrastructure (halfway between private and public); think of smaller organizations pooling resources with one another. For example, some state and local government organizations in the U.S. share email hosting with other state and local governments.
Hybrid Cloud: A mixture of both private and public. That is, an organization might want the elasticity and cost-effectiveness of the public cloud while keeping certain applications in a private cloud.
4) Understand and clearly articulate your Identity and Access Management (IAM) roles, responsibilities and demarcation points for your data. Who owns the data? Who are the custodians? Who has access? Who can add, delete or modify the data? Really (not just on paper)? How will this change with your cloud provider?
Build a system administration list, and insist on rigorous compliance certifications. Incorporate appropriate IAM from the outset, ideally based on roles, especially for administration duties. When you move to the cloud, the customers, not the provider, are responsible for defining who can do what within their cloud environments. Your compliance requirements will likely dictate what your future architecture in the cloud will look like. Note that these staff may need background checks, a process to update lists (for new employees and staff that leave) and segregation of duties as defined by your auditors.
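The "who can do what" questions above boil down to role-based access control. A minimal sketch, with hypothetical role, user and permission names (a real deployment would map these onto your cloud provider's IAM policies):

```python
# Hypothetical RBAC tables: role -> permissions, user -> role.
ROLE_PERMISSIONS = {
    "data_owner":     {"read", "add", "modify", "delete", "grant"},
    "data_custodian": {"read", "add", "modify"},
    "auditor":        {"read"},
}

USER_ROLES = {
    "alice": "data_owner",
    "bob": "data_custodian",
    "carol": "auditor",
}

def can(user: str, action: str) -> bool:
    """Check whether a user's role permits the requested action."""
    role = USER_ROLES.get(user)
    return role is not None and action in ROLE_PERMISSIONS[role]
```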
5) Apply encryption — thinking end to end — data at rest and data in transit. We could do an entirely separate blog on this encryption topic, since a recent (and scary) report says there is no encryption on 82 percent of public cloud databases. Here are a few points to consider. Who controls and has access to the encryption keys? What data is truly being encrypted and when? Only sensitive data? All data?
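The "who controls the keys" question is often answered with envelope encryption: a key-encryption key (KEK) that you control wraps a per-object data key (DEK) that encrypts the data itself. The sketch below illustrates only the key hierarchy; its SHA-256 keystream is a toy stand-in for a real algorithm such as AES-GCM and is NOT secure — do not use it for actual data.

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256 counter-mode keystream (toy cipher, NOT secure)."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

def encrypt_object(kek: bytes, plaintext: bytes) -> dict:
    """Encrypt with a fresh data key (DEK), then wrap the DEK under the KEK."""
    dek = secrets.token_bytes(32)
    return {
        "ciphertext": _keystream_xor(dek, plaintext),
        "wrapped_dek": _keystream_xor(kek, dek),  # only the KEK holder can unwrap
    }

def decrypt_object(kek: bytes, obj: dict) -> bytes:
    """Unwrap the DEK with the KEK, then decrypt the ciphertext."""
    dek = _keystream_xor(kek, obj["wrapped_dek"])
    return _keystream_xor(dek, obj["ciphertext"])
```

The point of the pattern: whoever holds the KEK controls access to every object, which is exactly the "who controls and has access to the encryption keys?" question.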
6) Test your controls. Once you move the data, your cloud solution vulnerability testing should be rigorous and ongoing and include penetration testing. Ask: How do you truly know your data is safe? What tools do you have to see your data in the cloud environment? How transparent is this ongoing process?
The cloud service provider should employ industry-leading vulnerability and incident response tools. For example, such tools enable fully automated security assessments that test for system weaknesses and dramatically shorten the time between critical security audits from yearly or quarterly to monthly, weekly or even daily.
You can decide how often a vulnerability assessment is required, varying from device to device and from network to network. Scans can be scheduled or performed on demand.
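Per-device scheduling can be as simple as an interval per device class plus a check for overdue scans. A sketch, with hypothetical device classes and intervals:

```python
from datetime import date, timedelta

# Hypothetical scan intervals per device class.
SCAN_INTERVALS = {
    "internet_facing": timedelta(days=1),   # daily
    "internal_server": timedelta(days=7),   # weekly
    "workstation":     timedelta(days=30),  # monthly
}

def overdue_devices(last_scanned: dict, today: date) -> list[str]:
    """Return devices whose last scan is older than their class interval."""
    late = []
    for device, (device_class, scanned) in last_scanned.items():
        if today - scanned > SCAN_INTERVALS[device_class]:
            late.append(device)
    return late
```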
7) Back up all data in a distinct fault domain.
Gartner recommends: “To spread risk most effectively, back up all data in a fault domain distinct from where it resides in production. Some cloud providers offer backup capabilities as an extra cost option, but it isn’t a substitute for proper backups. Customers, not cloud providers, are responsible for determining appropriate replication strategies, as well as maintaining backups.”
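Since the customer owns the replication strategy, it is worth auditing placements continuously. A minimal sketch of that check, treating a cloud region as the fault domain (dataset names and region names are hypothetical):

```python
def misplaced_backups(placements: dict) -> list[str]:
    """Return datasets whose backup shares a fault domain with production."""
    return [
        name
        for name, p in placements.items()
        if p["backup_region"] == p["prod_region"]
    ]
```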
No doubt, managing your data in the cloud is a complex and ongoing challenge that includes many other pieces beyond these seven items. From contract provisions to measuring the costs incurred for services to overall administration functions, the essential data duties listed here are not for technology professionals or contract pros who lack real experience.
Nevertheless, all organizations that move data into and out of cloud providers’ data centers are constantly going through this data analysis process. Just because you moved sensitive data into the cloud five years ago for one business area does not mean that new business areas can skip these steps.
If you are in a large enterprise, you may want to consider adding a cloud computing project management office (PMO) to manage vendor engagement and ensure the implementation of best practices across all business areas.
And don’t just fall for the typical line: “I know ‘xyz’ company (Amazon or Microsoft or Google or fill-in-the-blank) is better at overall security than we are — so just stop asking questions.” Yes — these companies are good at what they do, but there are always trade-offs.
You must trust but verify your cloud service — because you own the data. Remember, you can outsource the function, but not the responsibility.
(BY DAN LOHRMANN)