Salesforce helped pioneer the concept of putting confidential organizational data in a "public cloud" system. Other key vendors offering public cloud data services include Microsoft Azure and Amazon S3. "Cloud" has come to mean many things to many people when it comes to moving internal office functions onto the Internet. The word "public" is important to understand: it means that all data, from every corporate and nonprofit customer, sits in one enormous database. That is in fact a reassurance, not a drawback. In a public cloud web service, the vendor focuses its security attention on one system, top to bottom.
Public Clouds and the Apartment Building Analogy
The data system in a public cloud is "multi-tenant"--tenant as in a big apartment building. Even more than a condo development, an apartment building provides uniform infrastructure shared across all residents. Tenants in a well-run building can still personalize a unit, yet off-load nearly all concerns about maintenance and infrastructure. Salesforce, for example, manages its 100,000 customer accounts on something like 20,000 servers, spread across multiple data centers worldwide. A typical hosted website may be secured by the Internet Service Provider, yet it resides in one physical place at a time. In a public cloud system, your data is never just on one server in one data center. There is immense redundancy and optimization, so you never need to think about where things are stored or where the next page view is coming from. Likewise, you don't have to think about doing a back-up. It just happens.
While it still strikes many as risky to use a public cloud service for storing confidential donor or client lists, we all trust public cloud environments daily. When you use Gmail, Google Docs, Facebook, Flickr, or Twitter, you trust your data to a public cloud. In each of these, your data, personal or organizational, sits with everyone else's in one place. For that matter, specialized email manager services like Constant Contact, on-line donation systems, and Basecamp are also public clouds of sorts. Though you only see your own projects, Basecamp or Central Desktop project information sits in the same database as that of thousands of other customers.
In a cloud environment, everyone logs into one place. Once you log in, you are directed into a compartmentalized set of functions and data and can never see anyone else's. Privacy rests on the user name (the email address used to log in) and password, and it is up to the vendor to make sure other people's data doesn't leak through to your account. What vendors gain, again, is the ability to manage all security and performance issues in one place.
As with modern network managers, cloud systems typically let an organizational systems administrator further govern the user list: you can restrict usage by location (IP address), time of day, and other rules. Cloud environments should make these rules--as well as the login history of each user--visible to the system administrator(s) of the account (including you) at all times. The systems also encourage robust passwords and periodic password resets.
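The kind of login rule described above can be sketched in a few lines. This is a conceptual illustration only, not any vendor's actual API; the function name, IP ranges, and business-hours window are all my own assumptions.

```python
import ipaddress
from datetime import time

# Illustrative trusted networks an administrator might configure.
TRUSTED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # office network
    ipaddress.ip_network("198.51.100.0/28"),  # VPN endpoint
]

def login_allowed(source_ip: str, login_time: time) -> bool:
    """Allow a login only from a trusted IP range during business hours."""
    addr = ipaddress.ip_address(source_ip)
    in_trusted_range = any(addr in net for net in TRUSTED_RANGES)
    in_business_hours = time(7, 0) <= login_time <= time(19, 0)
    return in_trusted_range and in_business_hours

print(login_allowed("203.0.113.42", time(9, 30)))  # True: office IP, work hours
print(login_allowed("192.0.2.7", time(9, 30)))     # False: unknown network
```

The point is not the specific code but that these checks run centrally, at the vendor's single front door, rather than on each organization's own server.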
When someone leaves the company or organization, the administrator should be able to simply deactivate their user name, which immediately locks them out. Deactivated users in some systems can remain in your account without counting against licensing, which means that records of work and activity remain even after the user is locked out of the system. A bulk transfer of record ownership from a departed user to a new one is a handy feature to use once the user is deactivated.
Add-on tools to cloud services also need to be secure. Sometimes people ask me whether they can "see" their cloud data directly using ODBC, as they might have done with MySQL- or SQL Server-based traditional websites. To hold things together securely, public cloud services typically require that add-ons for email, event registration, and your own custom pages use an additional layer of security beyond user name and password. Salesforce, for example, adds a long, apparently random "security token" to encrypted data transfers by add-on tools or pages.
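A minimal sketch of that pattern, assuming the common Salesforce convention that an API call coming from outside a trusted IP range appends the security token directly to the password. The function name and sample values here are mine, not an official API:

```python
def build_api_credential(password: str, security_token: str) -> str:
    """Combine a password with a vendor-issued security token for API login.

    The combined string is only ever sent over an encrypted (HTTPS)
    channel; the token itself is long, random, and resettable by the
    administrator without changing the password.
    """
    return password + security_token

# Hypothetical values for illustration only.
credential = build_api_credential("s3cret!", "AbCdEfGhIjKlMnOpQrStUvWx")
```

Because the vendor can revoke and reissue the token independently of the password, a leaked add-on configuration can be cut off without forcing every user to change credentials.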
How much security responsibility do you want to have in the era of WikiLeaks?
Another element of cloud-system security has to do with browser standardization. When you create or update your public website, you can still opt to have the developer support older, insecure browsers like Internet Explorer 6. The developer won't be happy, but it can be done. Cloud systems typically set standards for secure browser connections, which may frustrate users on older computers that haven't been updated, yet it adds protection and ensures encryption of the organizational data.
Cloud environments also enable centralized patching. Every week there are new threats to computer systems. These threats could potentially affect cloud systems, but the difference is that it is the cloud vendor's job to make the patches, and when they do, they do it in one place for all customers. Everywhere else, you have to worry about when patches will be applied, who will do it, and whether it will affect your customizations. Cloud vendors like Salesforce focus on a few pre-announced major upgrade cycles annually, rolled out in organized fashion, while making security patches regularly.
A different security issue concerns how much you can trust a cloud vendor to maintain the confidentiality and privacy of the data you entrust to it. This is a consideration with any software any of us uses these days. We live in an era of WikiLeaks, apparently semi-official Chinese theft of Google and Adobe data, Israeli attacks on Iranian nuclear power system networks, and newly intrusive data mining by US authorities. Yes, you do have to pick whom you can trust. An organization that counsels undocumented immigrants, pregnant teens, or anyone else needing confidentiality must consider what its network provider, hosting company, or cloud service would do in the event of a government warrant or a determined attack by a political or organizational opponent. In a cloud environment, you have to select vendors you can trust based on their size, their history with privacy incidents, and their leadership and board commitments. Given that we now expect our desktop and network server software to "check for security updates" regularly, to be fair, we need to make the same determination for all software, not just cloud systems.
In general, using a public cloud environment is like using the electrical or water utility: each month, you use as much or as little as you need, and you pay as you go. Technology managers who speak in terms of "Enterprise"-class services also, among other things, seek environments where software applications are always available and the infrastructure can handle any size load reliably. The various forms of cloud technology make this possible, and the birth of public cloud providers makes it economically feasible for small organizations.
As I mentioned, "cloud" can mean different things. These days, everyone wants to say they have a "cloud" offering. We also use, and have created, web applications using Drupal or .NET that might be called "private cloud" systems. A private cloud does give an organization more unilateral control over its data, yet it also leaves more responsibility in the hands of the developer or support provider. You may use a shared web infrastructure that off-loads some responsibilities, while other issues of integrity, up-time, and some of the security features still depend on the developer.
Web sites may be hosted on a server that houses dozens or more separate sites. They share some things on the physical server and its surrounding network at your Internet Service Provider, but they use virtual environments to confine your view to what looks like an entire isolated web server. If you do something bad, your site will go down but no one else's. While that sounds good, it also means you have more responsibility to make sure your site doesn't do anything "bad." It is a mixed blessing, and as software gets more complex, a growing burden.
What about traditional networked databases and software?
A private cloud is more reliable than an internal networked database. A networked database (e.g., Raiser's Edge or Sage) running on a Windows server leaves you with complete responsibility for all aspects of the software's integrity, security, back-up, updates, and so on. Larger organizations feel comfortable doing this, and they pay for it, whether to their network administration team, a software vendor, or a consultant. Smaller organizations cut corners and may be vulnerable. Yes, you can isolate things by virtue of your network firewall and server security software. One might debate in 2011 whether this is more secure than cloud environments. On the whole, I no longer think it is. Network administrators and server software vendors are fighting at best a stalemate to keep malware outside the door, and malware on a single, inadequately protected computer on an organizational network can undermine the entire network.
As for the reliability and responsiveness of the database--a different kind of security: in a public cloud environment, everyone uses the same underlying data store, so a small nonprofit with 10-20 users and records in the tens of thousands gets the same treatment as a giant entity with hundreds of users and millions of records. In an environment you maintain yourself, you have to be concerned with whether the set-up of the database (whether Access or Microsoft SQL Server) can support the amount of data and number of users. Anticipating growth and scalability in a private or networked system is a big headache.
Organizations that host their own software applications internally face a basic conundrum. The cost of network hardware has dropped considerably, and standard networking software is either free (Linux) or, for nonprofits, subsidized (Microsoft via TechSoup). Yet those reduced base costs mask the continuing high cost of maintaining a server year by year.
The conundrum comes from bearing the entire cost of maintaining that server while most likely hardly ever using its entire capacity. Until a few years ago, organizations found themselves adding multiple servers to separate file and print services from email management, application and database servers, and so on. Virtualization technology, a fundamental building block of cloud computing, has allowed the consolidation of servers, providing noticeable economic savings as well as environmental benefits. The simplest analogy is switching from incandescent bulbs to compact fluorescents or LEDs. This partly mitigates the responsibilities of maintaining local servers, but only partly.
Broadening Safety in 2011
Along with the physical security of a local server come issues of data security, which is multi-layered and multifaceted. Here in Massachusetts, the state has enacted laws to protect personally identifiable information. This changes the security model considerably: organizations, including nonprofits, now need to take proactive responsibility for protecting against inappropriate internal data access. Sensitive data now needs to be encrypted in the database, with users granted specific permission to access it. If your organization must provide a security compliance audit, some say that public clouds, with their proprietary technology, may not pass scrutiny because they don't reveal their internal workings. At least for now, this is a major rationale in the corporate world for "private cloud" environments, where you get some of the benefits of hosted cloud infrastructure yet take responsibility for everything you put up there. On the other hand, organizations with their own local servers may lag behind in documenting and keeping up to date a security compliance plan for their network and server infrastructure.
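Encrypting a sensitive field before it ever reaches the database can be sketched briefly. This is a minimal illustration using the third-party "cryptography" package (an assumption, not something named in this article); in a real system the key would live in a key-management service, never in source code.

```python
from cryptography.fernet import Fernet

# Assumption for illustration: in practice, fetch the key from a secure
# store rather than generating it inline.
key = Fernet.generate_key()
cipher = Fernet(key)

ssn_plaintext = b"123-45-6789"
ssn_stored = cipher.encrypt(ssn_plaintext)   # this is what the database holds

# Only code paths serving users with explicit permission call decrypt():
ssn_revealed = cipher.decrypt(ssn_stored)
```

The permission check itself lives in the application layer; the encryption simply ensures that anyone who gets at the raw database, internally or externally, sees only ciphertext.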
Below private clouds and networked systems lies the security of individual computers and, now, mobile devices. Here the security picture is even bleaker, with inadequate protection and frequent vulnerabilities. When someone says they will install their accounting system or other critical data on just one computer and keep it off the network to be safe, you can ask: how much effort are they putting into back-up, anti-virus, hardware maintenance, and all the rest for that one computer? How closely does a systems administrator at that organization monitor the complete set-up of that confidential computer?
For a sense of the emerging security issues we will likely face in 2011, check out this article from the great technical security resource, http://darkreading.com.
From my point of view, 2011 looks to be the year that, no matter your application platform, there is an affordable, secure cloud solution for you. Whether you start with simple server virtualization or move to a more complex cloud solution, it's time to do something about that server in the closet. From a security point of view, the old internal server looks less like a private fortress than a single point of failure; some would call it a "stress box." The burdens of maintaining private systems have become so great that developers and software publishers feel increasing pressure to move in this direction in order to survive and give their customers good service. Infrastructure specialization is a good thing: it enables organizations to focus on their core mission, reduce operating costs, handle unexpected contingencies, and contribute to a greener world. My sense is that over the next several years, more and more corporate systems will move to the cloud, and to the public cloud, and in their wake, so will nonprofit data.