Best practices for securing an AWS environment have been well-documented and generally accepted, such as AWS’s own guidance. However, organizations may still find it challenging to begin applying this guidance to their specific environments. In this blog series, we’ll analyze anonymized data from Netskope customers that includes security settings of 650,000 entities from 1,143 AWS accounts across several hundred organizations.
It’s time to update the list of security incidents caused by misconfigured cloud storage resources, since the last couple of weeks have unfortunately been quite prolific. The shared responsibility model continues to be overlooked, or simply misunderstood, by too many organizations, and as a consequence vast amounts of sensitive data leak from the cloud on a daily basis, putting thousands of individuals (and dozens of municipalities) at risk of fraud, identity theft, and phishing campaigns.
In the last several years, companies have accelerated their cloud adoption, investing time and resources to lift and shift their content, development, and applications to public and private clouds. The onset of the global health crisis pushed even the more traditional brick-and-mortar companies to invest in cloud technologies. Yet we still see customers hosting content in on-premises repositories in spite of inexpensive per-GB cloud storage. Why is that?
Misconfigurations remain one of the most common risks in the technology world. Simply telling organizations to “fix” this problem, however, is not as easy as it might first seem, because a myriad of technologies are at play in modern infrastructure deployments. All of this results in a complicated mix of hardening approaches for each system. What is key, then, is to identify where hardening is required and then consider the methodology for each area.
Cloud storage has become mainstream. It is one of the fastest-growing segments of IT spending and an indispensable tool for many modern businesses. However, not enough is being done to secure data residing in the cloud. According to Gartner, through 2025, 90% of organizations that fail to control public cloud use will share information inadvertently or inappropriately, and almost all cloud security failures will be due to the cloud customer, not the service provider.
Enterprise data is growing exponentially and becoming more complex, making it harder to manage and an even bigger challenge to store. That’s why more organizations are looking to public cloud storage options, such as Azure Storage, as a way to get the scalability, durability, maintenance, and accessibility they need to stay competitive, benefits often difficult to get from on-premises data centers.
Amazon S3, one of the leading cloud storage solutions, is used by companies all over the world to power their IT operations. Over the past four years, UpGuard has detected thousands of S3-related data breaches caused by the incorrect configuration of S3 security settings. Jeff Barr, Chief Evangelist for Amazon Web Services, recently announced public access settings for S3 buckets, a new feature designed to help AWS customers stop the epidemic of data breaches caused by incorrect S3 security settings.
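As a minimal sketch of what enabling this feature looks like in practice: S3 Block Public Access exposes four boolean settings (BlockPublicAcls, IgnorePublicAcls, BlockPublicPolicy, RestrictPublicBuckets). The helper function and bucket name below are illustrative, not part of any AWS SDK; the commented-out boto3 call shows how the payload would be applied.

```python
def block_public_access_config(enabled: bool = True) -> dict:
    """Build the PublicAccessBlockConfiguration payload for S3 Block Public Access."""
    return {
        "BlockPublicAcls": enabled,        # reject requests that set public ACLs
        "IgnorePublicAcls": enabled,       # ignore any public ACLs already present
        "BlockPublicPolicy": enabled,      # reject bucket policies that grant public access
        "RestrictPublicBuckets": enabled,  # restrict access to buckets with public policies
    }

# With boto3 and valid credentials, applying it to a (hypothetical) bucket
# would look like:
#
#   import boto3
#   boto3.client("s3").put_public_access_block(
#       Bucket="example-bucket",
#       PublicAccessBlockConfiguration=block_public_access_config(),
#   )
```

Setting all four flags to true at the account level is the most conservative posture; individual buckets can then be opened up deliberately rather than by accident.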
During the course of UpGuard’s cyber risk research, we uncover many assets that are publicly readable: cloud storage, file synchronization services, code repositories, and more. Most data exposures occur because of publicly readable assets, where sensitive and confidential data is leaked to the internet at large by way of a permissions misconfiguration.
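To make the “permissions misconfiguration” failure mode concrete, here is a small sketch of how one might flag the classic S3 case: an ACL grant to the AllUsers or AuthenticatedUsers group. The function name is illustrative; the input shape mirrors the Grants list returned by S3’s GetBucketAcl API, and the two group URIs are the well-known AWS identifiers for “everyone on the internet” and “any AWS account”.

```python
# Well-known S3 grantee group URIs that make an object or bucket public.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list:
    """Return the ACL grants that expose a bucket to the public.

    `acl` is expected to look like a GetBucketAcl response:
    {"Grants": [{"Grantee": {...}, "Permission": "READ"}, ...]}
    """
    return [
        grant
        for grant in acl.get("Grants", [])
        if grant.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
        and grant.get("Permission") in {"READ", "READ_ACP", "WRITE", "FULL_CONTROL"}
    ]
```

Run against every bucket in an account, a check like this surfaces exactly the publicly readable assets described above before a researcher, or an attacker, finds them first.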
Despite spending billions on cybersecurity solutions, private industry and government enterprises alike face the continued challenge of preventing data breaches. The reason cybersecurity solutions have not mitigated this problem is that the overwhelming majority of data exposure incidents are due to misconfigurations, typically by way of third-party vendors, not cutting-edge cyber attacks.