Cloud computing has been one of the most significant shifts in information technology in recent years. Having a near-infinite pool of compute and storage resources available in an instant, and at a controllable cost, brings huge benefits, so it’s no surprise that companies of all sizes now take advantage of cloud services.
This means that data is increasingly stored outside the walls of organisations rather than on their own hardware. Giving up some level of control over data and processes by relying on cloud providers introduces additional risks that need to be considered. Such risks include, for example, increased complexity of IT infrastructure, opaque duplication and replication of data, and a larger attack surface for unauthorised access.
But working fully within a cloud environment does not only add cybersecurity risks – it also brings some clear advantages. Datatonic’s information security management system (ISMS) is designed to leverage the benefits of the cloud as much as possible while controlling the risks of using a public cloud provider. From day one, Datatonic has made information security a point of pride and one of its top priorities. We have a specialised team dedicated to securing our own and our customers’ services, run continuous security testing, and build solutions on the best practices of secure engineering on Google Cloud Platform (GCP).
This blog covers some of the advantages that help us stay secure and effective.
Using the cloud means less administration and management is needed: physical infrastructure is limited, and all assets are clearly defined, labelled and registered. The risk of unknown data stores or access points is therefore greatly reduced, which makes maintaining our ISMS much easier. We build on the monitoring and alerting infrastructure that GCP and Google Workspace provide to monitor all our systems continuously. To find out more about migrating to the cloud, have a look at our whitepaper.
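For illustration, this kind of alerting can be defined as code. The following is a minimal Terraform sketch, not our actual configuration; the log-based metric, threshold and names are hypothetical assumptions:

```hcl
# Hypothetical alert: fire when a log-based metric (assumed to be
# defined elsewhere) shows an elevated rate of denied access attempts.
resource "google_monitoring_alert_policy" "denied_access" {
  display_name = "Elevated rate of denied access attempts"
  combiner     = "OR"

  conditions {
    display_name = "Audit log: permission denied"
    condition_threshold {
      filter          = "metric.type=\"logging.googleapis.com/user/permission_denied\" resource.type=\"global\""
      comparison      = "COMPARISON_GT"
      threshold_value = 10
      duration        = "300s"

      aggregations {
        alignment_period   = "60s"
        per_series_aligner = "ALIGN_RATE"
      }
    }
  }
}
```

Because the policy lives in version control alongside the rest of the infrastructure, changes to alerting thresholds are reviewed and auditable like any other code change.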
With our cloud infrastructure in place, access levels are easily defined and updated. The same applies to network infrastructure and connectivity, making it easy to enforce the principle of least privilege organisation-wide.
As we operate entirely within GCP, for our own IT services as well as for customer projects, we can apply the same methods internally and for our clients.
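As a sketch of what least privilege can look like in practice, the hypothetical Terraform fragment below (the project ID, group and role are assumptions, not a real setup) grants a narrow, read-only role to one group on a single project, instead of a broad primitive role:

```hcl
# Grant a narrowly scoped, read-only role to a single group on one
# project only, rather than a broad primitive role such as roles/editor.
resource "google_project_iam_member" "analyst_bq_read" {
  project = "example-analytics-project"   # hypothetical project ID
  role    = "roles/bigquery.dataViewer"   # read-only access to datasets
  member  = "group:data-analysts@example.com"
}
```

Managing bindings per group rather than per user keeps access reviews tractable: joining or leaving a team changes group membership, not the IAM policy itself.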
Find out how you can manage access rights down to a user level in Looker here.
Google Cloud has built a thorough suite of security products to strengthen its users’ defences. Using these existing tools makes it easy for us to achieve the level of security we aim for. Several key tools are worth highlighting, and we will cover these in more detail in our next blog.
Lastly, storing data in the cloud massively reduces the risk of data loss. Data stored in the data centres of public cloud providers is routinely backed up and replicated, ensuring fast access to data at all times and keeping everyone’s files safe and up to date. Rollbacks and restoration of snapshots are also commonly available in managed cloud data stores. These features make disaster recovery simple while keeping the configuration overhead to a minimum. Through granular and transparent access levels (see point 1), it is also comparatively easy to control which users can access data, minimising the risk of data breaches and information theft.
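To make the backup side concrete, object versioning on a storage bucket can be switched on declaratively. This is a hypothetical Terraform sketch (bucket name, location and retention are assumptions) showing how overwritten or deleted objects remain restorable:

```hcl
resource "google_storage_bucket" "project_data" {
  name     = "example-project-data"   # hypothetical bucket name
  location = "EU"

  # Versioning keeps prior object versions, so overwritten or
  # deleted files can be restored.
  versioning {
    enabled = true
  }

  # Prune noncurrent versions once newer copies exist, to cap cost.
  lifecycle_rule {
    condition {
      num_newer_versions = 3
      with_state         = "ARCHIVED"
    }
    action {
      type = "Delete"
    }
  }
}
```

The lifecycle rule illustrates the trade-off: versioning protects against accidental loss, while automatic pruning keeps storage costs from growing without bound.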
When working with our clients’ data, security plays a central role and is addressed long before a project kicks off. Simply put, we want our clients’ data to be as safe as possible. Building effectively on top of Google Cloud’s safety protocols and maintaining our ISO 27001 certification are two key aspects of our IT security practices. Meeting these security standards is particularly important for our clients in banking, healthcare and e-commerce, where we process and handle large volumes of personally identifiable information on their behalf.
Curious about the state of your data security? Request a health check from our team below.