We, as partners, helped modernize their data management solution. They wanted to migrate from AWS Redshift to BigQuery for better scale, and we helped them design the solution.

The challenge

As the client's customer base grew, so did the volume of data they needed to handle. They were looking for a solution that would greatly reduce the overhead of managing data and of managing customer access to it. Since each customer accessed their data independently, it was critical to manage that access securely. They also wanted to scale quickly, which would be possible only with a solution requiring minimal to no configuration.
The solution

The obvious choice for the data warehouse was BigQuery. BigQuery is a data warehouse as a service, which means there is no infrastructure to configure or monitor. It can handle large amounts of data and return query results in seconds. From a security point of view, each customer needed access only to their own data, and the safest way to achieve this was to give each customer their own datasets in a dedicated BigQuery project. One project was created to manage the admin processes, and a separate project was created for each customer. Creation of projects and BigQuery datasets, as well as imports into BigQuery, was fully automated with Terraform. This made onboarding new customers very simple while keeping each customer's data completely isolated.
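The per-customer provisioning described above could be sketched in Terraform roughly as follows. This is a minimal illustration, not the client's actual configuration: the customer IDs, variable names, dataset ID, and the group naming convention are all assumptions made for the example.

```hcl
variable "org_id" {
  description = "GCP organization ID (assumed to exist)"
  type        = string
}

variable "customers" {
  type    = list(string)
  default = ["acme", "globex"] # placeholder customer identifiers
}

# One GCP project per customer.
resource "google_project" "customer" {
  for_each   = toset(var.customers)
  name       = "customer-${each.key}"
  project_id = "customer-${each.key}" # must be globally unique in practice
  org_id     = var.org_id
}

# One BigQuery dataset inside each customer's project.
resource "google_bigquery_dataset" "customer_data" {
  for_each   = toset(var.customers)
  project    = google_project.customer[each.key].project_id
  dataset_id = "customer_data"
  location   = "US"
}

# Grant each customer's group read access to its own dataset only,
# so no customer can see another customer's data.
resource "google_bigquery_dataset_iam_member" "reader" {
  for_each   = toset(var.customers)
  project    = google_project.customer[each.key].project_id
  dataset_id = google_bigquery_dataset.customer_data[each.key].dataset_id
  role       = "roles/bigquery.dataViewer"
  member     = "group:${each.key}-analysts@example.com" # hypothetical group
}
```

With this shape, onboarding a new customer reduces to adding one entry to the `customers` list and running `terraform apply`.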
The results

The result was an easy-to-manage, secure and scalable solution. It delivered cost savings both at the infrastructure level and through operational efficiency: automating asset creation and using a no-management service such as BigQuery makes management and maintenance very efficient. The main areas of benefit were:
- Cost saving – BigQuery charges are based on data storage and the queries run, so costs track actual usage rather than provisioned infrastructure.
- Operational Efficiency – achieved by automation of asset creation.
- Scalability – achieved by using BigQuery, a fully managed data warehouse that scales automatically with demand.