GCP Data Lake Migration for Enterprise Data Capture Leader

Our client creates enterprise-level data capture and automatic identification solutions that provide businesses with operational visibility for the retail/e-commerce, manufacturing, transportation and logistics, and healthcare industries. They are helping customers empower their mobile workforce with tools optimized to easily capture critical information, as well as make data accessible to the people that make the business run.

Business Challenge

Our client had workflows running on AWS and wanted to migrate them to GCP, with cost savings as the ultimate long-term goal.

Specific business goals for our client included:

  • Migrate from Cloudera and AWS to Google Cloud Platform
  • Review and document existing workflows
  • Create a data lake for data processing and analytics based on the POC delivered by SoftServe

Project Description

SoftServe began with a discovery phase for further research. Initially, our client expected to move all workflows running on AWS to GCP quickly. However, the investigation showed that such a migration required significant work and could not be completed in the desired timeframe, so the client decided instead to build a data lake on GCP from scratch.

After further investigation to advise on different approaches and the best technology stack for production, the SoftServe team built a template for migrating future data from AWS to GCP, bringing quick wins for the remaining applications. The team also built key data lake components and ingested the first data sets. Specific technologies used include Firestore and Apigee.

Value Delivered

Leveraging the data lake and template delivered by SoftServe, our client will save costs by moving from AWS to GCP and will be able to integrate their applications quickly to gain key insights. SoftServe continues to advise our client on automating daily data syncs between AWS and GCP for defined business processes.
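As an illustration of what a scheduled AWS-to-GCP sync can look like, the sketch below builds a job body for Google's Storage Transfer Service, which can copy an S3 bucket into a Cloud Storage bucket on a recurring schedule. This is a generic example, not the client's actual configuration; all project and bucket names are placeholders.

```python
# Sketch: a Storage Transfer Service job body for a daily S3 -> GCS sync.
# The dict follows the transferJobs.create REST API shape; submitting it
# would require the Storage Transfer Service client and cloud credentials.
# All names here are placeholders, not the client's real resources.

def daily_sync_job(project_id: str, s3_bucket: str, gcs_bucket: str) -> dict:
    """Return a transfer-job body that copies an S3 bucket to a GCS bucket
    once every 24 hours."""
    return {
        "projectId": project_id,
        "status": "ENABLED",
        "schedule": {
            "scheduleStartDate": {"year": 2024, "month": 1, "day": 1},
            "repeatInterval": "86400s",  # repeat every 24 hours
        },
        "transferSpec": {
            "awsS3DataSource": {"bucketName": s3_bucket},
            "gcsDataSink": {"bucketName": gcs_bucket},
        },
    }

job = daily_sync_job("example-project", "example-aws-bucket", "example-gcs-bucket")
print(job["schedule"]["repeatInterval"])
```

A job like this runs server-side in GCP, so no intermediate compute is needed for the copy; only the schedule and the source/destination buckets have to be defined.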
