AWS Database Migration Service
Just the Headlines
Short on time? Here are the key facts.
- The client reached out to Aptitive in the middle of a cloud migration crisis that was beginning to hurt their business
- Aptitive designed a customized data migration solution that efficiently moved a large volume of the client’s data to the cloud
- Additional systems were built to ensure the client’s front-end web applications interoperated with the new cloud databases
A global provider of digital curriculum solutions sought to migrate its on-premises databases to the cloud. The process included ingesting large delimited files from the client’s schools to support their front-end web applications.
Unfortunately, the ingestion system was severely bottlenecked, often taking 24 hours or more to validate and load a single file. The result was a platform that chronically lacked the latest information and delivered a poor experience to educators and students alike.
The Aptitive team determined that a serverless architecture built on Amazon Web Services would provide the client with the best possible foundation for a reliable and scalable result. Our migration solution included the following elements:
- Configuring the system to upload the client’s files to an Amazon Simple Storage Service (S3) bucket, which triggered a series of AWS Lambda functions that performed file-level and row-level data validation.
- Automatically loading validated data into a newly created Amazon Aurora database.
- Replicating data to an on-premises SQL Server instance using AWS Database Migration Service (DMS) to support the front-end application.
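The validation stage above can be sketched as a Lambda handler fired by the S3 upload event. This is a minimal illustration, not Aptitive’s actual code: the column names, the pipe delimiter, and the specific checks are hypothetical stand-ins for the client’s real file schema.

```python
import csv
import io

# Hypothetical schema -- the real files would define their own required columns.
REQUIRED_COLUMNS = {"student_id", "school_id", "score"}

def validate_file(header):
    """File-level check: every required column must be present in the header."""
    missing = REQUIRED_COLUMNS - set(header)
    return ["missing column: %s" % c for c in sorted(missing)]

def validate_row(row):
    """Row-level checks: non-empty student id, numeric score."""
    errors = []
    if not row.get("student_id"):
        errors.append("empty student_id")
    if not row.get("score", "").lstrip("-").isdigit():
        errors.append("non-numeric score: %r" % row.get("score"))
    return errors

def handler(event, context):
    """Lambda entry point: fetch the delimited file S3 just received and validate it."""
    import boto3  # deferred import so the validation logic is testable offline
    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    body = s3.get_object(
        Bucket=record["bucket"]["name"], Key=record["object"]["key"]
    )["Body"].read()
    reader = csv.DictReader(io.StringIO(body.decode("utf-8")), delimiter="|")
    file_errors = validate_file(reader.fieldnames or [])
    if file_errors:
        return {"status": "rejected", "errors": file_errors}
    # Collect (line number, errors) for every bad row; line 1 is the header.
    bad_rows = []
    for line_no, row in enumerate(reader, start=2):
        errors = validate_row(row)
        if errors:
            bad_rows.append((line_no, errors))
    return {"status": "rejected" if bad_rows else "accepted", "row_errors": bad_rows}
```

In this arrangement a file that fails validation never reaches Aurora, and because each upload triggers its own Lambda invocation, many files can be validated in parallel.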
Beyond the primary hurdle of poor ingestion efficiency, another major challenge emerged: the high volume of data caused DMS to frequently lock up the Aurora database. To solve this, the Aptitive team built a DMS health probe that regularly checks for these locks and restarts the replication task when necessary.
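A probe of this kind might look like the sketch below. It is an assumption-laden illustration, not the team’s implementation: the real probe inspected Aurora lock status, while this version approximates that by checking the replication task’s state via the DMS API (`describe_replication_tasks` and `start_replication_task` are genuine boto3 calls; the restart policy itself is hypothetical).

```python
# Task states we treat as stalled; a production probe would also inspect
# the task's StopReason and the database's lock tables before restarting.
STALLED_STATUSES = {"failed", "stopped"}

def needs_restart(status):
    """Pure decision logic: should this replication task be resumed?"""
    return status.lower() in STALLED_STATUSES

def probe(task_arn):
    """Check one DMS replication task and resume it if it has stalled."""
    import boto3  # deferred import: the decision logic stays testable offline
    dms = boto3.client("dms")
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-arn", "Values": [task_arn]}]
    )["ReplicationTasks"][0]
    if needs_restart(task["Status"]):
        dms.start_replication_task(
            ReplicationTaskArn=task_arn,
            StartReplicationTaskType="resume-processing",
        )
        return True
    return False
```

Running `probe` on a schedule (for example, from a CloudWatch Events rule) gives the self-healing behavior described above without manual intervention.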
By leveraging best-in-class data management tools, the Aptitive team helped the client process files in parallel, reducing the ingestion time for new data from hours to mere minutes. This eased the load on applications across the organization, allowing users to access their information without interruption.