Logistics Leader Turns System Logging Data into Revenue Stream with Snowflake and AWS
Digital transformation drives innovation and creates new opportunities. A logistics industry client decided to leverage its ecosystem to monetize its data and fine-tune its operations.
The client was already using AWS, DynamoDB, and MuleSoft. What they needed was a way to ingest and store business data, such as shipments, loads, invoices, acknowledgments, and tenders, and share that information both internally and with clients. Doing so would promote better, data-backed business decisions for the company and create a new product that could be sold to customers.
The first order of business was to create a place to store the data that made it accessible without significant transformation and usable alongside the existing systems. The decision was made to build a data lake to accommodate both structured and unstructured data.
For this to be a product, the data needed to be shareable with clients, so the team chose Snowflake for the project. Snowflake was already known to work well with AWS and would slot nicely into the existing architecture.
Creating a logging framework for the data handling and processing drew on Big Compass's previous experience developing similar frameworks for other clients, which allowed us to optimize the solution from the start. In the architecture, logs flowed from the different source systems through MuleSoft to AWS SQS and then into AWS S3. Once in S3, the files were aggregated by an AWS Lambda function that temporarily loaded the logs into DynamoDB. The extracted logs could then be broken into 10-megabyte files for streamlined ingestion through Snowflake's Snowpipe API. Lastly, Looker sat on top of the Snowflake data to create consumable dashboards for customers and the internal team.
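The 10-megabyte batching step can be sketched in a few lines. This is an illustrative example, not the client's actual code: the function name `batch_log_records` and the sample shipment events are hypothetical, and it assumes the extracted logs arrive as newline-delimited text records (for example, JSON lines pulled from DynamoDB) ready to be written to S3 as staged files for Snowpipe.

```python
# Hypothetical sketch: group extracted log records into payloads capped
# at roughly 10 MB, the file size the pipeline targeted for streamlined
# ingestion through Snowflake's Snowpipe API.

MAX_BATCH_BYTES = 10 * 1024 * 1024  # ~10 MB per staged file

def batch_log_records(records, max_bytes=MAX_BATCH_BYTES):
    """Group newline-delimited log records into payloads of at most max_bytes.

    `records` is any iterable of str. Returns a list of str payloads, each
    of which could be written to S3 as one file for Snowpipe to ingest.
    """
    batches, current, current_size = [], [], 0
    for record in records:
        size = len(record.encode("utf-8")) + 1  # +1 for the trailing newline
        if current and current_size + size > max_bytes:
            batches.append("\n".join(current) + "\n")
            current, current_size = [], 0
        current.append(record)
        current_size += size
    if current:
        batches.append("\n".join(current) + "\n")
    return batches


if __name__ == "__main__":
    # Hypothetical shipment-event logs; a tiny max_bytes makes the
    # splitting behavior visible with small inputs.
    logs = ['{"event": "tender"}', '{"event": "load"}', '{"event": "invoice"}']
    for payload in batch_log_records(logs, max_bytes=40):
        print(repr(payload))
```

In the real pipeline each payload would then be uploaded to an S3 stage and reported to Snowpipe, which loads the files into Snowflake asynchronously.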
Once the solution was in place, customers were able to add the data service to their accounts to get a clearer picture of their usage of the company's services. Customers could perform data analysis on a broad range of data points across the logistics services they used and gain greater insight into the files shared with our client.
Additionally, the new solution allowed internal teams to gain a better understanding of their own business. The log processing provided executives, support teams, and other internal users with the opportunity to create dashboards for analysis and surface errors and process faults faster. In the end, the new logging framework enabled the company to drive decisions based on the wealth of data they were already gathering, furthering the value of their system investments.