Aptitive needed to organize a variety of data sources to coordinate our first-ever (data) Science Fair. So, we decided to use the opportunity to flaunt our cloud architecture skills and build a miniature analytics platform to show off to attendees. Our main goal was to demonstrate how your company can use the same strategy to collect, transform, and use your data in the cloud!
At the event, we had a few disparate sources of data coming in from other projects, hosted in Azure and Google Cloud.
Our first step was to stage this raw data in our data hub. To do that, we used Alooma to extract the source data from Azure and Google and load it into tables in Snowflake. As an interesting side note, we did not have to predefine the table schemas or the extract-and-load routines; Alooma handled all of that automatically.
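To give a feel for what that automation saves you, here is a minimal sketch of schema-on-load staging. It uses Python's built-in SQLite as a local stand-in for Snowflake, and the table and field names are invented for illustration; the real inference Alooma performs is far more sophisticated.

```python
import json
import sqlite3

def stage_raw_records(conn, table, records):
    """Infer a column list from incoming records and load them into a
    staging table, creating the table if it does not yet exist."""
    # The union of keys across all records stands in for schema inference.
    columns = sorted({key for rec in records for key in rec})
    cols_sql = ", ".join(f'"{c}" TEXT' for c in columns)
    conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols_sql})')
    placeholders = ", ".join("?" for _ in columns)
    # Serialize nested structures; leave scalars (and missing keys) as-is.
    rows = [
        tuple(
            json.dumps(rec.get(c)) if isinstance(rec.get(c), (dict, list))
            else rec.get(c)
            for c in columns
        )
        for rec in records
    ]
    insert_cols = ", ".join(columns)
    conn.executemany(
        f'INSERT INTO "{table}" ({insert_cols}) VALUES ({placeholders})', rows
    )
    conn.commit()

# Hypothetical raw registration records from two different sources.
conn = sqlite3.connect(":memory:")
stage_raw_records(conn, "raw_registrations", [
    {"name": "Ada", "booth": "ML Demo"},
    {"name": "Grace", "booth": "IoT Stream", "email": "g@example.com"},
])
print(conn.execute("SELECT COUNT(*) FROM raw_registrations").fetchone()[0])  # 2
```

Note that records with different key sets land in the same table without any predefined schema, which is the convenience the managed tooling gave us.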
Imagine collecting all of your firm’s siloed data into one central location (with no new hardware, thanks to the cloud). It is the first step toward building your enterprise-level architecture.
Next, we arrived at the hardest part of any data project. We spent 80% of our time designing the cloud architecture and writing the SQL that combines, consolidates, and transforms the raw data into something usable by our reporting tools. Using Snowflake as our data warehouse, we organized the data into two fact tables and four dimension tables based on the business logic of the underlying projects. In other words, we collected transactions from the Rima Bot and connected them to the individual characteristics of the registrants, dates, and booths.
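The shape of that transform is a classic star-schema join. The sketch below shows the idea on a deliberately tiny scale, again using SQLite in place of Snowflake; the table names, columns, and sample rows are all hypothetical, not our actual fair schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A miniature star schema: one fact table keyed to two dimension tables.
cur.executescript("""
CREATE TABLE dim_registrant (registrant_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_booth      (booth_id INTEGER PRIMARY KEY, booth_name TEXT);
CREATE TABLE fact_visit     (registrant_id INTEGER, booth_id INTEGER, points INTEGER);

INSERT INTO dim_registrant VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO dim_booth      VALUES (10, 'ML Demo'), (11, 'IoT Stream');
INSERT INTO fact_visit     VALUES (1, 10, 5), (1, 11, 3), (2, 10, 4);
""")

# A typical warehouse transform: join the fact to its dimensions and
# aggregate into a scoreboard-style result.
scores = cur.execute("""
SELECT r.name, SUM(f.points) AS total_points
FROM fact_visit f
JOIN dim_registrant r ON r.registrant_id = f.registrant_id
GROUP BY r.name
ORDER BY total_points DESC
""").fetchall()
print(scores)  # [('Ada', 8), ('Grace', 4)]
```

The same join-and-aggregate pattern scales from this toy example to the full warehouse, which is why the up-front modeling effort pays for itself.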
Imagine transforming your data into a view of the entire enterprise. You can unify your information in a single data hub!
Now that our project data was organized, we passed the connection string to our analytics team so they could build out their portion of the fair (a fancy Looker dashboard for communicating the scoreboard). Note that the report developers did not need to model the data in the business intelligence tool because the modeling was already done in the data warehouse! We also brainstormed many other ideas for using the data warehouse. What should we do next year?
- List Reporting
- Data Discovery
- Trend Analysis
- Event Segmentation
- Real-Time Anomaly Detection
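The reason the report developers had so little to do, and the reason all of those future ideas are cheap to try, is that the warehouse exposes ready-made views any tool can query. A rough sketch of that pattern, with an invented view name and columns and SQLite standing in for the warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_visit (registrant TEXT, booth TEXT, points INTEGER);
INSERT INTO fact_visit VALUES ('Ada', 'ML Demo', 5), ('Grace', 'ML Demo', 4);

-- Because the modeling lives in the warehouse, a BI tool only needs a
-- simple view; no semantic-layer work is required on the report side.
CREATE VIEW vw_scoreboard AS
SELECT registrant, SUM(points) AS total_points
FROM fact_visit
GROUP BY registrant;
""")

# Any client with a connection string can now pull the finished numbers.
print(conn.execute(
    "SELECT * FROM vw_scoreboard ORDER BY total_points DESC"
).fetchall())  # [('Ada', 5), ('Grace', 4)]
```

A dashboard tool, a spreadsheet, or an anomaly-detection script would all hit the same view, which is what "model once, consume anywhere" means in practice.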
Imagine using your data quickly, effectively, and with whatever tool you want. This is the value of a well-designed data warehouse.
Data architecture is the core of everything we do at Aptitive and the glue that held all of the (data) Science Fair projects together. Like our event, your company can dramatically improve its business intelligence by organizing and combining your disparate data sources. Contact us at email@example.com if you’re interested in learning more about cloud architecture.
Greg Marsh is a Data Engineering Manager at Aptitive. In his role, Greg facilitates the discovery of business insights from data. From “big” data like IoT streams to classic relational ERP information, Greg helps companies unlock the power of their data.