The problem
At my company, we had a classic data challenge: operational data coming from a TMS (Transportation Management System) API that needed to be both visualized in executive dashboards and editable by operations teams. Think of it like this — logistics managers needed to see real-time KPIs, but they also needed to correct, annotate, and update records that the API got wrong or missed.
Most teams solve this with two completely separate systems: a BI tool for reporting and some custom web app for data entry. That means duplicate data, sync issues, and twice the maintenance. I wanted a single architecture where both reporting and CRUD operations feed from the same source of truth.
The architecture
Here's what I built, entirely with Microsoft tools: no Python scripts, no custom backends, no third-party ETL. Data flows from the TMS API into SQL Server, SSAS Tabular models it, and Power BI reports on it, while Power Apps reads and writes the same SQL Server tables for CRUD.
The beauty of this design is that SQL Server is the single source of truth. SSAS reads from it, Power Apps writes to it. When a user edits data in Power Apps, the next SSAS process picks it up, and Power BI reflects the change. No sync conflicts, no duplicate databases.
How each layer works
Layer 1: API → SQL Server
The TMS API serves JSON data about shipments, packages, and transfer histories. I use a SQL Agent Job that runs a PowerShell script via xp_cmdshell to call the API, parse the JSON, and insert into staging tables. From there, a merge procedure loads the data into the production tables.
-- SQL Agent Job calls this on a schedule
-- PowerShell fetches from the TMS API and parses the JSON
-- Rows are inserted into a staging table
-- A MERGE procedure handles upserts to production
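The merge procedure itself can be sketched like this. The table and column names below are illustrative, not the actual schema:

```sql
-- Hypothetical upsert from staging into production
MERGE dbo.Shipments AS tgt
USING dbo.Shipments_Staging AS src
    ON tgt.ShipmentId = src.ShipmentId
WHEN MATCHED THEN
    -- Row already exists: refresh its attributes from the latest API pull
    UPDATE SET tgt.StatusCode   = src.StatusCode,
               tgt.LocationCode = src.LocationCode,
               tgt.LastUpdated  = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    -- New shipment seen for the first time
    INSERT (ShipmentId, StatusCode, LocationCode, LastUpdated)
    VALUES (src.ShipmentId, src.StatusCode, src.LocationCode, SYSUTCDATETIME());
```

Because the MERGE is keyed on the shipment's natural key, re-running the job is idempotent: reprocessing the same API payload updates rows in place instead of duplicating them.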
Layer 2: SQL Server → SSAS Tabular
SSAS sits on top of SQL Server as an in-memory analytical engine. It reads the production tables, applies the data model (relationships, hierarchies, measures), and serves as the semantic layer for Power BI.
Key decision: I defined my DAX measures in the SSAS model, not in Power BI. This means any Power BI report connected to this model gets the same business logic — no inconsistent calculations across reports.
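As a sketch of what lives in the model: a KPI like the one below is defined once and inherited by every connected report. The measure name, table, and columns are illustrative, not the actual model:

```dax
-- Hypothetical measure defined once in the SSAS tabular model
On-Time Delivery % :=
DIVIDE (
    CALCULATE (
        COUNTROWS ( FactShipments ),
        FactShipments[DeliveredOnTime] = TRUE ()
    ),
    COUNTROWS ( FactShipments )
)
```

If this lived in a .pbix file instead, a second report would have to re-implement it and could silently drift from the first.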
-- SSAS processing via SQL Agent Job (scheduled after ETL completes)
-- Uses XMLA command to process the tabular model
<Process xmlns="...">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>Model</DatabaseID>
    <TableID>FactTable</TableID>
  </Object>
</Process>
Layer 3a: SSAS → Power BI (Reporting)
Power BI connects to SSAS via a Live Connection. This is important — it means Power BI doesn't import any data. It queries SSAS on the fly, which means reports always show the latest processed data without needing a separate refresh schedule.
The dashboards I built include real-time logistics KPIs, package transfer tracking across locations, and operational monitoring with automated alerts.
Layer 3b: SQL Server → Power Apps (CRUD)
This is where it gets interesting. Power Apps connects directly to the SQL Server tables via the on-premises data gateway. Operations teams get a clean, mobile-friendly form where they can:
- View current records from the same tables SSAS reads
- Edit incorrect shipment data (wrong status codes, missing locations)
- Add manual entries for records the API missed
- Delete duplicate or test records
When they save changes, Power Apps writes directly to SQL Server. The next SSAS process cycle picks up the changes, and Power BI reflects them. The feedback loop is complete.
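One practical detail: the Power Apps SQL Server connector can only create, update, or delete rows in tables that expose a primary key. A hypothetical shape for a table it edits (names are illustrative):

```sql
-- Without the PRIMARY KEY, Power Apps would treat this table as read-only
CREATE TABLE dbo.Shipments (
    ShipmentId   INT           NOT NULL PRIMARY KEY,
    StatusCode   VARCHAR(20)   NULL,
    LocationCode VARCHAR(20)   NULL,
    LastUpdated  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);
```

It's worth checking this before building the app; a missing key surfaces as a confusing "read-only" data source rather than a clear error.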
The scheduling chain
Timing matters. I sequenced everything with SQL Agent jobs: the API load and MERGE run first, SSAS processing starts only after the ETL completes, and Power BI's Live Connection then serves the freshly processed model with no separate refresh step.
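A sketch of one way to chain the steps inside a single SQL Agent job, where each step runs only if the previous one succeeds. The job name, script path, and procedure name are hypothetical:

```sql
EXEC msdb.dbo.sp_add_job @job_name = N'TMS_Nightly_Refresh';

-- Step 1: fetch from the TMS API and load staging
EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'TMS_Nightly_Refresh',
    @step_name         = N'1 - Load staging from TMS API',
    @subsystem         = N'TSQL',
    @command           = N'EXEC xp_cmdshell ''powershell.exe -File C:\jobs\Get-TmsData.ps1'';',
    @on_success_action = 3;  -- go to the next step

-- Step 2: merge staging into production
EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'TMS_Nightly_Refresh',
    @step_name         = N'2 - Merge staging into production',
    @subsystem         = N'TSQL',
    @command           = N'EXEC dbo.usp_MergeShipments;',
    @on_success_action = 3;

-- Step 3: process the SSAS tabular model via XMLA
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'TMS_Nightly_Refresh',
    @step_name = N'3 - Process SSAS tabular model',
    @subsystem = N'ANALYSISCOMMAND',
    @command   = N'<Process xmlns="..."><Type>ProcessFull</Type><Object><DatabaseID>Model</DatabaseID><TableID>FactTable</TableID></Object></Process>';
```

With `@on_success_action = 3`, a failed API pull stops the chain before the model is processed, so the dashboards keep showing the last good data instead of a half-loaded batch.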
Lessons learned
Keep DAX in SSAS, not Power BI. If you define measures in Power BI Desktop, they're locked to that one report. Put them in SSAS and every report inherits the same logic. This is non-negotiable for governance.
The on-premises gateway is the weak link. Power Apps needs it to reach your SQL Server. It works fine, but it's one more thing that can go down. Monitor it. Set up alerts.
Don't skip the staging layer. I use a truncate-and-load staging pattern before MERGE into production. It adds a step, but it means I can inspect and debug data quality issues before they hit the model.
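A minimal sketch of that pattern, with illustrative names:

```sql
-- Each run starts from an empty staging table, so a bad API pull
-- never touches production directly
TRUNCATE TABLE dbo.Shipments_Staging;

-- (the PowerShell step bulk-inserts the fresh API payload here)

-- Only after staging can be inspected does anything move forward
EXEC dbo.usp_MergeShipments;
```

If a batch looks wrong, you stop before the MERGE, query staging to see exactly what the API sent, and rerun once it's fixed.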
Power Apps isn't pretty, but it's fast. Coming from a React frontend background, Power Apps feels limiting. But I built a fully functional CRUD interface in a day. For internal tools, speed of delivery beats pixel-perfect design every time.
What's next
I'm planning to add Power Automate into this architecture to trigger notifications when specific data thresholds are hit — for example, alerting the logistics team when packages are stuck at a location for too long. That would complete the full Power Platform integration: Power BI for seeing, Power Apps for doing, and Power Automate for reacting.
If you're building something similar or have questions about any of these layers, feel free to reach out on LinkedIn. I'm always happy to talk architecture.