
How I Built an End-to-End CRUD + Reporting App Using Only the Microsoft Stack

API ingestion, SQL Server, SSAS tabular modeling, Power BI dashboards, and Power Apps for data editing — one unified architecture, zero third-party tools.

Batuhan Erdeniz | April 10, 2026 | 6 min read

The problem

At my company, we had a classic data challenge: operational data coming from a TMS (Transportation Management System) API that needed to be both visualized in executive dashboards and editable by operations teams. Think of it like this — logistics managers needed to see real-time KPIs, but they also needed to correct, annotate, and update records that the API got wrong or missed.

Most teams solve this with two completely separate systems: a BI tool for reporting and some custom web app for data entry. That means duplicate data, sync issues, and twice the maintenance. I wanted a single architecture where both reporting and CRUD operations feed from the same source of truth.

The architecture

Here's what I built — entirely with Microsoft tools, no Python scripts, no custom backends, no third-party ETL:

API (TMS endpoint) → SQL Server (single source of truth) → SSAS (tabular model) → Power BI (reporting)
Power Apps (CRUD / editing) ↔ SQL Server

The beauty of this design is that SQL Server is the single source of truth. SSAS reads from it, Power Apps writes to it. When a user edits data in Power Apps, the next SSAS process picks it up, and Power BI reflects the change. No sync conflicts, no duplicate databases.

How each layer works

Layer 1: API → SQL Server

The TMS API serves JSON data about shipments, packages, and transfer histories. I use a SQL Agent Job that runs a PowerShell script via xp_cmdshell to call the API, parse the JSON, and insert into staging tables. From there, a merge procedure loads the data into the production tables.

-- SQL Agent Job calls this on a schedule:
-- 1. PowerShell fetches from the TMS API and parses the JSON
-- 2. Rows are inserted into a staging table
-- 3. A MERGE procedure handles upserts to production
Why not use SSIS? For API-based sources, SSIS adds unnecessary complexity. A lightweight PowerShell script scheduled via SQL Agent Job is simpler to maintain and debug. Save SSIS for database-to-database ETL where it truly shines.
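As a rough sketch of that PowerShell step (the endpoint URL, table, and column names here are hypothetical, not the real TMS schema), the script boils down to a REST call plus inserts into staging:

```powershell
# Fetch shipments from the TMS API (hypothetical endpoint and token).
$response = Invoke-RestMethod -Uri "https://tms.example.com/api/shipments" `
                              -Headers @{ Authorization = "Bearer $env:TMS_TOKEN" }

# Flatten the JSON into rows and insert them into the staging table.
foreach ($s in $response.shipments) {
    $sql = "INSERT INTO dbo.Staging_Shipments (ShipmentId, Status, UpdatedAt)
            VALUES ('$($s.id)', '$($s.status)', '$($s.updatedAt)');"
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "TMS" -Query $sql
}

# Hand off to the MERGE procedure that upserts staging into production.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "TMS" `
              -Query "EXEC dbo.usp_Merge_Shipments;"
```

In a real job you'd parameterize the INSERT (or batch it with SqlBulkCopy) instead of string-building SQL, but the shape of the job is the same: fetch, stage, merge.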

Layer 2: SQL Server → SSAS Tabular

SSAS sits on top of SQL Server as an in-memory analytical engine. It reads the production tables, applies the data model (relationships, hierarchies, measures), and serves as the semantic layer for Power BI.

Key decision: I defined my DAX measures in the SSAS model, not in Power BI. This means any Power BI report connected to this model gets the same business logic — no inconsistent calculations across reports.
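For example, a KPI like on-time delivery rate is defined once in the model, so every connected report computes it identically (the measure, table, and column names below are made up for illustration):

```dax
On-Time Delivery % :=
DIVIDE (
    CALCULATE (
        COUNTROWS ( FactShipments ),
        FactShipments[DeliveredOnTime] = TRUE ()
    ),
    COUNTROWS ( FactShipments )
)
```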

-- SSAS processing via SQL Agent Job (scheduled after ETL completes)
-- Uses XMLA command to process the tabular model

<Process xmlns="...">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>Model</DatabaseID>
    <TableID>FactTable</TableID>
  </Object>
</Process>

Layer 3a: SSAS → Power BI (Reporting)

Power BI connects to SSAS via a Live Connection. This is important — it means Power BI doesn't import any data. It queries SSAS on the fly, which means reports always show the latest processed data without needing a separate refresh schedule.

The dashboards I built include real-time logistics KPIs, package transfer tracking across locations, and operational monitoring with automated alerts.

Layer 3b: SQL Server → Power Apps (CRUD)

This is where it gets interesting. Power Apps connects directly to the SQL Server tables via the on-premises data gateway. Operations teams get a clean, mobile-friendly form where they can correct, annotate, and update the records the API got wrong or missed.

When they save changes, it writes directly to SQL Server. The next SSAS process cycle picks up the changes, and Power BI reflects them. The feedback loop is complete.
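On the Power Apps side, a save is essentially one Patch call against the SQL table (the control and column names below are hypothetical):

```powerfx
// OnSelect of the Save button: write the edited fields back to SQL Server.
Patch(
    '[dbo].[Shipments]',
    ShipmentGallery.Selected,
    {
        Status: StatusDropdown.Selected.Value,
        Notes: NotesInput.Text,
        ModifiedBy: User().Email
    }
)
```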

The scheduling chain

Timing matters. Here's how I sequenced everything with SQL Agent Jobs:

1. API fetch (every 30 min). The PowerShell script calls the TMS API, loads the staging tables, and runs the MERGE to production.
2. SSAS process (triggered after ETL). An XMLA command processes the affected tables. It only runs after step 1 succeeds, chained via Job Steps with "On Success" conditions.
3. Power BI auto-refreshes. Since it's a Live Connection to SSAS, no separate refresh is needed; users see updated data as soon as SSAS finishes processing.
4. Power Apps writes happen in real time. User edits go to SQL Server instantly and are picked up in the next SSAS process cycle, within 30 minutes at most.
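The chaining between the first two steps is plain SQL Agent configuration. Sketched in T-SQL (the job name, step names, and script path are illustrative), the key is @on_success_action = 3, which means "go to the next step":

```sql
-- Step 1: run the PowerShell fetch; on success, fall through to step 2.
EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'TMS_Refresh',
    @step_name         = N'Fetch API and MERGE',
    @subsystem         = N'PowerShell',
    @command           = N'& "C:\jobs\Fetch-TmsData.ps1"',
    @on_success_action = 3;  -- 3 = go to the next step

-- Step 2: process the SSAS model; only reached if step 1 succeeded.
EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'TMS_Refresh',
    @step_name         = N'Process SSAS model',
    @subsystem         = N'ANALYSISCOMMAND',
    @command           = N'<Process xmlns="...">...</Process>';
```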

Lessons learned

Keep DAX in SSAS, not Power BI. If you define measures in Power BI Desktop, they're locked to that one report. Put them in SSAS and every report inherits the same logic. This is non-negotiable for governance.

The on-premises gateway is the weak link. Power Apps needs it to reach your SQL Server. It works fine, but it's one more thing that can go down. Monitor it. Set up alerts.

Don't skip the staging layer. I use a truncate-and-load staging pattern before MERGE into production. It adds a step, but it means I can inspect and debug data quality issues before they hit the model.
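The truncate-and-load plus MERGE pattern looks roughly like this (table and column names are placeholders for the real schema):

```sql
-- Reload staging from scratch each cycle, then upsert into production.
TRUNCATE TABLE dbo.Staging_Shipments;

-- (The PowerShell step inserts the fresh API rows here.)

MERGE dbo.Shipments AS target
USING dbo.Staging_Shipments AS source
    ON target.ShipmentId = source.ShipmentId
WHEN MATCHED THEN
    UPDATE SET target.Status    = source.Status,
               target.UpdatedAt = source.UpdatedAt
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ShipmentId, Status, UpdatedAt)
    VALUES (source.ShipmentId, source.Status, source.UpdatedAt);
```

Because staging is fully reloaded each run, any data-quality problem can be inspected there before the MERGE pushes it into the tables SSAS reads.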

Power Apps isn't pretty, but it's fast. Coming from a React frontend background, Power Apps feels limiting. But I built a fully functional CRUD interface in a day. For internal tools, speed of delivery beats pixel-perfect design every time.

The bottom line: You don't need a custom web app, a separate database, or Python scripts to build a complete data solution with read AND write capabilities. The Microsoft stack — SQL Server, SSAS, Power BI, and Power Apps — handles it end-to-end. The key is making SQL Server the single source of truth for everything.

What's next

I'm planning to add Power Automate into this architecture to trigger notifications when specific data thresholds are hit — for example, alerting the logistics team when packages are stuck at a location for too long. That would complete the full Power Platform integration: Power BI for seeing, Power Apps for doing, and Power Automate for reacting.

If you're building something similar or have questions about any of these layers, feel free to reach out on LinkedIn. I'm always happy to talk architecture.