Intune App Deployment: How Long Does It Really Take?

Intune is one of those technologies many endpoint teams use every day—even if fewer of them would say they love it. That tension was the perfect setup for this EUC Forum session, where Ryan Ferris-Bikker and Lee Jeff (both independent consultants and the people behind GO-EUC research) tried to put hard numbers behind a common complaint:

> “I assigned the app… why isn’t it installed yet?”

Rather than trading anecdotes, they approached it like GO-EUC always does: with repeatable testing, clear metrics, and enough data points to be confident the result isn’t just a one-off.

Why this question matters now

Intune began life as a cloud service for managing mobile devices, but it has grown into Microsoft’s center of gravity for endpoint management: configuration, compliance, security baselines, and—crucially—application deployment.

That last piece becomes even more important as the Microsoft portfolio expands beyond physical PCs into Azure Virtual Desktop (AVD) and Windows 365. In those environments, “eventually” isn’t always good enough. If app delivery is slow or unpredictable, it can directly impact user experience, onboarding timelines, and even operational models like just-in-time provisioning.

The reality: Intune doesn’t push apps

A key point in the talk is a mental model shift:

  • Admins assign apps in Intune.
  • Devices pull policy and app intent on a schedule.

Intune app deployment is therefore gated by several asynchronous steps: the assignment needs to exist, group membership needs to be correct, devices need to check in, and only then does installation begin.

This is where many delays hide. It may feel like “Intune is slow,” but part of what you’re seeing is simply the architecture: the endpoint checks in on an interval, and that interval can be 30 minutes or longer depending on schedules and conditions.
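The effect of that cadence is easy to see in a toy model. This is a minimal sketch, not Intune's actual scheduler; the 30-minute interval and the `wait_for_next_checkin` helper are illustrative assumptions, matching the interval mentioned above:

```python
# Sketch: how a pull-based check-in cadence turns an "instant" assignment
# into a wait. The 30-minute interval is illustrative, not an Intune constant.

def wait_for_next_checkin(assign_minute: float, interval: float = 30.0) -> float:
    """Minutes until the device's next scheduled check-in sees the assignment."""
    return interval - (assign_minute % interval)

# Assignments land at arbitrary times, so the wait is roughly uniform over
# the interval: ~interval/2 on average, a full interval in the worst case.
waits = [wait_for_next_checkin(m) for m in range(0, 30)]
avg = sum(waits) / len(waits)
print(f"average wait ≈ {avg:.1f} min, worst case = {max(waits):.0f} min")
# prints: average wait ≈ 15.5 min, worst case = 30 min
```

Everything downstream (download, detection, reporting) starts only after this wait, which is why the same assignment can feel fast on one device and slow on another.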

App types and targeting choices

The speakers also highlighted that Intune supports multiple packaging types—Win32, MSI, Microsoft Store apps, line-of-business apps, and MSIX—but regardless of format, two targeting modes dominate:

  • User-based targeting (install follows the user)
  • Device-based targeting (install follows the machine)

Both work, but each interacts differently with identity, enrollment, and check-in behavior. That matters when you’re trying to predict timing.

Static vs. dynamic groups: the hidden multiplier

To make the timing story more concrete, Lee walked through group mechanics in Entra ID (Azure AD):

  • Static groups behave like traditional AD groups: membership changes are immediate.
  • Dynamic groups use rules (expressions) to calculate membership, which requires backend evaluation.

Dynamic groups introduce a delay that can surprise teams, especially during device onboarding. Ryan called out two practical constraints:

  • Dynamic group evaluation can take up to ~8 hours per the official guidance.
  • There’s also a scaling limit of around 5,000 dynamic groups per tenant, beyond which you need to involve Microsoft support.

So if your app deployment depends on a dynamic group, “time to install” may include “time for the directory to notice the device qualifies.”
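As a concrete example, a device-based dynamic group is driven by a membership rule expression that the directory evaluates in the background. The rule below uses standard Entra rule syntax, but the `LAB-` naming convention is a made-up example:

```
(device.deviceOSType -eq "Windows") and (device.displayName -startsWith "LAB-")
```

A freshly enrolled device that matches this rule still has to wait for that backend evaluation before it lands in the group, and only then does the app assignment apply to it.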

GO-EUC methodology: no stopwatches, no single samples

The most important part of the session wasn’t a single timing number—it was how the team ensured the result meant something.

GO-EUC research follows a structured process:

1. Define the goal (what are we trying to prove?)
2. Choose the metrics (what timing points matter?)
3. Ensure repeatability (can someone else reproduce it?)
4. Run enough iterations (GO-EUC uses at least 10 runs per scenario)
5. Analyze and publish openly (no registration walls)
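Step 4 is what separates this from stopwatch anecdotes. A small sketch of the idea, using made-up timings (not GO-EUC's data):

```python
# Sketch of step 4: why one stopwatch reading isn't enough. With at least
# 10 runs per scenario you can report spread, not just a single sample.
# The timings below are made-up placeholders, not GO-EUC results.
from statistics import mean, stdev

runs_minutes = [12.1, 14.8, 11.9, 13.4, 35.0, 12.6, 13.1, 14.2, 12.8, 13.7]

print(f"n={len(runs_minutes)} "
      f"mean={mean(runs_minutes):.1f} min "
      f"stdev={stdev(runs_minutes):.1f} min "
      f"max={max(runs_minutes):.1f} min")
# A single run could have reported 12 minutes or 35 minutes; the
# distribution is what makes the conclusion reproducible.
```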

Lee joked that this is the kind of research you only do if you really want the answer—because collecting, automating, and analyzing data at this scale takes weeks, not hours.

Their test setup (so the data is credible)

For this Intune deployment research, they built a controlled environment:

  • Eight on-prem virtual machines on vSphere
  • Hybrid-joined (Active Directory + Entra ID)
  • Tested on Windows 10 and Windows 11 (Windows 11 was added after Ryan insisted)
  • Tested with static and dynamic groups
  • Used a repeatable cycle: assign app → detect install → uninstall → repeat
  • Automated timing collection with PowerShell, remote execution, and CSV output
  • Disabled power-saving behaviors to avoid “sleep” skewing results

A lightweight app—7-Zip—was used so the measurement focused on Intune’s delivery pipeline rather than app install complexity.
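GO-EUC's actual harness was PowerShell with remote execution; the sketch below re-creates the shape of that assign → detect → uninstall loop in Python, with the Intune-facing steps stubbed out so only the timing and CSV plumbing remain:

```python
# Sketch of the measurement cycle (assign → detect install → uninstall → repeat).
# GO-EUC drove this with PowerShell against real endpoints; here the Intune
# steps are stubbed so the timing/CSV structure is visible on its own.
import csv, io, time

def detect_install() -> bool:
    """Stub: the real harness polled the endpoint for the app (7-Zip)."""
    return True  # pretend the app showed up immediately

def run_cycle(iteration: int) -> dict:
    assigned_at = time.monotonic()
    while not detect_install():      # real harness: poll until present
        time.sleep(60)
    detected_at = time.monotonic()
    # real harness: trigger uninstall and wait for removal before next run
    return {"iteration": iteration,
            "install_seconds": round(detected_at - assigned_at, 1)}

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["iteration", "install_seconds"])
writer.writeheader()
for i in range(1, 11):               # at least 10 runs per scenario
    writer.writerow(run_cycle(i))
print(buffer.getvalue().splitlines()[0])  # header row: iteration,install_seconds
```

One row per iteration in a CSV is exactly what makes the later analysis step possible: the same file format works whether the scenario took minutes or hours.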

The takeaway

The session reframed a familiar frustration into something measurable: app deployment time in Intune is rarely just “download + install.” It’s the sum of identity evaluation, group membership, device sync cadence, and policy/app processing.
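Put as arithmetic, with deliberately made-up stage budgets (only the ~8-hour figure comes from the talk), the worst case is dominated by everything except the install itself:

```python
# Illustrative (made-up) worst-case stage budgets; the point is the sum,
# not the individual numbers.
stages_minutes = {
    "dynamic group evaluation": 480,  # up to ~8 h per official guidance
    "device check-in wait": 30,
    "policy + app processing": 10,
    "download + install": 5,
}
total = sum(stages_minutes.values())
print(f"worst-case end-to-end ≈ {total} min ({total / 60:.1f} h)")
```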

If you want faster, more predictable deployments, the first step is understanding which part of that chain is actually consuming the time—and GO-EUC’s approach shows how to measure it instead of guessing.