Most marketing and revenue teams aren’t short on data. They’re short on usable data.
That’s where Lumen found itself. As a long‑time leader in enterprise connectivity, the company was already operating at serious scale. But inside the organisation, decades of growth had left a familiar pattern: aging systems, data silos, duplicated pipelines, and different teams quietly maintaining their own truths.
Marketing and sales, in particular, were feeling it. Ingestion from third‑party tools was slow, copying data into SQL servers was a daily chore, and “reporting” often meant reconciling inconsistent metrics instead of answering strategic questions.
For a business trying to position itself as “the trusted network for AI,” that wasn’t good enough. So Lumen did something many teams talk about but rarely execute well: it rebuilt its data backbone for go‑to‑market around Microsoft Fabric.
When your own systems slow the story you’re trying to tell
Externally, Lumen’s story is about speed, intelligence, and adaptability in a multicloud, AI‑first world. Internally, the reality had drifted away from that story:
- Marketing data lived in multiple third‑party tools, APIs, and partner platforms.
- Moving data into analytics environments required repeated transfers, custom pipelines, and a lot of manual wrangling.
- Business logic lived in spreadsheets or local files, leading to version‑control nightmares and inconsistent metrics across teams.
The people closest to this pain were the ones tasked with driving growth.
“Our job is to make the complex simple,” Lumen’s Director of Data Engineering said. “But we needed better ways to collaborate, move faster, and get to real insights—not just raw data.”
When it takes hours every day just to copy data into SQL and line up columns, the team is burning time on plumbing that should already be solved.
Choosing Fabric: not just a new BI tool, but an end‑to‑end environment
Lumen didn’t just want a nicer front‑end. It needed an end‑to‑end environment that:
- Centralised access to data.
- Simplified ingestion from external sources.
- Let engineers, analysts, and business users work in the same ecosystem.
Already deep in the Azure ecosystem, the company chose Microsoft Fabric as the backbone. Fabric brought together what teams had been doing in separate places—Spark, SQL, notebooks, Power BI—and turned it into one cohesive platform.
With Fabric, the team could:
- Consolidate third‑party marketing data into OneLake, Fabric’s unified storage layer.
- Use Spark notebooks to clean, standardise, and reshape data without configuring VMs or networks.
- Rely on shortcuts and mirroring to access external data sources and keep them in sync without physically moving everything.
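The “ingest once, use anywhere” pattern behind that list boils down to a notebook-style transformation: read raw partner data, standardise it, and write one clean table everyone reuses. The sketch below illustrates that clean-and-deduplicate step in plain Python; the field names are hypothetical, not Lumen’s actual schema, and in a Fabric notebook this logic would run in PySpark against OneLake.

```python
# Illustrative sketch of the clean-and-standardise step a Fabric Spark
# notebook might perform. Plain Python stands in for PySpark here, and
# all field names are hypothetical examples, not Lumen's real schema.

def standardise(record: dict) -> dict:
    """Normalise one raw marketing record's keys to snake_case."""
    return {k.strip().lower().replace(" ", "_"): v for k, v in record.items()}

def dedupe(records: list[dict], key: str) -> list[dict]:
    """Keep the last record seen per key, so re-ingesting is idempotent."""
    by_key: dict = {}
    for rec in records:
        by_key[rec[key]] = rec
    return list(by_key.values())

# Raw rows as they might arrive from a third-party marketing API.
raw = [
    {"Lead ID": "L-1", "Campaign Name": "spring-launch"},
    {"Lead ID": "L-2", "Campaign Name": "spring-launch"},
    {"Lead ID": "L-1", "Campaign Name": "spring-launch-v2"},  # later update wins
]

clean = dedupe([standardise(r) for r in raw], key="lead_id")
```

Once written back to a lakehouse table, the cleaned data is the single copy every downstream report reads — which is what makes “ingest once, use anywhere” more than a slogan.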
“OneLake allowed us to ingest once and use anywhere,” their cloud architect noted. “That flexibility is something we never had before.”
From “copy and paste into SQL” to “open a notebook and start working”
Before Fabric, transforming data meant spinning up separate environments, setting up private networks, and manually wiring connections. It was slow and fragile.
With Fabric, engineers could move transformation work into Spark‑powered notebooks, using Python and SQL in a single interface. No more juggling separate systems for ingestion and transformation.
“In Fabric, we just open a notebook and start working. We don’t need a separate environment; it’s all integrated,” the cloud architect said.
A few practical shifts this enabled:
- Prebuilt connectors and Spark utilities accelerated ingestion of partner and API data.
- Shortcuts provided instant access to external sources without duplicating data.
- Mirroring kept data from REST APIs and external systems in sync in near real time, without custom sync jobs.
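To see what mirroring removes, it helps to look at the kind of hand-rolled sync job teams typically maintained before: a watermark-based upsert that polls a source and merges anything changed since the last run. The sketch below is illustrative only — the table and field names are invented, and this is exactly the plumbing Fabric now handles for you.

```python
# A minimal sketch of the watermark-based incremental sync that Fabric
# mirroring makes unnecessary. IDs, timestamps, and field names are
# hypothetical; a real job would also handle paging, retries, and deletes.

def sync_increment(target: dict, source_rows: list[dict], watermark: str) -> str:
    """Upsert rows changed since `watermark`; return the new watermark."""
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:     # only rows newer than last run
            target[row["id"]] = row           # insert or overwrite in place
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

target: dict = {}
rows = [
    {"id": "a", "updated_at": "2024-01-01"},  # already synced last run
    {"id": "b", "updated_at": "2024-01-03"},  # changed since the watermark
]
wm = sync_increment(target, rows, watermark="2024-01-02")
```

Multiply that by every API and partner system, add scheduling and failure handling, and the appeal of managed mirroring is obvious.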
Once transformed, data stayed in OneLake. With Direct Lake, Power BI could query that data directly—no imports, no scheduled refreshes, no hidden latency.
“There are no refresh cycles, no latency. It just works,” a senior BI developer said. “With Direct Lake and semantic models, we can develop faster, share dashboards instantly, and eliminate duplication.”
Fixing the hidden problems: governance, version control, and trust
Performance and convenience are obvious wins. But Fabric also helped Lumen tackle some quieter but equally important issues around governance and consistency.
Previously, key business logic often lived in spreadsheets or personal files. That made collaboration difficult and metrics unreliable across teams.
Fabric’s native GitHub interoperability and pipelines changed that:
- Source control became a first‑class citizen for analytics work.
- Deployments could be automated and repeatable.
- Governed data products could be published in a way that aligned with corporate policy, not individual preferences.
As an early adopter of Microsoft 365 Copilot, Lumen layered AI on top of this foundation. Copilot inside notebooks and Power BI now helps teams write cleaner code, generate DAX, and explore data using natural language—speeding up development and increasing confidence in the work.
Copilot stopped being a novelty and started becoming part of the daily workflow.
10,000 manual hours eliminated—and a different kind of marketing team
The time savings alone are hard to ignore.
With marketing data fully consolidated into OneLake and pipelines automated, Lumen eliminated nearly 10,000 manual hours in a single year. That’s time previously spent copying data into SQL servers, reconciling outputs, and manually handing off files to sales.
Engineers reported saving up to six hours a day that had once been dedicated to copying data into SQL.
That reclaimed time now goes into:
- Creating better, governed datasets.
- Building dashboards sales and marketing can both trust.
- Improving lead targeting and measuring campaign ROI more precisely.
“Every executive wants to know if they’re making the right decisions,” Lumen’s Chief Marketing and Strategy Officer said. “Fabric helps us connect all our data and actually see what’s working—across every campaign, every ad, every dollar.”
Moving to Fabric and the cloud also avoided renewing costly third‑party licenses and buying more on‑prem hardware. While exact numbers are confidential, the team describes the savings as substantial—further validating the shift.
Looking ahead: from “What happened?” to “What should we do next?”
Lumen’s marketing and data teams are already thinking beyond descriptive analytics.
“We want Fabric to tell us where to spend our next dollar,” their CMO said. “That’s the future: not just analysing what happened but recommending what to do next.”
With near real‑time data sharing, governed models, and AI‑assisted development in place, the foundation for that future is already there. The journey now is about:
- Using predictive models and intelligent automation to guide spend and targeting.
- Continuing to tighten the loop between marketing, sales, and product.
- Keeping governance strong as they scale new use cases across the business.
Fabric gives them the environment to do that without stitching together yet another stack of disconnected tools.
If your go‑to‑market data feels messy, where would you start?
If parts of Lumen’s story hit a little too close to home—slow ingestion from third‑party tools, teams copying data between systems, inconsistent metrics, and dashboards that lag behind reality—you’re not alone. Many B2B and enterprise organisations are carrying similar baggage.
Onyx Data works with marketing, sales, and data teams in exactly this position: ambitious growth goals resting on scattered data, fragile pipelines, and manual glue work that eats entire days.
Often, the first step is deceptively simple: map where your go‑to‑market data actually lives today, where duplication and friction are highest, and which decisions suffer most from slow or inconsistent insight.
If you’d like to explore what a Fabric‑based, governed data backbone could look like for your own revenue engine, complete the short form below. Share your role, your current platform mix, and the one data bottleneck that bothers you most. From there, Onyx will suggest a tailored, no‑obligation starting point you can take back to your team.