The early rollout of Microsoft 365 Copilot across South African organisations brought a wave of enthusiasm. Demos looked impressive. A handful of keen early adopters found immediate use in drafting emails and summarising meetings. Leadership gave the green light for wider deployment. And then the hard part started.
Because at some point, the CFO, the CIO, or the managing director is going to ask a direct question: what are we getting for this spend? That question lands differently in South Africa than it does in London or New York. Licensing costs are denominated in US dollars, and with the Rand sitting where it does, the per-user monthly cost of Copilot translates into a meaningful line item for local businesses. A company with 200 Copilot licences at USD 30 per user per month is spending upwards of R1.2 million a year at current exchange rates, before accounting for the underlying Microsoft 365 subscription. That is not a number you defend with a few positive anecdotes from the marketing team.
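If you want to sanity-check that figure for your own environment, the arithmetic fits in a few lines. The inputs below are illustrative assumptions, not quoted pricing or a live exchange rate:

```python
# Illustrative annual Copilot licence cost in ZAR.
# All inputs are assumptions for the example, not quoted prices or rates.
licences = 200
usd_per_user_per_month = 30.0   # indicative list price in USD
zar_per_usd = 17.50             # assumed exchange rate

annual_usd = licences * usd_per_user_per_month * 12
annual_zar = annual_usd * zar_per_usd

print(f"Annual Copilot spend: USD {annual_usd:,.0f} ≈ R{annual_zar:,.0f}")
# 200 x 30 x 12 = USD 72,000 ≈ R1.26 million at R17.50 to the dollar
```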
The organisations we work with at Braintree are past the experimentation phase. They need proof. And proof requires measurement.
The Problem with Anecdotal Feedback
When adoption conversations rely on individual stories, they miss the full picture. One user who saves an hour a day on email is encouraging, but it tells you nothing about the 180 other licensed users who opened Copilot twice last month and then forgot about it. It does not tell you whether the sales team is using it at all, or whether the finance department found it useful for anything beyond formatting spreadsheets.
Anecdotal feedback creates a distorted view. It skews positive because the people who talk about Copilot tend to be the ones who like it. The silent majority, the ones who are confused, uninterested, or struggling to see how it fits into their daily work, go unheard.
For South African businesses operating in a constrained economic environment, this kind of blind spot carries real cost. Every unproductive licence is wasted spend that could have gone toward connectivity upgrades, training, or security improvements.
What Copilot Analytics Provides
Microsoft has invested in measurement tools that have matured over the past twelve months. The Copilot Dashboard, accessible through Viva Insights and the Microsoft 365 admin centre, gives IT and business leaders a direct view into how Copilot is being used across the organisation.
The dashboard tracks adoption rates by team and department, engagement frequency per user, usage distribution across Microsoft 365 workloads like Word, Outlook, Teams, Excel, and PowerPoint, and patterns segmented by role or business unit.
This data matters because it separates signal from noise. Consistent daily usage across a team tells you Copilot has become part of how people work. Sporadic use, a login here and there with long gaps in between, tells you something is off. The reasons vary. It could be a lack of training. It could be unclear use cases. It could be that the team’s daily work does not align well with what Copilot does today. Each of these requires a different response.
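The dashboard is the natural home for this data, but the same per-user detail can also be pulled programmatically for your own reporting. The sketch below calls the Microsoft Graph beta reports endpoint for Copilot usage; treat the endpoint name, response format, and field names as assumptions to verify against current Graph documentation, and note that token acquisition is left out.

```python
# Minimal sketch: pull per-user Copilot usage from Microsoft Graph (beta).
# Assumes an app registration with Reports.Read.All and a token obtained
# elsewhere (e.g. via MSAL). Endpoint name, response format, and field
# names should be verified against current Microsoft Graph documentation.
import requests

USAGE_URL = (
    "https://graph.microsoft.com/beta/reports/"
    "getMicrosoft365CopilotUsageUserDetail(period='D30')"
)


def fetch_copilot_usage(access_token: str) -> list[dict]:
    """Return per-user Copilot usage records for the last 30 days."""
    headers = {"Authorization": f"Bearer {access_token}"}
    records, url = [], USAGE_URL
    while url:
        response = requests.get(url, headers=headers, timeout=30)
        response.raise_for_status()
        payload = response.json()  # assumes a JSON response body
        records.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # follow paging if present
    return records


if __name__ == "__main__":
    usage = fetch_copilot_usage(access_token="<token acquired via MSAL>")
    active = [u for u in usage if u.get("lastActivityDate")]
    print(f"{len(active)} of {len(usage)} licensed users active in the last 30 days")
```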
Microsoft’s own internal data shows that organisations actively using the Copilot Dashboard see 2.1 times more Copilot usage than those that do not track adoption at all. The act of measuring changes behaviour.
Benchmarking Against Peers
A feature that landed in the dashboard late last year allows admins to compare their organisation’s Copilot adoption against external benchmarks, segmented by company size, industry vertical, and user type. This is a meaningful addition for South African businesses, many of which operate without local reference points for what good AI adoption looks like.
If your 80-person logistics company in Johannesburg has a 25 percent active usage rate, is that good or bad? Without benchmarks, you are guessing. With them, you can see that similar-sized companies in your sector are running at 55 percent, which tells you there is a gap worth investigating.
Internal benchmarking is equally useful. If your Cape Town office has three times the Copilot engagement of your Durban office despite identical licence allocation and access, that is a clear signal. The high-performing team becomes a source of repeatable use cases. The lower-performing team becomes a priority for support and structured enablement.
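That comparison is easy to run yourself once you export the usage data. The sketch below assumes a per-user usage export joined to a simple directory file mapping users to offices; the file and column names are illustrative rather than what the dashboard produces out of the box.

```python
# Sketch: compare Copilot engagement by office from exported usage data.
# File names and column names ("userPrincipalName", "lastActivityDate",
# "office") are illustrative; map them to your own exports.
import pandas as pd

usage = pd.read_csv("copilot_usage_last30.csv")    # per-user usage export
directory = pd.read_csv("user_directory.csv")      # maps users to offices

merged = usage.merge(directory, on="userPrincipalName", how="left")
merged["active"] = merged["lastActivityDate"].notna()

by_office = merged.groupby("office")["active"].agg(
    active_users="sum", licensed_users="count"
)
by_office["active_rate"] = (
    by_office["active_users"] / by_office["licensed_users"]
).round(2)

print(by_office.sort_values("active_rate", ascending=False))
```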
Connecting Usage to Business Outcomes
Tracking adoption is step one. Linking it to measurable business outcomes is where the ROI case gets built.
At Braintree, we work with customers to define two or three specific outcomes before a broad rollout. These outcomes vary by industry and by team, but common examples include reduced turnaround time on proposals and client deliverables, faster document production across legal, compliance, and reporting functions, improved meeting follow-through by using Copilot-generated recaps and action items, and time reclaimed per employee per week.
The time savings alone are worth modelling. If Copilot saves each user 30 minutes per day, and you have 150 active users, that adds up to close to 19,000 hours a year, assuming roughly 250 working days. Translate that into productive output or reduced overtime, and the financial case writes itself.
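A rough version of that model fits in a few lines. Every input below is an assumption to swap for your own measured figures, including the loaded hourly cost used for illustration:

```python
# Rough time-savings model; every input is an assumption to replace
# with your own measured figures.
active_users = 150
minutes_saved_per_day = 30
working_days_per_year = 250
loaded_cost_per_hour_zar = 450   # assumed fully loaded cost of an hour

hours_reclaimed = active_users * (minutes_saved_per_day / 60) * working_days_per_year
indicative_value_zar = hours_reclaimed * loaded_cost_per_hour_zar

print(f"Hours reclaimed per year: {hours_reclaimed:,.0f}")
print(f"Indicative annual value: R{indicative_value_zar:,.0f}")
# 150 users x 0.5 h x 250 days = 18,750 hours ≈ R8.4 million at R450/hour
```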
For South African organisations dealing with the realities of hybrid work, inconsistent power supply in some regions, and the pressure to do more with smaller teams, those recovered hours are not abstract. They are the difference between meeting a deadline and missing it.
POPIA and Data Governance Considerations
Measurement also intersects with compliance. Under POPIA, South African organisations have specific obligations around how personal information is processed and stored. When you deploy Copilot and start tracking usage analytics, you need to ensure that your data governance framework accounts for this.
Microsoft has published guidance specific to POPIA compliance for Copilot for Microsoft 365, confirming that organisational data remains within the Microsoft 365 tenant and is not used to train public AI models. Data residency options allow organisations to align storage with local regulatory requirements.
From our perspective at Braintree, the measurement conversation and the compliance conversation should happen at the same time. If you are going to track Copilot usage by department and role, make sure your information officer is part of that process, and that your data classification and access controls are set up properly before deployment scales.
What We Recommend
The organisations we see getting the most from Copilot follow a clear pattern. They deploy to a defined group first, measure adoption and outcomes over 60 to 90 days, identify what is working, build internal playbooks, and then expand. Teams with low adoption are not written off. They receive targeted support, structured training sessions, and use cases tailored to their specific workflows.
If your organisation has deployed Copilot and you are not measuring usage, you are missing the information you need to make informed decisions about scaling, training, and budget allocation. The tools exist. The data is available. The question is whether you are using it.
Get in touch with the Braintree Modern Workplace team to set up your Copilot analytics framework and build a measurement strategy that supports your next budget conversation.