75% of Your Team Uses AI at Work. None of That Knowledge Belongs to You.
By Milo Team · April 25, 2026 · 7 min read
There is a version of this problem that agencies have always lived with. A senior account manager leaves. She takes with her three years of hard-won knowledge about a client — how they actually make decisions, what language lands in presentations, which stakeholder is the real blocker, what failed in 2023 and why. None of it was ever written down. It lived in her head. Now it lives nowhere.
That problem hasn't gone away. But in the last two years, it has quietly gotten a new layer — one that most agencies haven't fully reckoned with yet.
Where work actually happens now
In 2024, 75% of knowledge workers reported using AI tools in their day-to-day work. ChatGPT alone is used by 65% of workers who have adopted AI — more than any other tool, by a significant margin. Daily usage doubled in a single year. Among the professionals most likely to be on your team — people in marketing, professional services, and creative work — the numbers are even higher.
This isn't people dabbling. It's people doing real work. They're drafting client briefs, thinking through campaign strategy, summarizing competitive research, workshopping messaging, writing and rewriting copy. They're doing the kind of thinking that used to happen in documents or in their heads — but now it happens in conversation with an AI.
By the numbers
Here is the part that should make you pause: 73.8% of the ChatGPT accounts being used for work are personal accounts. Not corporate. Not enterprise. Personal free or paid accounts that belong to your employee — not to your company.
The same old problem, wearing new clothes
Organizational psychologists have a name for the knowledge that never gets written down: tacit knowledge. It's been studied for decades, and the numbers are consistent and grim. Roughly 80% of an organization's total knowledge base is tacit — it exists in people's minds, not in documents. Studies peg the cost of losing it at staggering levels: IDC estimated that a 1,000-employee company with just 7% annual attrition loses $300,000 per week in knowledge and productivity. The costs compound across the organization — lost productivity, repeated work, and relationship setbacks that never appear on a balance sheet but accumulate steadily.
Studies consistently find that a significant portion of institutional knowledge — estimates run above 40% — lives solely in individual employees' heads. When those employees leave, that knowledge doesn't transfer. It disappears.
That has always been true. What's changed is where that knowledge now lives while those employees are still with you.
Previously, the tacit knowledge problem was invisible — it lived in people's heads and you simply couldn't capture it. Now, your team is externalizing that knowledge every single day, in real time, inside AI tools. They're typing out context, asking questions that reveal what they know, documenting their thinking in the act of prompting. The knowledge is being written down — just not anywhere you can see or keep.
"Every query, instruction, or conversation with ChatGPT is stored indefinitely — unless deleted by the user. Including sensitive data like personal details, proprietary code, or internal business strategies."
— Nightfall AI, 2025
What this looks like at an agency
Picture your account manager — the one who's been running your best client relationship for two years. Over that time, she's had hundreds of conversations in ChatGPT. She used it to prep for client calls. To think through why a campaign underperformed. To draft the deck that turned around a difficult review. To workshop the positioning brief that finally clicked. To summarize a long email thread and identify what the client actually wanted.
All of that is context. All of it is institutional knowledge. And all of it lives in her personal ChatGPT account.
When she leaves — for another agency, for a bigger opportunity, for a career change — that account history doesn't transfer. The chat logs aren't in your systems. You can't export them. You can't hand them to the person who takes over the account. You start from scratch, making the same expensive mistakes, asking the client to repeat themselves, renegotiating trust you'd already built.
This happens across every account, with every employee, on a continuous loop.
The deeper irony
The same tools that make your team more productive are actively accelerating your knowledge loss problem. The better your people are at using AI — the more they rely on it, the more context they pour into it — the more institutional knowledge ends up locked in accounts you don't control and can't access.
This isn't a reason to ban AI use. That ship has sailed, and trying to reverse it would make your team less effective without solving the underlying problem. But it is a reason to think carefully about where that knowledge is going and who owns it.
According to research from Cyberhaven, the amount of corporate data employees put into AI tools increased by 485% between 2023 and 2024. Sensitive data — client information, strategic thinking, proprietary processes — now makes up 34.8% of employee ChatGPT inputs. Your team is building a detailed picture of how your company works, one prompt at a time. That picture belongs to them, not you.
What the solution actually looks like
The answer isn't surveillance. It isn't restricting which tools your team can use. And it isn't asking people to manually document everything they do — they won't, and the handful who try will create incomplete records that are almost as useless as nothing.
The answer is a company-level AI that works alongside your team the same way personal AI tools do — but captures context into a shared memory that belongs to the company, not the individual.
When someone asks Milo about a client, that interaction becomes part of your company's knowledge base. When they use it to think through a project decision, that context is preserved. When they leave, their accumulated knowledge of their accounts, their clients, and their work stays behind — not in a folder that has to be handed over, but as active, queryable memory that the next person can pick up immediately.
The goal isn't to replace the tools your team already uses. It's to make sure the knowledge generated by using those tools doesn't keep walking out the door.
The tacit knowledge problem has existed for as long as companies have existed. For the first time, the technology exists to actually solve it — not by forcing documentation, but by capturing knowledge as a natural byproduct of the work people are already doing.
We're building Milo to solve exactly this.
A company brain that captures the knowledge your team generates every day — so it stays when they don't. We're working with a small group of agencies as design partners. Free access, direct input into what we build.
Apply as a design partner →

Sources
- Microsoft & LinkedIn, "AI at Work Is Here. Now Comes the Hard Part" — 2024 Work Trend Index, May 2024. Survey of 31,000 people across 31 countries. [microsoft.com]
- AIPRM, "AI in the Workplace Statistics," 2024. [aiprm.com]
- OpenAI, "ChatGPT Usage and Adoption Patterns at Work," 2025. [openai.com]
- OpenAI & David Deming (NBER), "How People Use ChatGPT," September 2025. Analysis of 1.5 million conversations. [openai.com]
- Cyberhaven, "Shadow AI: Employee AI Adoption Risks Your Company Data" — Q2 2024 AI Adoption and Risk Report. (73.8% personal accounts; 485% data increase; 34.8% sensitive data.) [cyberhaven.com]
- IDC, "The High Cost of Not Finding Information," 2001. Widely cited industry estimate; original figure from a 1,000-employee, 7% attrition model. [IDC white paper (via computhink.com)]
- Nightfall AI, "Does ChatGPT Store Your Data in 2025?" — on data retention practices. [nightfall.ai]