Experts raise privacy concerns about Microsoft 365’s Copilot

The IT cooperative SURF has issued a warning against the use of Microsoft 365 Copilot in higher education, claiming that the AI assistant compromises the privacy of students, teachers and researchers. The TU/e privacy team previously advised against its use as well.


Programs like Word, Excel and Outlook now include Microsoft 365’s new AI tool, Copilot, a smart assistant that can analyse data and documents, create presentations or draft emails.

Chief Privacy Officer Ineke Litjens explains that SURF's advice is not about Copilot in general, but specifically about MS365 Copilot. "Simply put, MS365 Copilot is a Copilot on your internal system: Copilot can scan public TU/e documents in Microsoft applications and use them for the output it generates – including, for example, information on our intranet. This does indeed present all sorts of complications – which we also recognize. Of course, it's very confusing that there is Copilot (the generic AI tool) and MS365 Copilot. Both are made by Microsoft, but only MS365 Copilot can incorporate internal documentation (which is public for the entire TU/e) in its responses. And that is what the advice pertains to," she writes in an email response.

Copilot at TU/e

To reassure everyone, Litjens adds that there is no problem with TU/e community members using the "regular" Copilot. Moreover, Microsoft 365 Copilot cannot be used without a paid account. That said, TU/e is currently participating in a pilot project. Dzemal Sukurica, AI Lead for Microsoft Copilot, says: "In May 2024, we started a pilot for Microsoft 365 Copilot at TU/e, together with twenty other educational institutions under the coordination of SURF. Fifty TU/e employees took part in this pilot to investigate what this technology might mean for us. In December 2024, the Data Protection Impact Assessment (DPIA) for Microsoft 365 Copilot was published. Our privacy team will assess the listed points in January to determine the measures under which we can continue the pilot. SURF has confirmed that they will urge Microsoft to make changes to their software. We're currently awaiting the outcome of these discussions. Similar discussions with Microsoft have previously led to good results."

Until then, TU/e will not be offering Microsoft 365 Copilot broadly and will limit its use to a trial project with a small number of users (less than one percent of TU/e employees) to gain experience with generative AI.

Nationwide

SURF, the ICT cooperative for Dutch educational and research institutions, believes that educational institutions should not use MS365 Copilot. Last month, SURF issued an advisory following extensive research as well as consultation with software giant Microsoft.

The lengthy document highlights two key issues: besides concerns about user privacy, Copilot doesn’t always provide accurate or complete information, especially when it comes to queries about people.

Incomprehensible

The personal data Copilot collects may not be limited to basic information like the user’s name and employer, but may also include personal preferences that help the tool write emails and create presentations.

“It’s not clear what personal data Microsoft collects and stores about the use of Microsoft 365 Copilot”, says SURF. Efforts to gain insight into Microsoft’s data collection have yielded little success, as the information provided by the company is both “incomplete and incomprehensible”.

False information

Another problem is the inaccurate or incomplete personal data Copilot may provide in response to a question about, say, a professor or politician. There’s a risk that users will place too much trust in this type of misinformation, says SURF.

Microsoft is well aware of this problem. “Sometimes Copilot will be right, other times usefully wrong – but it will always put you further ahead”, according to a blog post on the company’s website. Evidently, SURF has serious doubts about this supposed ‘usefulness’.

There could also be problems if higher education institutions were to use AI in their admissions processes, or to assess job applicants. Even if educational institutions do decide to allow the use of Copilot, they might have to make exceptions for certain departments.

Big tech

Criticism of big tech and generative AI is certainly nothing new. Major tech companies are often accused of having little regard for intellectual property rights and privacy, while shirking responsibility for the disruption caused by AI. Digital pioneer Marleen Stikker has even argued that higher education institutions have grounds to sue the creators of AI software for damages.

But the conflict between digitalisation and privacy predates the latest technological advances. Over five years ago, well before the advent of generative AI, Dutch university presidents sounded the alarm about the influence of major US tech companies on education. “The student and the teacher are becoming the product; the data no longer belongs to them, or to their institution”, they argued.

Cloud-based data storage raises similar questions: if all our data is stored on American servers, who can access it? Researchers from five universities are now joining forces to create their own cloud over the next two years. Their goal is to become independent of companies like Google and Microsoft.
