- The University, Employee
- 14/05/2024
Copilot as a safer alternative to ChatGPT
Cursor got a question about the recent launch of Microsoft Copilot at TU/e. One of our readers wondered why we offer this tool, whether everyone’s privacy is protected and whether it’s worth the money. Cursor decided to dive into the matter. Spoiler alert: yes, you’re safe (double safe, but more on that later) and no, it doesn’t cost the university money.
Most of us know ChatGPT by now: an AI tool released by the American company OpenAI. From the beginning, many people have played with the tool, and that playing is actually teaching it to become ‘smarter’, or ‘dumber’, as you please. But many people don’t realize that what they put into AI chatbots is often no longer their property. In ChatGPT you can now manually flip a switch to stop the system from using your input for training purposes, but it’s not set by default.
TU/e recently introduced Microsoft Copilot. Product owner Wybe Smits: “We see the benefits and potential of generative AI. And Copilot is available within our Office 365 license, hence no extra cost.” TU/e privacy officer Satwik Singh says: “The app has commercial data protection, which is positive for the people using it. But as with any new application, we execute a privacy assessment.”
What does commercial data protection at TU/e look like?
1. Microsoft has no access to the data that is entered via prompts.
2. The data never leaves the TU/e tenant.
3. The data is not used for the training of the model.
These are explicit guarantees provided by Microsoft.
Watch out for Google Translate
Microsoft Copilot (intranet link) is set by default not to use your data for training purposes. Also, once you close the application, your history is not even saved. Now, if you think you’re not so dumb as to enter sensitive data into ChatGPT: good for you. But did you know this also applies to apps like Google Translate?
Many people put sensitive information into a translation tool to quickly get a translated version of their text. Imagine putting an NDA or work contract in there. That brings privacy officer Satwik Singh to the following point: “Always read the guidelines: if you use a generative AI tool like ChatGPT or Copilot, you’re not allowed to enter any personal or sensitive data. That’s protection number one. Number two, in the case of Microsoft Copilot, there is an added protection, which is the way Microsoft handles your data as described above. So even if you make a mistake, there is still a layer of protection.”
Singh: “Privacy always gets our full attention, as big tech does not always have our privacy interests in mind when designing apps. But in the case of Microsoft Copilot we saw an app that includes privacy by default and by design, through the technical measures described above, which made us happy.”
Former Bing Chat
It might be the case that you have used Microsoft Copilot before. “It used to be called ‘Bing Chat Enterprise’,” Smits says.
The Cursor reader who submitted the question was also curious whether the university just rolls out anything Microsoft launches. The answer is no. “Even if it’s included in our license, we remain critical. There needs to be a business benefit, as any launch into the organization also costs us time and effort.”
He gives an example of a Microsoft app that wasn’t launched: “Microsoft Places was assessed but didn’t pass, as it overlapped too much with Planon.”
The privacy requirements are assessed independently from the business requirements, Singh adds. “The advice to use a tool is only given after a stringent test. In the case of Microsoft Copilot, the go-ahead was only given once it had been confirmed that the Commercial Data Protection option was indeed working technically for all users.”
Research purposes
So, with the release of Microsoft Copilot, does TU/e want us to stop using ChatGPT or similar tools? Smits: “No, that’s not the case. With OpenAI (the company behind ChatGPT, ed.) we don’t have a contract, and that puts the responsibility on the user. With Microsoft we have the user protection well figured out, so it’s the preferred tool. But I can imagine some research initiatives might require OpenAI for advanced work. Microsoft Copilot is more for the average user. Advanced users and researchers might want to build something custom in Azure, for example.”
Pub quiz
Singh: “As mentioned before, TU/e already has a privacy and data protection guideline for users of generative AI. In that guideline it’s specifically mentioned that when you use any tool like Copilot, you cannot put any personal or sensitive data in the prompts.” He grins. “And you’re supposed to read the privacy guidelines TU/e has released before you use the software.” The journalist grins too: “I’ll take an educated guess that not all people read those.”
Singh nods. “We’d like to increase knowledge on these topics in a fun way, and therefore we’re organizing a Privacy and Security Pub Quiz on May 27 at Hubble. More details can be found online.” Participants can sign up via this link (intranet link).