Research has found that an alarming number of employees are now using their own AI tools at work, without the permission of their organization.
According to a survey by Software AG, half of all knowledge workers – defined as “those who primarily work at a desk or computer” – use personal AI tools.
Most knowledge workers (53%) said they use their own AI tools because they prefer the independence. A further 33% said their organization does not currently offer the tools they need.
This suggests that if businesses want employees to use officially sanctioned tools, they need a different process for deciding which tools are actually made available.
The research goes on to show that personal AI tools are so valuable that nearly half of workers (46%) would refuse to give them up, even if their organization banned them completely.
This is a powerful signal that organizations need more robust and comprehensive AI strategies if they are to avoid inviting significant risk into their business.
In a recent article the BBC spoke to a product manager at a data storage company, which offers its people the Google Gemini AI chatbot.
External AI tools are banned by the company, but the product manager uses ChatGPT through search tool Kagi. He finds the biggest benefit of AI comes from challenging his thinking when he asks the chatbot to respond to his plans from different customer perspectives.
“The AI is not so much giving you answers, as giving you a sparring partner,” he says. “As a product manager, you have a lot of responsibility and don’t have a lot of good outlets to discuss strategy openly. These tools allow that in an unfettered and unlimited capacity.”
He’s not sure why the company has banned external AI. “I think it’s a control thing,” he says. “Companies want to have a say in what tools their employees use. It’s a new frontier of IT and they just want to be conservative.”
It’s an interesting perspective – but Shadow AI comes with significant risks.
Modern AI tools are built by digesting huge amounts of information in a process called training, and around a third of AI applications are trained using information entered by their users.
Consequently, the uncontrolled use of Shadow AI can result in company data being stored in AI services that the employer has no control over, no awareness of, and which may be vulnerable to data breaches.
It’s another example showing that cybersecurity isn’t just about firewalls and encryption – it’s about people. And HR holds the key to making every employee a vigilant defender of the company’s digital assets.
If you would like to discuss how we can help build cybersecurity into the culture of your organization, please get in touch with us today!