Microsoft open-sources tool that helps secure AI and machine learning


What you need to know

  • Microsoft recently open-sourced Counterfit, an automation tool for security testing AI systems.
  • The tool can be used to assess the security of machine learning systems and AI.
  • A Microsoft survey shows that many organizations do not have the right tools in place to secure AI systems.

Earlier this week, Microsoft open-sourced Counterfit, its automation tool for security testing AI systems. The tool can be used to perform security risk assessments of AI and machine learning systems.

Microsoft explains Counterfit in its blog post announcing the open-source release. The company says Counterfit was "born out of our own need to assess Microsoft's AI systems for vulnerabilities with the goal of proactively securing AI services." Initially, the tool consisted of attack scripts written to target specific AI models, but it evolved over time into a more general automation tool.

Microsoft regularly uses Counterfit as part of its AI red team operations. The company uses the tool to automate techniques and then pit them against its AI services.
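To make that concrete, below is a minimal sketch of the kind of evasion attack such red-team tooling automates: a fast gradient sign method (FGSM) perturbation against a toy logistic model. The model, weights, and values here are illustrative stand-ins, not Counterfit's actual API or commands.

```python
import numpy as np

# Toy "model": a logistic regression with fixed weights, a hypothetical
# stand-in for the AI service under test.
w = np.array([1.5, -2.0, 0.5])
b = 0.1

def predict_proba(x):
    """Probability the model assigns to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def fgsm(x, y_true, epsilon=0.5):
    """Fast Gradient Sign Method: nudge each feature in the direction
    that most increases the loss, bounded per feature by epsilon."""
    p = predict_proba(x)
    grad = (p - y_true) * w  # gradient of the log-loss with respect to x
    return x + epsilon * np.sign(grad)

x = np.array([1.0, 1.0, 1.0])          # benign input
adv = fgsm(x, y_true=1.0)              # adversarial copy of the same input
print(predict_proba(x), predict_proba(adv))
```

A tool like Counterfit automates running many such published attack techniques against a deployed model and reporting which ones succeed, rather than requiring a bespoke script per model.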

Matilda Rhode, senior cybersecurity researcher at Airbus, explains why open-sourcing Counterfit is important:

AI is increasingly used in industry; it is vital to look ahead to securing this technology particularly to understand where feature space attacks can be realized in the problem space. The release of open-source tools from an organization such as Microsoft for security practitioners to evaluate the security of AI systems is both welcome and a clear indication that the industry is taking this problem seriously.

There is clearly room for improvement on that front. Microsoft surveyed 28 organizations, including Fortune 500 companies, governments, non-profits, and small and medium-sized businesses, to see what processes they already have in place for securing AI systems. Of those 28 organizations, 25 said they do not have the right tools in place to secure their AI systems.

Sean Endicott
News Writer and apps editor

Sean Endicott is a tech journalist at Windows Central, specializing in Windows, Microsoft software, AI, and PCs. He's covered major launches, from Windows 10 and 11 to the rise of AI tools like ChatGPT. Sean's journey began with the Lumia 740, leading to strong ties with app developers. Outside writing, he coaches American football, utilizing Microsoft services to manage his team. He studied broadcast journalism at Nottingham Trent University and is active on X @SeanEndicott_ and Threads @sean_endicott_.