- Microsoft launches a preview of a new AI chatbot for security.
- The AI is known as “Microsoft Security Copilot.”
- It’s based on OpenAI’s GPT-4 and uses a new security-specific model from Microsoft.
- The technology helps identify breaches more quickly, makes sense of security data, and automates tasks.
Microsoft has unveiled “Microsoft Security Copilot,” a new AI chatbot designed specifically for cybersecurity in organizations. Similar to Copilot for Office, the security-focused Copilot will assist network administrators in identifying breaches more quickly and help them make sense of different types of security data.
The chatbot is based on OpenAI’s GPT-4 large language model, which Microsoft combines with a proprietary security-specific model to provide assistance and improve security. One unique aspect of this technology is its learning capability, which can create and tune new skills over time. In addition, the Copilot integrates with other Microsoft products, including Microsoft Defender and Microsoft Sentinel.
Security Copilot uses a simple, dark-themed interface where you can ask security-related questions in natural language, such as “What are all the security incidents in the company?” The chatbot then analyzes the query against the 65 trillion signals that Microsoft collects daily to produce an answer.
In addition to using its own intelligence database, Security Copilot can answer questions using information from the Cybersecurity and Infrastructure Security Agency (CISA) and the National Institute of Standards and Technology (NIST) vulnerability databases.
The chatbot can also help with security investigations, create reports, and summarize events. Unlike with Bing Chat AI and Copilot for Office, conversations are saved for transparency in case they ever need to be revisited.
Administrators can pin responses to a shared workspace to share them with colleagues who may be working on the same incident. Security Copilot also includes a prompt book feature that lets you bundle automated tasks and offer them as a single click or prompt. For example, one administrator could share a prompt that reverse engineers a malicious script so that colleagues don’t have to wait for someone else to perform the task.
You can’t use Security Copilot to query the weather, news, sports, or other general topics, but you can use the assistant to create a PowerPoint slide outlining a security problem.
As usual, the company doesn’t recommend blindly relying on the chatbot’s responses, since it can make mistakes. It’s up to the person using the system to double-check the information.
The company has already begun previewing Microsoft Security Copilot with a limited number of customers, but it hasn’t shared when the technology will be available more broadly.