Tenable study shows 73% of Indian organisations plan to use generative AI to enhance security measures and align IT objectives with business goals

Tenable®, Inc. has published a new study revealing that 73% of organisations in India plan to harness generative AI (GenAI) within the next 12 months to enhance security measures and align IT objectives with broader business goals. Despite this surge in adoption, the study also reveals a worrying trend: only 8% of organisations express high confidence in their ability to implement GenAI technologies effectively.

The study identified two major challenges hindering Indian organisations from utilising or optimising AI technologies: a lack of technological maturity (71%) and uncertainty about the applicability of AI within their operations (54%).

“Despite the rise of AI, many Indian businesses are still developing their technology maturity and often lack the resources or skills needed to properly create, train, and implement AI, as well as maintain high standards of data governance,” said Nigel Ng, Senior Vice President, Tenable APJ. “The increasing use of cloud services, virtualisation platforms, microservices, applications, and code libraries introduces additional challenges, such as vulnerabilities, cloud misconfigurations, and risks associated with identity access, groups, and permissions. These factors are prompting security professionals in India to explore the best ways to leverage AI for preventive security efforts.”

The data is drawn from the Indian edition of “How to Discover, Analyse and Respond to Threats Faster with Generative AI,” a commissioned study of 826 IT and cybersecurity professionals, including 52 Indian respondents, conducted in October 2023 by Forrester Consulting on behalf of Tenable. The research sheds light on the growing adoption of generative AI within Indian businesses, marking a significant pivot in their strategic focus. It reveals a sense of hopeful anticipation among security leaders about GenAI’s capacity to enhance security measures. At the same time, it underscores the complexity of the path to AI integration as organisations balance innovation against potential risks.

One concern highlighted by the study is that 40% of Indian organisations view GenAI as a greater security threat than an opportunity. This sentiment reflects widespread apprehension about the cybersecurity risks of GenAI implementation. Internal misuse of GenAI emerges as a prominent worry, with 67% of respondents concerned about potential misuse within their organisations. Additionally, 60% of respondents say that providing sensitive data to open-source GenAI tools exposes them to the risk of intellectual property theft.

Despite facing significant challenges in adopting AI technologies, cybersecurity and IT leaders in India are optimistic about the potential benefits of generative AI. They see opportunities for improvement in several key areas: 31% believe generative AI can enhance preventive threat response, 42% think it can automate security measures, and 40% feel it can improve actionability.

However, the effectiveness of AI is contingent on the quality of the data it is trained on. Acknowledging this, 73% of Indian organisations agree or strongly agree that the success of generative AI heavily relies on the data used to fuel it.

“Currently, generative AI is predominantly being used for improving customer experience and operational efficiency. However, the future holds great promise in leveraging GenAI’s evolving capabilities for preventive cybersecurity measures. This shift moves the needle on merging IT and security objectives, and places importance on prioritising AI governance and using the technology responsibly,” Ng added.

To harness the full potential of AI, these organisations are focusing on areas such as training and upskilling cybersecurity professionals (63%), implementing automated reporting and alerting systems (46%), and improving fraud detection capabilities (46%).
