Key Takeaways from Three Recent Influential Privacy Reports
What can we learn from three recent reports into the state of privacy and cyber security from the most influential privacy and cyber security member organisations: the IAPP, ISACA and (ISC)2?
Earlier this year we covered the IAPP Privacy Governance Report 2024, providing an analysis and key takeaways from an Australian perspective. Two other surveys are also worth considering:
- ISACA – Exploring the State of Privacy 2025; and
- (ISC)2 – Global Cybersecurity Workforce Prepares for an AI-Driven World.
Both these reports are discussed in more detail below.
Although all three reports have very different focuses and approaches, there are some common threads that run through them, and it is not great news for privacy professionals:
- Privacy (and cybersecurity) teams are flat-lining. This is not because the risk environment is stabilising but because of economic conditions and tightening budgets;
- Privacy management is still largely manual – while cybersecurity has rapidly moved to adopting a range of technical solutions to support security practice. This is particularly evident in the pace of adoption of AI in cyber security practice. The slow adoption of privacy tech impacts the ability to automate some of the more basic privacy functions;
- Responsibilities for both privacy and cybersecurity practitioners are expanding, particularly with the increasing use of AI. This puts even more pressure on existing limited resources and manual processes;
- Pathways into privacy still seem difficult – with experience as the number one requirement. This seems to have been addressed more effectively in the cyber security space; however, most people moving into the field are still older and coming from an IT background. In both fields, there is a view that skills shortages remain a serious issue.
ISACA – Exploring the State of Privacy 2025
For this report, ISACA surveyed 1,600 global privacy professionals.
Key findings include:
- The median privacy staff size dropped from nine last year to eight this year.
- This is consistent with a drop in privacy budgets: 48% of respondents expect their privacy budgets to decrease in the next 12 months, consistent with last year’s findings.
- Both legal/compliance and technical privacy roles continue to be in demand, with just over 50% of respondents saying there would be an increase in demand in both roles in 2025.
Experience remains the key attribute sought – including experience with different types of technologies and/or applications (61%), experience with frameworks and/or controls (49%), and technical expertise (48%). Despite the experience requirement, the top strategy for addressing the privacy skills gap is training to allow interested non-privacy staff to move into privacy roles (48%).
The top three obstacles respondents indicated were:
- complex international legal and regulatory landscape (38%),
- lack of competent resources (37%),
- management of risk associated with new technologies (36%).
Meanwhile, the most common privacy failures were:
- lack of training or poor training (47%),
- data breach/leakage (42%), and
- not practicing privacy by design (41%).
The survey found greater use of AI for privacy-related work this year than last, and fewer respondents said they have no plans to use AI for privacy than did last year. The use of AI for privacy-related tasks is higher in organisations that are not purely compliance-driven.
In terms of stress, 63% of respondents shared that their privacy roles are more stressful now than five years ago. Reasons for this included technology’s rapid evolution (63%) and the rush by enterprises to adopt new tech, such as generative AI, without adequate consideration of the associated privacy risks. Compliance challenges are another top reason for the increase in stress (61%).
Download the report here.
(ISC)2 – Global Cybersecurity Workforce Prepares for an AI-Driven World (2024)
For this report, ISC2 surveyed a record 15,852 international practitioners and decision-makers. These cybersecurity professionals span the globe from North America to Asia, Latin America, Europe, the Middle East and Africa.
Overall, it found that organisations were impacted by a marked increase in risk and disruption in 2024. Economic pressures, exacerbated by geopolitical uncertainties, have led to budget and workforce reductions, while cybersecurity threats and data security incidents have continued to grow. Alongside these issues, organisations and professionals have had to keep pace with rapidly advancing technology innovations such as artificial intelligence (AI). While their transformative potential has fuelled adoption, such technologies also introduce additional risks and exposure to regulation.
The result was a year in which resources were strained while cybersecurity teams had to respond to new technologies, particularly AI, and protect against the nuanced threats these pose to their organisations.
The report itself is broken down into the following areas:
Economic conditions create resource shortages
A lack of skills was the most challenging aspect of respondents’ jobs over the past 12 months. Although respondents believe changed conditions require more people to secure organisations, organisations are cutting back on both hiring and the professional development of their cyber security teams. 67% of respondents indicated they had a staffing shortage this year, with lack of budget being the main cause of talent shortages and skills gaps (followed by lack of time). 60% of respondents believe that the skills gap significantly impacts their ability to secure their organisation.
Cybersecurity career growth, aspirations and pathways
Entry pathways to the cybersecurity workforce are changing, as are entrants’ priorities – however, 70% of entrants continue to come from IT. Entrants continue to trend older (35% of entrants were 39 to 49 years old), although tenure within careers varied. While IT is the traditional path into cybersecurity, more and more entrants come from different backgrounds or verticals, with these diverse pathways proving equally valuable to success in cybersecurity. Once in, cybersecurity professionals are still focused on higher education and professional development but increasingly prioritise work-life balance (the top-ranked method of deriving meaning from their careers) – because they don’t expect promotions or wage growth.
Respondents are changing their skills approach to prepare for an AI-driven world
AI will likely replace some of the technical skills needed in cybersecurity. While study participants speculated on what skills may be automated or streamlined, they cannot yet predict what activities, if any, AI will replace. As a result of this uncertainty, hiring managers aren’t rushing to hire more specialized workers. Instead, they are prioritizing nontechnical skills like problem-solving, teamwork, collaboration, curiosity and communication that will be transferable through the increased use of AI. These skills ranked higher than technical skills like cloud computing security, risk assessment, analysis and management and AI.
How AI benefits and increases risks for cybersecurity teams
Cyber professionals are generally excited about the potential of AI, with 54% saying it will be helpful to cybersecurity. 45% of cybersecurity teams have incorporated Gen AI into their teams’ tools to help bridge skills gaps and improve threat detection (among other things). However, they don’t believe that AI will replace their entire role.
In comparison, 64% of respondent organizations have implemented Gen AI in other departments, causing more work for cyber professionals. Over half have already faced data privacy and security concerns due to organizational adoption of Gen AI.
In terms of the way forward:
- Formal organisational strategy for Gen AI: Lack of a clear Gen AI strategy was cited as one of the top barriers to its organisational adoption by nearly half (45%) of all participants. Without a clear strategy, organisations struggle to harness the potential benefits of Gen AI while mitigating the associated risks, making a well-defined and comprehensive strategy crucial to guiding the integration and use of Gen AI in cybersecurity practice.
- Gen AI Use Guidelines: However, there is good recognition of the need for regulations and guidelines to govern the safe and responsible use of Gen AI. Almost 90% of respondents said their organisation has a Gen AI use policy, although these policies are not as advanced as they could be: 65% say more regulations on the safe use of Gen AI are needed.
Download the report: (ISC)2 – Global Cybersecurity Workforce Prepares for an AI-Driven World (2024)
More information here: 2024 ISC2 Cybersecurity Workforce Study