Data privacy remained a critical enabler of customer trust in 2023, driving an uptick in demand for privacy roles, even as privacy budgets face considerable cuts over the next 12 months, according to independent surveys conducted by Cisco and ISACA.
Organizations across the globe voiced strong support for privacy laws, and the surveys point to demand for privacy roles picking up in the coming year.
Drawing on a little over 18,000 respondents globally (2,600 for Cisco and 15,500 for ISACA), the surveys captured global observations on privacy compliance and practices as well as major privacy shortfalls.
Data privacy is good for business
Ninety-four percent of Cisco respondents said their customers would not buy from them if they did not protect data properly. Eighty percent said privacy legislation has had a positive impact on their businesses by helping them meet global compliance requirements.
The economics of privacy looked attractive for the year, with 95% of Cisco respondents saying privacy benefits exceeded compliance costs and organizations realized an average return of 1.6x on their privacy investments.
“Cisco’s report demonstrates that data privacy is much more than a compliance issue. Almost all organizations have come to believe that privacy is a business imperative with positive and measurably significant ROI,” said Jack Poller, an analyst at ESG Global.
As privacy investment recorded positive returns globally, organizations have ramped up hiring for roles with privacy credentials. Twenty-five percent of ISACA respondents said their organizations had open legal/compliance privacy roles, while 31% reported open technical privacy positions. Albeit lower than last year’s numbers (27% and 34%, respectively), the stats reflect good hiring sentiment for the segment amid budget pressures, according to the ISACA report.
Privacy still faces setbacks
The privacy segment saw slow progress on transparency and AI readiness. Sixty-two percent of consumers are concerned about how organizations apply and use AI, and 60% have already lost trust in organizations over their AI practices, according to a parallel Cisco study.
“We asked organizations about (customer AI readiness), and 91% of respondents said their organizations needed to do more to reassure customers that their data was only being used for intended and legitimate purposes when it comes to AI,” the Cisco report added. This was at 92% last year, reflecting very little progress.
Ninety-two percent of Cisco respondents said they see generative AI as a fundamentally different technology with novel challenges and concerns requiring new techniques to manage data and risk. Among the top concerns with the technology, 69% said it could hurt their organization’s legal and intellectual property rights, 68% feared public and competitors’ sharing of uploaded content on GenAI tools, and 68% questioned the authenticity of the data returned by these tools.
“As I expected, there is a myriad of open privacy issues with using GenAI,” Poller said. “And that’s because GenAI is a black box where organizations have little to no visibility into how the AI engine incorporates input data (prompts) into outputs. There is a risk that input data could be improperly used in the opaque decision-making process, and that this data could be improperly exposed.”
Sixty-three percent of ISACA respondents, who already believed their privacy budgets were underfunded, said they feared funding would decrease further in the next 12 months. Only 8% said so in a study conducted in 2021.
“The most surprising finding is that despite new regulations, privacy budgets are shrinking,” Poller added. “I believe this reflects both pessimism about current economic conditions and an overconfidence in AI-based tools.”