{"id":16,"date":"2024-06-18T06:49:12","date_gmt":"2024-06-18T06:49:12","guid":{"rendered":"https:\/\/m155.mamcungtamlinh.com.vn\/?p=16"},"modified":"2024-06-18T06:49:12","modified_gmt":"2024-06-18T06:49:12","slug":"will-deepfakes-threaten-corporate-security","status":"publish","type":"post","link":"https:\/\/t155.tusksbarandgrill.com\/?p=16","title":{"rendered":"Will Deepfakes Threaten Corporate Security?"},"content":{"rendered":"<p><em>AI-powered deepfake technology is rapidly advancing, and it\u2019s only a matter of time before cybercriminals find a business model they can exploit, some security experts say. Deepfakes, which have already troubled celebrities and politicians, are poised to infiltrate the corporate world, offering cybercriminals a new avenue for profit. CIOs, CISOs, and other corporate leaders must brace for AI-assisted attacks involving realistic but fake voice calls, video clips, and live video conferencing calls.<\/em><\/p>\n<h2>The Evolution of Deepfakes in Cybercrime<\/h2>\n<p>Deepfakes\u00a0involving voice calls are not a recent development. Michael Hasse, a longtime cybersecurity and IT consultant, recalls presenting on the topic to asset management firms as early as 2015 after some companies in the industry fell victim to voice-based scams. However, since 2015, AI technologies for deepfakes have improved significantly and become more accessible.<\/p>\n<p>Hasse notes that the primary factor preventing widespread use of deepfakes by cybercriminals is the lack of a packaged, easy-to-use tool for creating fake audio and video. However, he predicts that such a tool is imminent, likely to appear in the criminal underground before the US elections in November, targeting political campaigns initially. \u201cEvery single piece that\u2019s needed is there,\u201d Hasse says. 
\u201cThe only thing that has kept us from seeing it just flooding everything is that it takes time for the bad guys to incorporate stuff like this.\u201d<\/p>\n<h2>Deepfakes as a Corporate Threat<\/h2>\n<p>It\u2019s not just cybersecurity experts who are sounding the alarm about the corporate risk from deepfakes. In May, credit ratings firm Moody\u2019s issued a warning about deepfakes, highlighting the new credit risks they create. The report detailed several attempted deepfake scams, including fake video calls targeting the financial sector in the past two years.<\/p>\n<p>\u201cFinancial losses attributed to deepfake frauds are rapidly emerging as a prominent threat from this advancing technology,\u201d the report states. \u201cDeepfakes can be used to create fraudulent videos of bank officials, company executives, or government functionaries to direct financial transactions or carry out payment frauds.\u201d<\/p>\n<p>Jake Williams, a faculty member at IANS Research, a cybersecurity research and advisory firm, mentions that deepfake scams are already happening, but the extent of the problem is hard to estimate. Often, scams go unreported to save the victim\u2019s reputation, and in other cases, victims of different types of scams may blame deepfakes as a convenient cover for their actions.<\/p>\n<h2>The Challenge of Detecting Deepfakes<\/h2>\n<p>Technological defenses against deepfakes are cumbersome and may have a limited shelf life due to rapidly advancing\u00a0AI\u00a0technologies. \u201cIt\u2019s hard to measure because we don\u2019t have effective detection tools, nor will we,\u201d Williams, a former hacker at the US National Security Agency, explains. 
\u201cIt\u2019s going to be difficult for us to keep track of over time.\u201d<\/p>\n<p>While some hackers may not yet have access to high-quality deepfake technology, faking voices or images on low-bandwidth video calls has become trivial. Unless a Zoom meeting is of HD or better quality, a face swap may be convincing enough to fool most people.<\/p>\n<h2>Real-world Deepfake Incidents<\/h2>\n<p>Kevin Surace, chairman of multifactor authentication vendor Token, shares a personal encounter with voice-based deepfakes. He received an email from the administrative assistant of one of Token\u2019s investors, which he quickly identified as a phishing scam. When he called the administrative assistant to warn her, the voice on the other end sounded exactly like the employee but responded oddly to questions. It turned out that the phone number in the phishing email was one digit off from the real number, and the fake number stopped working shortly after Surace detected the problem.<\/p>\n<p>\u201cPeople are going to say, \u2018Oh, this can\u2019t be happening,\u2019\u201d Surace says. \u201cIt has now happened to a few people, and if it happened to three people, it\u2019s going to be 300, it\u2019s going to be 3,000, and so on.\u201d<\/p>\n<h2>Potential Uses of Deepfakes in Corporate Crime<\/h2>\n<p>So far, deepfakes targeting the corporate world have primarily focused on tricking employees into transferring money to criminals. However, Surace envisions deepfakes being used for blackmail schemes or stock manipulation. If the blackmail amount is low enough, CEOs or other targeted individuals might opt to pay the fee rather than explain that the person in the compromising video isn\u2019t really them.<\/p>\n<p>Both Hasse and Surace foresee a wave of deepfake scams coming soon. They expect many scam attempts, like the one targeting Surace, are already in progress. \u201cPeople don\u2019t want to tell anyone it\u2019s happening,\u201d Surace says. 
\u201cYou pay 10 grand, and you just write it off and say, \u2018It\u2019s the last thing I want to tell the press about.\u2019\u201d<\/p>\n<h2>Obstacles and Solutions<\/h2>\n<p>While the widespread use of deepfakes may be close, some impediments remain beyond the lack of an easy-to-use deepfakes package. Convincing deepfakes can require significant computing power, which some cybercriminals may lack. Additionally, deepfake scams tend to be targeted attacks, such as whale phishing, which require time to research the target.<\/p>\n<p>Potential victims are inadvertently aiding cybercriminals by sharing extensive information about their lives on social media. \u201cThe bad guys really don\u2019t have a super-streamlined way to collect victim data and generate the deepfakes in a sufficiently automated fashion yet, but it\u2019s coming,\u201d Hasse warns.<\/p>\n<h2>Strategies for Mitigating Deepfake Threats<\/h2>\n<p>With more deepfake scams likely targeting the corporate world, the question is how to address this growing threat. Given the continuous improvement of deepfake technology, there are no easy answers. Hasse believes awareness and employee training are crucial. Employees and executives need to be aware of potential deepfake scams and verify any suspicious requests, even if they come via video call. Making an additional phone call or verifying the request face-to-face is an old-school but effective form of multi-factor authentication.<\/p>\n<p>When the asset management industry first began falling victim to voice scams nearly a decade ago, advisors enhanced their know-your-customer approaches, starting conversations with clients about their families, hobbies, and other personal details to help verify identities.<\/p>\n<p>Another potential defense is for company executives and other critical employees to intentionally lie on social media to throw off deepfake attacks. 
\u201cMy guess is at some point there will be certain roles within companies where that is actually required,\u201d Hasse suggests. \u201cIf you\u2019re in a sufficiently sensitive role in a sufficiently large corporation, there may be some kind of a level of scrutiny on the social media where a social media czar watches all the accounts.\u201d<\/p>\n<h2>Technological and Procedural Measures<\/h2>\n<p>Surace\u2019s company sells a wearable multi-factor authentication device based on fingerprints, which he believes can help defend against deepfake scams. Next-generation MFA products need to quickly and securely verify identities, such as every time employees log into a Zoom meeting.<\/p>\n<p>Williams, however, is skeptical about the effectiveness of new technologies or employee training. Some people may resist using new authentication devices, and cybersecurity training has had limited success over time. Instead, he advocates for procedural changes, such as using secure applications for transferring large sums of money instead of email or voice calls.<\/p>\n<h2>The End of Voice and Image-Based Authentication<\/h2>\n<p>For centuries, people have relied on voices and images to authenticate each other, but that era is ending. \u201cThe reality is that using somebody\u2019s voice or image likeness to authenticate that person has always been, if you look at it through a security perspective, inadequate,\u201d Williams concludes. \u201cTechnology is catching up with our substandard or ineffective processes.\u201d<\/p>\n<p>As deepfake technology continues to advance, corporations must adapt and implement robust security measures to protect themselves from this emerging threat. Awareness, training, and procedural changes will be crucial in mitigating the risks associated with deepfakes. 
By staying vigilant and proactive, companies can better defend against the sophisticated tactics employed by cybercriminals leveraging deepfake technology.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>AI-powered deepfake technology is rapidly advancing, and it\u2019s only a matter of time before cybercriminals find a business model they can exploit, some security experts say. Deepfakes, which have already troubled celebrities and politicians, are poised to infiltrate the corporate&#8230; <\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-16","post","type-post","status-publish","format-standard","hentry","category-cloud"],"_links":{"self":[{"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/posts\/16","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=16"}],"version-history":[{"count":1,"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/posts\/16\/revisions"}],"predecessor-version":[{"id":17,"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=\/wp\/v2\/posts\/16\/revisions\/17"}],"wp:attachment":[{"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=16"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/t155.tusksbarandgrill.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=16"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/t155.tusksb
arandgrill.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=16"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}