FBI has an ‘AI’ warning for you – Times of India

The US Federal Bureau of Investigation (FBI) has warned that hackers are using generative artificial intelligence (AI) tools such as ChatGPT to create malicious code and carry out cybercrime more easily.

The agency raised these concerns during a conference call with reporters, revealing that AI chatbots have been used for a range of illicit activities: scammers and fraudsters refining their techniques, and terrorists consulting the tools on how to launch more destructive chemical attacks.
On the call, an unnamed bureau official said there are two main risks associated with the use of AI.

The first is "model misalignment," where AI software is developed or deployed in a way that produces undesirable outcomes. The second is the direct "misuse of AI" to support other criminal operations.
The official said the bureau expects the use of AI models to grow as the technology is adopted and democratised. Unfortunately, bad actors can also use AI to enhance their criminal activities.
Cybercriminals are using AI to create new malware attacks and delivery methods, such as AI-generated phishing websites and polymorphic malware that can bypass antivirus software. The FBI also warns of scammers using AI to create sexually explicit deepfakes for extortion, and of criminals enhancing traditional scams, for instance by using AI voice-cloning technology in scam phone calls.
The FBI did not disclose which AI models criminals are using. However, an official said hackers gravitate towards free, customisable open-source models, as well as private AI programs developed by hackers and sold on cybercriminal forums.
The officials also said foreign actors are increasingly targeting US companies, universities, and government research facilities to collect material that advances AI development, including algorithms, data, expertise, computing infrastructure, and even talented individuals.

