Monday, April 20, 2026

A means of electronic infiltration: “chatbots” are a friend that keeps no secrets

by Marwane al hashemi


The Director of the Cybercrime Department at the General Department of Criminal Investigation at Dubai Police, Brigadier Saeed Al Hajri, has warned against carelessness in providing personal data to artificial intelligence applications, specifically chatbots such as “ChatGPT”, noting that some people treat these applications as if they were a safe virtual friend, when they may in fact be a window for electronic infiltration and hacking.

Al-Hajri told Al-Emarat Al-Youm: “Artificial intelligence applications have become very important to a huge number of users, given the support they provide across all sectors, such as preparing research, writing articles, and responding to emails. However, there is a weakness in the culture of using them.”

He added: “With the introduction of the voice feature, which has impressed many, the personality of the application user has become an open book for these advanced technologies. You will find someone telling the artificial intelligence bot everything about his life; he may even volunteer his worries and personal circumstances.” He stressed that “this is wrong behavior, because artificial intelligence has tremendous analytical capabilities and an infinite memory.”

He pointed out that “some people deal with applications such as ChatGPT as if they were human beings, assuming they are prone to forgetting and losing what is stored in them over time. But the magic the user finds when talking to artificial intelligence, which responds in simple, easy language as if it senses him and understands his needs, comes only after the person has supplied it with data about himself, his work, and his country. From here fears have emerged that these applications represent a major security breach, not only at the level of individuals but even at the level of states.”

He advised that “dealing with artificial intelligence applications should be on an as-needed basis, without giving them any details about one’s personal life, limiting oneself instead to general questions about the topic for which assistance is sought.”

“There is another major problem we must pay attention to, related to the erosion of mental abilities as a result of relying on these bots,” he added, noting that “it was natural in the past for each of us to be keen to know roads and directions so that we could move about easily without getting lost, yet now many prefer to use GPS technologies even on the way to a place they know well.”

He added: “The same applies to the increased reliance on bots or artificial intelligence to write research, translate an article, rephrase a text, and even respond on a person’s behalf. These harms are compounded among younger groups, specifically students, because they reduce the efficiency of their mental abilities at an early age, in addition to being the most vulnerable to the cybercrime risks associated with these technologies.”

He explained that “artificial intelligence technologies, like anything else, are a double-edged sword: they may be used for beneficial purposes, such as improving quality of life, maintaining security, and providing information and assistance in various matters, or they may be used by cybercrime professionals for hacking, fraud, or breaching systems.”

He stressed that “Dubai Police and other police agencies use big data and artificial intelligence within a security system to maintain security and the course of justice. On the other hand, they monitor the misuse of these technologies and work with partners to limit them and hold perpetrators accountable.”

Al-Hajri said: “The train of development moves on without stopping, and the concerns associated with artificial intelligence are constantly renewed. Opinions differ on the best ways to deal with these developments, but it is essential to educate individuals about the scale of the challenges and the mechanisms of safe use.”

“Data is the new oil, except that it does not need to be extracted. Just as oil emerged in the last century and became a source of wealth, today, as soon as you turn on your phone, a person’s data, secrets, and life details flow through applications that relay everything happening around them via the tools available on the phone, such as the microphone and camera. This is in addition to the data the user volunteers himself, having already signed usage agreements that allow those applications to commodify him as a mass of data,” he said.

He pointed out that privacy ends the moment you connect to the Internet. Most applications are managed by artificial intelligence, which determines location through GPS and the network in use, and stores images and information in an infinite memory. For this reason, European countries have intervened to impose restrictions on how these applications operate, in order to protect their citizens’ privacy.

Al-Hajri stressed the importance of establishing an ethical framework for companies that trade in data, or that provide these services. Such a framework should first take into account the supreme interests of the state and respect its privacy and the traditions of its people. It should then respect the humanity of the user, not turning him into a commodity or a target for every kind of trade, and it should include rules that curb extremism. These companies must also understand that there are particularities that differ from one society to another: what can be imposed or marketed in one place may not be appropriate in another.


Risks to teenagers

Recent studies have warned of several risks associated with students relying on chatbots or artificial intelligence applications to conduct research or complete academic assignments, noting the possibility of cheating and plagiarism of articles or research belonging to others, as well as the possibility of relying on false or inaccurate information, or absorbing biased content of which the student is unaware.

Studies have also shown that children may develop an excessive addiction to communicating with or using chatbots, at the expense of other social activities, such as exercise or cultural interaction.

The studies pointed out that applications such as “ChatGPT” may hinder critical thinking, as they limit the development of information analysis and inference skills, in addition to their negative impact on creativity and innovation, given that they provide ready-made solutions to problems. Studies have also warned that modern applications can violate privacy and expose the user’s personal information, without his realizing the risks of disclosing the data.

New opportunity for cybercriminals

Many hail the emergence of chatbots as a valuable opportunity for businesses, but they have also proven to be an opportunity for cybercriminals.

Technicians said that the ability to use chatbots without an account raises security concerns about advanced artificial intelligence being turned to malicious purposes.

They stressed that this makes it difficult to predict an attack and the method used to carry it out.


Fears of becoming emotionally attached to chatbots

Technicians have warned of human risks facing users of the chatbot ChatGPT, expressing concern that users may become emotionally attached to the advanced technology. Experts attributed their concern to the risks of the lifelike voice used by the bot, whose human qualities can double the strength of the relationship with the user.

Technology companies are racing to develop generative AI models, such as ChatGPT, aiming to multiply their bots’ ability to clone human voices for virtual assistants.

Technicians say these companies aim to make the bot’s voice more realistic, raising concerns about potential abuses of voice technology and about how attached users may become to the bot.

They said that a voice with human qualities might tempt some users to become emotionally attached to the chatbot, and that users’ attachment to automated interactions with the AI model would come at the expense of human relationships.

Real-life experiments conducted by specialized companies revealed a tangible emotional attachment among the human participants, to the point that some expressed sadness on the last day of the experiment.

• Once the phone is turned on, a person’s data, secrets, and life details are transmitted through applications designed for this purpose, and the introduction of the voice feature raises the level of risk.

• Dubai Police monitors the misuse of technology and works to limit it and hold perpetrators accountable.

• Privacy varies according to society, and what can be imposed or marketed in one place may not be appropriate in another.
