Police caution against AI-based crimes

Officials advise the public to look out for deep fake AI-based cybercrimes

HYDERABAD: The police and cybersecurity experts warned against falling prey to cybercrimes, wherein criminals use deepfake AI technology to impersonate others, advising the public to exercise caution and learn ways to safeguard against such attempts.

Instances of exploited videos and profiles, once limited to celebrities on social media platforms, have now extended to ordinary individuals, experts said.

Cyber expert Anil Rachamalla said, “These fake things are made using special apps that use computers to copy real voices and faces. Some of these apps are meant to be entertaining, but fraudsters are misusing them for unlawful purposes. These tricks can make people believe things that aren’t true. It’s important to remember that not everything on the Internet is trustworthy.”

He said people need to be cautious and should not readily trust content that appears suspicious.

“As more people rely on computers and phones, it's crucial to learn how to stay safe and avoid falling for tricks. Knowing how to protect oneself from these tactics is vital. Unfortunately, many women have become victims of these scams, but many are reluctant to come forward due to concerns within their families,” Rachamalla said.

Additional DCP (cybersecurity) K.V.M. Prasad acknowledged that many women feel uncomfortable reporting such crimes to the police, but urged victims not to hesitate in filing complaints and assured them that their information would be kept confidential.

“These fraudulent apps allow individuals to alter videos, audio, and pictures to make them appear genuine. Fraudsters utilize these tactics to steal personal information or manipulate people into believing falsehoods. They often start by establishing a friendly connection, and as the relationship deepens, they use profile photos or images obtained from devices to create videos and then demand money under threats,” the additional DCP said.

(Source: Deccan Chronicle)