PT KAI requires a facial scan to board the train; here is the danger

Cyberthreat.id – After a year of trials, PT Kereta Api Indonesia (KAI) has expanded the use of facial scanning (face recognition) as a condition for boarding its trains.

In an announcement circulating on social media, PT KAI said that starting October 1, 2023, the boarding process at the North Gate of Bandung Station would rely entirely on facial scanning.

A number of netizens asked PT KAI to reconsider the policy, arguing that facial scan data is private personal data.

“My face cannot be forcibly handed over to anyone, and what’s more, its security is not guaranteed,” wrote one user on Twitter, as seen on Monday (October 2, 2023).

PT KAI responded to these complaints by saying that passengers who did not want their faces scanned could still board the train manually.

Other netizens noted that PT KAI also conducts facial scanning at several other stations, such as Pasar Turi Station in Surabaya, Gambir in Jakarta, and Balapan Station in Solo.

Although PT KAI stated that a manual boarding option exists, some netizens said they were not aware of it. Some even said they felt forced to have their faces scanned because the counter agent never mentioned that manual boarding was available.

Other netizens likewise said they were forced to scan their faces when boarding at Gambir Station.

Those who reject the facial scanner generally believe that personal data security in Indonesia is still not a priority for the authorities, even though the Personal Data Protection Law was passed last year.

“Please remove the facial recognition. Not all innovation is good. Not to mention that the risk of biometric data being hacked is 100%. Usually it is insiders who leak the data. Don’t trust your own employees too much,” replied one user.

“Are you sure our data will be safe?” added another.

PT KAI had not answered this question as of this writing. The state-owned company said only that the facial scanner would speed up the boarding process.

Meanwhile, some netizens reported that the system failed to recognize their faces, even though their faces had previously been scanned and entered into the database.

Facial recognition controversy

The use of facial scanning has stirred controversy in a number of countries. In Canada, for example, several provincial governments in December 2021 ordered the facial recognition company Clearview AI to stop collecting, and to delete, facial data of residents obtained without their consent.

In many countries, the use of facial recognition software has raised concerns that the data could be used to violate people’s rights. That is why Amazon investors once urged the company to stop selling facial recognition software to government agencies.

The concern is that facial recognition data is sensitive and must be kept secure. In the UK and Europe, this is covered by the European Union’s General Data Protection Regulation (GDPR): companies that store such data must protect it properly or face heavy fines.

In China, facial scanning technology is used for surveillance and tracking of the minority Uyghur population, which has sparked protests from human rights activists.

In Indonesia, police once arrested the wrong person due to an error in facial recognition technology. A man named Abdul Manaf was named a suspect in the beating of Ade Armando during the April 11 protest last year because his face resembled the perpetrator’s. In reality, Abdul Manaf was never at the scene when the beating took place. This is why the DPR (Indonesia’s House of Representatives) says facial scanning technology carries a risk of wrongful criminalization.

In mid-2020, technology companies including Amazon, Microsoft, and IBM stopped providing facial recognition systems to police after several studies showed algorithmic biases that misidentified people of color.

Facebook, which once introduced a facial recognition feature, has now removed it after facing legal action for building a database of people’s faces without their consent.

In 2019, a company operating a facial recognition system in China called SenseNets neglected to protect a database of 2.5 million people. As a result, the biometric data was exposed on the Internet and discovered by a Dutch cybersecurity researcher, Victor Gevers of the GDI Foundation.

“They created an AI-based security software system for facial recognition, crowd analysis and personal verification. And their business IP and tracking-data records of millions of people were fully accessible to anyone,” Gevers told Forbes.

The database contained KTP (national ID card) numbers, location data from the previous 24 hours, gender, nationality, address, passport-style photo, and date of birth.

SenseNets later protected the database with a firewall, but by then the information had already leaked.

The incident in China highlights the risks associated with storing sensitive information.

Javvad Malik, security advocate at AlienVault, warned that biometric data leaks could have fatal consequences.

“If a password leaks, it can always be changed. But if your facial data and personal data leak, how can you change them?” Malik was quoted as saying by Forbes.

Facial data, after all, can now be used to lock and unlock house doors, unlock cell phones, and even access bank accounts.

Malik therefore stressed that companies responsible for storing this data should implement security and privacy controls at every stage of the process: from development to deployment, and from the endpoint through the network level to the server.

Whatever their reasons for collecting facial scan data, Malik added, companies must ensure the security of their data processing. Otherwise, the impact could be considerable, and potentially devastating.

Ferdinand Stevens