WASHINGTON: The US government on Friday urged tech giants to allow police to read encrypted messages, saying access was essential to prevent serious crime despite privacy concerns.
After Facebook refused to give law enforcement agencies access to encrypted messages, US Attorney General William Barr upped the pressure by issuing an industry-wide call.
"Making our virtual world more secure should not come at the expense of making us more vulnerable in the real world," Barr said in a speech in Washington.
Barr dismissed accusations that the government was seeking a "backdoor" to everyone's private social media messages.
"We are seeking a front door. We would be happy if the companies providing the encryption keep the keys," he said.
Tech giants must abandon "the indefensible posture" that a technical solution was not possible and should develop products to balance cybersecurity with public safety, Barr said.
Facebook already encrypts WhatsApp messages from end-to-end -- meaning only the sender and recipient can read them -- and is working to extend the technology to other apps in its group, including Messenger and Instagram.
Facebook said it was intent on introducing the feature without granting access to law enforcement agencies.
"We hope that industry will be an ally, not an adversary," Barr said.