When comparing Signal vs. Telegram, the Slant community recommends Signal for most people. In the question “What is the best team chat software?” Signal is ranked 2nd while Telegram is ranked 7th. The most important reason people chose Signal is:
Signal uses an advanced end-to-end encryption protocol that provides privacy for every message, every time.
A year ago, we wrote this… //
As brilliant as Orwell was, something continually struck me as incorrect as I read 1984.
Orwell’s government was extraordinarily competent in its totalitarian imposition of technological power.
“In Reality – no government in the history of man has ever been even remotely close to that competent.
“For Orwell’s Big Brother dystopia to become Reality – Big Government would need private sector help.
“Enter private sector Big Tech.
“Big Tech has delivered much of the technology Orwell envisioned….
“(I)t’s Big Tech doing the spying – not Big Government….
“The ONLY way Big Government can impose Big Brother – is to partner with Big Tech.”
Flash forward to now. We’re in the midst of the titanically stupid China Virus shutdown.
And every tyrant – at every level of government – is not letting the crisis go to waste. //
“Witnesses at Thursday’s Senate Commerce ‘paper hearing’ on big data and the coronavirus pandemic largely agreed on one major point: The outbreak underscores the need for a federal privacy law.”
Cory Doctorow’s sunglasses are seemingly ordinary. But they are far from it when seen on security footage, where his face is transformed into a glowing white orb.
At his local credit union, bemused tellers spot the curious sight on nearby monitors and sometimes ask, “What’s going on with your head?” said Doctorow, chuckling.
The frames of his sunglasses, from Chicago-based eyewear line Reflectacles, are made of a material that reflects the infrared light found in surveillance cameras and represents a fringe movement of privacy advocates experimenting with clothes, ornate makeup and accessories as a defense against some surveillance technologies. //
The motivation to seek out antidotes to an over-powerful force has political and symbolic significance for Doctorow, an L.A.-based science-fiction author and privacy advocate. His father’s family fled the Soviet Union, which used surveillance to control the masses.
“We are entirely too sanguine about the idea that surveillance technologies will be built by people we agree with for goals we are happy to support,” he said. “For this technology to be developed and for there to be no countermeasures is a road map to tyranny.” //
The lenses of normal sunglasses become clear under any form of infrared light, but the special wavelength absorbers baked into Reflectacles founder Scott Urban’s glasses soak up the light and turn the lenses black.
Reflectacles’ absorbent quality makes them effective at blocking Face ID on the newest iPhones. While Urban said the glasses aren’t designed to evade facial recognition that doesn’t use infrared light, they will lessen the chance of a positive match in such systems. //
L.A.-based cybersecurity analyst Kate Rose created her own fashion line called Adversarial Fashion to obfuscate automatic license-plate readers. A clothes maker on the side, she imprinted stock images of out-of-use and fake license plates onto fabric to create shirts and dresses. When the wearers walk past the AI systems at traffic stops, the machines read the images on the clothes as plates, in turn feeding junk data into the technology.
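The junk-data mechanism is easy to see in miniature. Below is a hypothetical, deliberately simplified sketch of an ALPR post-processing step (the pattern, function names, and sample strings are all invented for illustration, not taken from any real ALPR product): after OCR, anything shaped like a plate gets logged as a sighting, so a fake plate printed on a shirt passes the same check a real plate does.

```python
import re

# Hypothetical plate-shape check: 2-3 alphanumerics, optional separator,
# then 3-4 more. Real ALPR systems use per-state formats; this is a toy.
PLATE_PATTERN = re.compile(r"^[A-Z0-9]{2,3}[- ]?[A-Z0-9]{3,4}$")

def log_sightings(ocr_strings, database):
    """Append every plate-like OCR result to the sightings database."""
    for text in ocr_strings:
        if PLATE_PATTERN.match(text):
            database.append(text)
    return database

# "7ABC123" is a real plate; "XYZ-9999" is a fake plate printed on fabric.
# Both pass the shape check, so the garment feeds a junk record into the
# database; "hello world" is rejected.
db = []
log_sightings(["7ABC123", "XYZ-9999", "hello world"], db)
```

The point of the sketch is that the system has no way to distinguish a photographed plate from a printed one: anything matching the expected shape is ingested.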
vas pup • January 20, 2020 5:07 PM
From the article - looks like the weakest link:
"Clearview’s app carries extra risks because law enforcement agencies are uploading sensitive photos to the servers of a company whose ability to protect its data is untested."
Photos from government databases are uploaded to private servers with untested security. Just speechless.
The New York Times has a long story about Clearview AI, a small company that scrapes identified photos of people from pretty much everywhere, and then uses unstated magical AI technology to identify people in other photos.
His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system -- whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites -- goes far beyond anything ever constructed by the United States government or Silicon Valley giants.
Federal and state law enforcement officers said that while they had only limited knowledge of how Clearview works and who is behind it, they had used its app to help solve shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.
[...]
But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list. The computer code underlying its app, analyzed by The New York Times, includes programming language to pair it with augmented-reality glasses; users would potentially be able to identify every person they saw. The tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew.
And it's not just law enforcement: Clearview has also licensed the app to at least a handful of companies for security purposes.
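Clearview’s actual matching technique is undisclosed, but the general scrape-index-match architecture the article describes can be sketched generically. In this toy sketch, the “embedding” is a stand-in 8-bit hash for whatever face-feature vector such a service computes, and every name, URL, and number is invented for illustration:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two feature hashes."""
    return bin(a ^ b).count("1")

# 1) Scrape phase: each crawled photo is reduced to a compact hash and
#    stored alongside the URL it was scraped from.
index = {
    0b10110010: "https://example.com/profile/alice",
    0b10110011: "https://example.com/event/alice-at-rally",
    0b01001100: "https://example.com/profile/bob",
}

# 2) Query phase: a probe photo is hashed the same way, and every indexed
#    photo within a small Hamming radius is returned as a candidate match,
#    linking the face in the probe back to the pages it appeared on.
def search(probe: int, radius: int = 2):
    return sorted(url for h, url in index.items() if hamming(probe, h) <= radius)

matches = search(0b10110010)
```

Even this toy version shows why the database is the dangerous part: once the index exists, any uploaded photo becomes a key that unlocks every page the person has ever appeared on.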
Ron Wyden of Oregon, Chris Van Hollen of Maryland, Chris Coons of Delaware, Gary Peters of Michigan, and Ed Markey of Massachusetts sent a follow-up letter to Amazon emphasizing that they are worried about the protection of Ring users’ data and are “concerned about media reports suggesting a lack of respect of the privacy of Ring customers.” Specifically, the letter details concerns that Ring employees in Ukraine were given “virtually unfettered access” to every piece of footage taken from every Ring camera around the world.
Ring’s business model has been a particularly challenging privacy concern to address. Because Ring sells itself as a personal home security system, but has the ability to function like a large-scale CCTV network accessible to police, Amazon can fall back on the idea that consumers chose to build this surveillance network, one porch at a time. When EFF brought its concerns about police partnerships and the harms of ubiquitous surveillance directly to Ring, our concerns were dismissed.
Whether you're new to Windows 10 or have been using it for years, take a minute to lock down your privacy.
Rebecca Wexler has an interesting op-ed about an inadvertent harm that privacy laws can cause: while law enforcement can often access third-party data to aid in prosecution, the accused don't have the same level of access to aid in their defense:
The proposed privacy laws would make this situation worse. Lawmakers may not have set out to make the criminal process even more unfair, but the unjust result is not surprising. When lawmakers propose privacy bills to protect sensitive information, law enforcement agencies lobby for exceptions so they can continue to access the information. Few lobby for the accused to have similar rights. Just as the privacy interests of poor, minority and heavily policed communities are often ignored in the lawmaking process, so too are the interests of criminal defendants, many from those same communities.
In criminal cases, both the prosecution and the accused have a right to subpoena evidence so that juries can hear both sides of the case. The new privacy bills need to ensure that law enforcement and defense investigators operate under the same rules when they subpoena digital data. If lawmakers believe otherwise, they should have to explain and justify that view.
For more detail, see her paper.
A lesson of the FaceApp Challenge? One non-regulatory way to help correct the excesses of Big Tech is for us to become smarter users. //
Say, hypothetically, an app offers a glimpse at what you might look like several decades down the road, all in exchange for one picture. It seems innocent enough.
Before forking over your image, however, consider the totally random possibility that a Russian company would be granted “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.” Hypothetically, of course.
"Clickable" endnotes for Schneier's book... lots of security info to learn here.
Secure your digital self: auditing your cloud identity
An Apple iCloud service hack highlights the need for personal cloud security.
We put more and more of ourselves in the cloud every day. E-mail, device settings, data synchronization between devices, and access to much of our digital selves is tied to a handful of cloud service accounts with Google, Apple, Microsoft, Dropbox, and others. As demonstrated dramatically over the last week, those accounts are easily put at risk if they’re too interconnected—especially since the weakest link in cloud security may be the employees of the providers themselves.
That’s what happened with Wired’s Mat Honan this weekend, when a hacker was apparently able to convince Apple technical support that he was Honan and reset Honan’s iCloud account password. That bit of social engineering allowed hackers to then get access to Honan’s Gmail and Twitter accounts, as well as his access to Gizmodo's Twitter account. He also lost control over his iOS-based devices and was even locked out of his personal computer.
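The interconnection problem in the Honan incident can be modeled as a graph: edges point from an account to the accounts it can reset or log into, and compromising one node surrenders everything reachable from it. The sketch below mirrors the shape of that incident, though the account names and links are illustrative, not a claim about how these services chain today:

```python
from collections import deque

# Each key can reset or unlock the accounts it points to. One weak link
# (a social-engineered iCloud reset) cascades through the whole graph.
recovery_links = {
    "icloud": ["gmail", "ios_devices", "macbook"],
    "gmail": ["twitter_personal"],
    "twitter_personal": ["twitter_gizmodo"],
}

def blast_radius(compromised: str) -> set:
    """All accounts an attacker reaches from one compromised account (BFS)."""
    seen = {compromised}
    queue = deque([compromised])
    while queue:
        for nxt in recovery_links.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

fallen = blast_radius("icloud")
```

Auditing your own cloud identity amounts to drawing this graph for your accounts and cutting the edges you can: separate recovery addresses, hardware second factors, and no single account that can reset all the others.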
As smart speakers and connected devices continue to gain popularity, it’s clear that voice interaction is the next great leap forward in UX design. But how can we as designers help brands responsibly use Amazon’s Alexa, Google’s Assistant, and Apple’s Siri to reach audiences in the clearly private space of the home? If privacy is mostly about perception, we will need to find ways of building trust through absolute transparency, sharing with customers what personal data is being collected and how it is being used. Moreover, we will need to focus product design on giving customers control over their own information by adopting best practices like cookie disclaimers and GDPR compliance.
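The “customers control their own information” idea reduces to a default-deny consent gate: nothing is collected unless the user has explicitly opted into that category. Here is a minimal sketch under that assumption; the class, method names, and data categories are invented for illustration, not any platform’s real API:

```python
class ConsentRegistry:
    """Tracks which data categories a user has explicitly opted into."""

    def __init__(self):
        self._granted = set()

    def grant(self, category: str):
        self._granted.add(category)

    def revoke(self, category: str):
        self._granted.discard(category)

    def may_collect(self, category: str) -> bool:
        # Default-deny: anything not explicitly granted is off-limits.
        return category in self._granted

consent = ConsentRegistry()
consent.grant("voice_transcripts")

allowed = consent.may_collect("voice_transcripts")  # user opted in
blocked = consent.may_collect("purchase_history")   # never granted
```

The design choice that matters is the default: a registry that starts empty and requires an explicit grant is the opposite of the opt-out model most platforms ship with, and it is the one GDPR-style consent requires.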
There’s still a lot to figure out with voice-assisted interfaces, but if the development of IoT platforms follows the path of reinforcing trust, the next decade can hopefully avoid an erosion of privacy and instead bring about its restoration.