https://www.mattblaze.org/blog/neinnines/
Compromised reveals that the FBI found that, during at least some of the time the illegals were under investigation, the Russian numbers intended for them were sent not by a transmitter in Russia (whose signal might have been difficult to receive reliably in the US), but relayed by the Cuban shortwave numbers station. This is perhaps a bit surprising, since the period in question (2000-2010) was well after the Soviet Union, the historic protector of Cuba's government, had ceased to exist.
The Cuban numbers station is somewhat legendary. It is a powerful station, operated by Cuba's intelligence directorate but co-located with Radio Habana's transmitters near Bauta, Cuba, and is easily received with even very modest equipment throughout the US. While its numbers transmissions have taken a variety of forms over the years, during the early 2000s it operated around the clock, transmitting in both voice and Morse code. The station was (and remains) so powerful and widely heard that radio hobbyists quickly derived its hourly schedule. During this period, each scheduled hourly transmission consisted of a preamble followed by three messages, each made up entirely of a series of five-digit groups (with a brief period of silence separating the three messages). The three hourly messages would take a total of about 45 minutes, in either voice or Morse code depending on the scheduled time and frequency. Every hour, the same thing, predictably right on schedule (with fill traffic presumably substituted for the slots during which there was no actual message).
If you want to hear what this sounded like, here's a recording I made on October 4, 2008 of one of the hourly voice transmissions, as received (static and all) in my Philadelphia apartment: www.mattblaze.org/private/17435khz-200810041700.mp3. The transmission follows the standard Cuban numbers format of the time, starting with an "Atención" preamble listing three five-digit identifiers for the three messages that follow, and ending with "Final, Final". In this recording, the first of the three messages (64202) starts at 3:00, the second (65852) at 16:00, and the third (86321) at 29:00, with the "Final" signoff at the end. The transmissions are, to my cryptographic ear at least, both profoundly dull and yet also eerily riveting.
I just made a backup of an entire hard drive (50GB) over ssh via:
dd if=/dev/hda | buffer -s 64k -S 10m | ssh myuser@myhost "cat > ~/image.img"
What's now the best way to check the integrity ...
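One straightforward answer is to hash both ends and compare. This sketch reuses the device, host, and image names from the command above; substitute your own:

```shell
# Compare SHA-256 hashes of a block device and a remote image file.
# Usage: verify_image /dev/hda myuser@myhost '~/image.img'
verify_image() {
    dev="$1"; host="$2"; img="$3"
    local_sum=$(dd if="$dev" bs=64k 2>/dev/null | sha256sum | cut -d' ' -f1)
    remote_sum=$(ssh "$host" "sha256sum $img" | cut -d' ' -f1)
    if [ "$local_sum" = "$remote_sum" ]; then
        echo "OK: checksums match"
    else
        echo "MISMATCH: image differs from source" >&2
        return 1
    fi
}
```

Note that this re-reads the whole source device, so it only proves a match if the disk hasn't changed since the copy, i.e. it was unmounted or mounted read-only during both the backup and the check.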
Turns out that plugging a bunch of computers into our electrical grid that do nothing but draw current and hash through algorithms has had some negative environmental impacts. Recent studies suggest that Bitcoin-related power consumption has reached record highs this year, with more than seven gigawatts of power being drawn in the pursuit of the suspect digital currency. Today's bitcoin mining operations range from a single user running a dedicated desktop machine to 50,000 state-of-the-art rigs installed in a Kazakhstan warehouse, all hashing through the Bitcoin consensus algorithm faster than the competition in order to maximize the number of block rewards received. //
A study from the Cambridge Center for Alternative Finance released on Monday estimates that the global bitcoin mining industry uses 7.46 GW, equivalent to around 63.32 terawatt-hours of annualized energy consumption. The study also notes that miners are paying around $0.03 to $0.05 per kWh this year. Given that a March estimate put the cost to mine a full bitcoin at around $7,500, the average miner still stands to make over $4,000 in profit from the operation. //
The total amount of processing power dedicated to mining, known as the hashrate, is currently hovering around 120 exahashes per second (EH/s). However, industry analysts argue that that figure will soon increase.
“By our assessment, the Bitcoin network can exceed 260EH/s in Hashrate in the next 12–14 months,” according to a July study from Bitooda. “Led by a modest increase in available power capacity from 9.6 to 10.6GW and an upgrade cycle that will replace older generation S9 class rigs with newer S17 and next-generation S19 class rigs.”
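As a sanity check on the Cambridge figures above (my own back-of-the-envelope arithmetic, not from either study):

```python
# Back-of-the-envelope check: does 7.46 GW of sustained draw square
# with the quoted terawatt-hour figure, and what does the electricity
# bill look like at the quoted $/kWh range?
power_gw = 7.46                    # estimated sustained draw
hours_per_year = 24 * 365
energy_twh = power_gw * hours_per_year / 1000   # GWh -> TWh

# Industry-wide electricity cost at the $0.03-0.05/kWh range miners pay.
cost_low = energy_twh * 1e9 * 0.03 / 1e9    # billions of dollars/year
cost_high = energy_twh * 1e9 * 0.05 / 1e9

print(f"{energy_twh:.1f} TWh/year")   # ~65.3 TWh, near the study's 63.32 TWh
print(f"${cost_low:.1f}-{cost_high:.1f} billion/year in electricity")
```

The small gap between ~65 TWh and the study's 63.32 TWh presumably reflects the study's measurement window rather than a flat year-round draw.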
DiceKeys is a physical mechanism for creating and storing a 192-bit key. The idea is that you roll a special set of twenty-five dice, put them into a plastic jig, and then use an app to convert those dice into a key. You can then use that key for a variety of purposes, and regenerate it from the dice if you need to. //
Note: I am an adviser on the project. //
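For intuition, here is a minimal sketch of turning a scanned dice configuration into a 192-bit key. The grid encoding, the SHA-384 hash, and the truncation are my own illustrative choices, not the project's actual key-derivation procedure:

```python
import hashlib

# Hypothetical scan result: for each of the 25 grid positions, the die's
# face letter, face digit, and orientation (0-3 quarter turns). The
# values here are made up purely for illustration.
scan = [("A", 1, 0), ("B", 3, 2), ("C", 6, 1)] + \
       [(chr(ord("D") + i), (i % 6) + 1, i % 4) for i in range(22)]

def dice_to_key(scan):
    """Hash a 25-die configuration down to a 192-bit (24-byte) key."""
    assert len(scan) == 25
    # Canonical string form: one token per die, in fixed grid order, so
    # the same physical dice arrangement always yields the same key.
    canonical = "|".join(f"{letter}{digit}r{rot}" for letter, digit, rot in scan)
    return hashlib.sha384(canonical.encode()).digest()[:24]

key = dice_to_key(scan)
print(len(key) * 8)  # 192
```

The regeneration property falls out of the determinism: put the same dice back in the jig, get the same canonical string, get the same key.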
Q • August 24, 2020 6:57 AM
I would be very, very uncomfortable about having not only the physical key available to any thief or agent that decides they want it, but also having it available in a vulnerable, leaky phone and app that all manner of companies and people can spy upon. Just no.
Stuff like this should only be in my head IMO. Then I get to decide if it ever gets revealed to someone else.
Ari Trachtenberg • August 24, 2020 7:20 AM
"mathematically unguessable key" - really Bruce? You know better than to taunt the hackers with hyperbolic language. If anything, the key is "practically unguessable" - but mathematically, it is certainly guessable.
z • August 24, 2020 7:46 AM
Good, cheap and simple random number generator. But it's not that special; it's just very practical, and it gets rid of most trust issues. But dice quality can affect entropy, so don't let any academic measure your dice :-D
Sam • August 24, 2020 9:54 AM
What problem is this solving exactly? There are many problems across the entire crypto ecosystem, but "keys not being random enough" is pretty damn low on the list, and certainly not outweighed by sending your key through a computer-vision-based web app.
There are lots of very smart people doing fascinating work on cryptographic voting protocols. We should be funding and encouraging them, and doing all our elections with paper ballots until everyone currently working in that field has retired.
Prepare for another attack on encryption in the U.S. The EARN IT Act purports to be about protecting children from predation, but it's really about forcing the tech companies to break their encryption schemes:
The Graham-Blumenthal bill would finally give Barr the power to demand that tech companies obey him or face serious repercussions, including both civil and criminal liability. Such a demand would put encryption providers like WhatsApp and Signal in an awful conundrum: either face the possibility of losing everything in a single lawsuit or knowingly undermine their users' security, making all of us more vulnerable to online criminals. //
Matthew Green has a long explanation of the bill and its effects:
The new bill, out of Lindsey Graham's Judiciary committee, is designed to force providers to either solve the encryption-while-scanning problem, or stop using encryption entirely. And given that we don't yet know how to solve the problem -- and the techniques to do it are basically at the research stage of R&D -- it's likely that "stop using encryption" is really the preferred goal. //
So in short: this bill is a backdoor way to allow the government to ban encryption on commercial services. And even more beautifully: it doesn't come out and actually ban the use of encryption, it just makes encryption commercially infeasible for major providers to deploy, ensuring that they'll go bankrupt if they try to disobey this committee's recommendations.
It's the kind of bill you'd come up with if you knew the thing you wanted to do was unconstitutional and highly unpopular, and you basically didn't care. //
Undermining trust is a dangerous thing. Remember that.
The Whisper Secret-Sharing App Exposed Locations
This is a big deal:
Whisper, the secret-sharing app that called itself the "safest place on the Internet," left years of users' most intimate confessions exposed on the Web tied to their age, location and other details, raising alarm among cybersecurity researchers that users could have been unmasked or blackmailed.
[...]
The records were viewable on a non-password-protected database open to the public Web. A Post reporter was able to freely browse and search through the records, many of which involved children: A search of users who had listed their age as 15 returned 1.3 million results.
[...]
The exposed records did not include real names but did include a user's stated age, ethnicity, gender, hometown, nickname and any membership in groups,
Privacy experts warn that the EARN IT Act is yet another a thinly-veiled attempt by government officials to kill encryption. Why is this so bad? As Senator Ron Wyden explains, "You can't only build a backdoor for the good guys ... Once you weaken encryption with a backdoor, you make it far easier for criminals and hackers and predators to get into your digital life."
And that's exactly what's happened when our government has inserted backdoors into encrypted services before: malicious hackers have gained access to communication systems, power grids, and even nuclear facilities. Even worse, when criminals and authoritarian governments know that platforms like Facebook Messenger and WhatsApp are not safe for them to use, they simply turn to less regulated alternatives. For all these reasons and more, many current and former security officials support making encryption stronger, not weaker.
But DOJ officials have shrugged off these legitimate concerns, defending their dangerous intentions with a pattern of misleading claims that ignore the risks of breaking encryption and overstate the impact that breaking it would actually have on criminal investigations.
Our law enforcement and intelligence agencies routinely abuse their overly-broad, unconstitutional surveillance powers to spy on journalists and racial justice advocates. Giving our government even more powers will only result in more abuses against vulnerable individuals and political dissenters.
Semaphor is now free to use
As the world deals with the COVID-19 virus, many companies are faced with the challenges of remote work for the first time. SpiderOak is pleased to offer Semaphor free of charge to any organization, group, family, or individual who needs secure group messaging and file sharing.
We built Semaphor because, as a distributed team, we needed a way to communicate without the risks of email or off-the-shelf collaboration tools. We made Semaphor, group messaging with private blockchain encryption.
US-based Firefox users get encrypted DNS lookups today or within a few weeks. //
I am of two minds on the privacy benefits of DoH/DoT, but my current feeling is that it's not worth bothering with because the benefits don't fit the common use cases.
On one hand, the idea of concealing your DNS lookups from your ISP feels like a positive one. Your ISP can still sniff your SNI requests and see where you're browsing, so it doesn't necessarily gain you any privacy, but it does at least make it more difficult for them to casually spy on you and aggregate your DNS lookups into a salable package.
On the other hand, giving all of your DNS lookups to Cloudflare or NextDNS potentially allows Cloudflare or NextDNS to... casually spy on you and aggregate your DNS lookups into a salable package. And your ISP can still see your SNI requests. So in a way, you're potentially inviting more people to watch you, not fewer.
I used DoH for most of last year, but there's a pretty strong argument to be made that you're better off running your own local recursive resolver with qname minimization enabled. This means your DNS requests are not encrypted, but it also means that you're directly doing the entire lookup yourself, which greatly reduces your vulnerability to DNS poisoning.
More to the point, I'm no longer certain there's much benefit at all of obscuring your DNS lookups if the purpose of that obfuscation is to hide activity from your ISP. A bit more than 95% of sites have a unique page-load fingerprint and that makes figuring out what site you're visiting solely by IP address a trivial task regardless of DNS obfuscation.
With all of that in mind, I've ditched DoH/DoT and just set up unbound in full recursion mode. It's fast and it works great.
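For anyone wanting to replicate that setup, here's a minimal unbound configuration sketch. The option names are from unbound.conf, but verify them against your installed version's defaults:

```
server:
    interface: 127.0.0.1
    access-control: 127.0.0.0/8 allow
    # No forward-zone defined, so unbound does full recursion itself,
    # walking the delegation chain down from the root servers.
    qname-minimisation: yes
    # Reasonable hardening options to pair with it:
    harden-glue: yes
    harden-dnssec-stripped: yes
```

Point your system resolver (e.g. /etc/resolv.conf) at 127.0.0.1 after starting the daemon.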
If Sen. Lindsey Graham gets his way, the federal government will launch another attack on online privacy. The South Carolina Republican will ask lawmakers to give Attorney General William Barr and the Department of Justice unchecked access to all of your messaging, file-sharing, and video-sharing…
In the 2018 midterm elections, West Virginia became the first state in the U.S. to allow select voters to cast their ballot on a mobile phone via a proprietary app called "Voatz." Although there is no public formal description of Voatz's security model, the company claims that election security and integrity are maintained through the use of a permissioned blockchain, biometrics, a mixnet, and hardware-backed key storage modules on the user's device. In this work, we present the first public security analysis of Voatz, based on a reverse engineering of their Android application and the minimal available documentation of the system. We performed a clean-room reimplementation of Voatz's server and present an analysis of the election process as visible from the app itself.
We find that Voatz has vulnerabilities that allow different kinds of adversaries to alter, stop, or expose a user's vote, including a side-channel attack in which a completely passive network adversary can potentially recover a user's secret ballot. We additionally find that Voatz has a number of privacy issues stemming from their use of third-party services for crucial app functionality. Our findings serve as a concrete illustration of the common wisdom against Internet voting, and of the importance of transparency to the legitimacy of elections. //
The company's response is a perfect illustration of why companies without computer-security expertise have no idea what they're doing, and should not be trusted with any form of security.
Does anyone know how exactly they backdoored the machines?
Yes and no. What we do know is that secure and insecure machines had to coexist in the same product line, with the insecure ones interoperating seamlessly with the secure machines so as not to raise any "red flags" by failing to interwork...
We know from the book "Spycatcher", written by Peter Wright and published in 1987, that one method was to supply an "algorithmically secure" machine, but with an "acoustic side channel" that leaked key information. Basically, MI5 had gained audio access, via an "infinity device", to the "Crypto Cell" at the Egyptian Embassy in London. They could thus hear the mechanical cipher machine running. Whilst this did not give the "key", it did give the "wheel" starting points, turn-over points, and which wheels rotated at any time. This reduced the "attack space" GCHQ had to deal with from "months to minutes".
As for affecting the "key stream": back in WWII the strategic (not tactical) German high-level cipher machine was the Lorenz teletype cipher machine. It used 12 cipher wheels with "movable lugs" on the wheel periphery, which caused a "key stream" to be built by XORing the lug positions. The wheel sizes were essentially "prime to each other", thus whilst they were only 23 to 61 steps each, their combined sequence length was the product of their step sizes, which is immense. At Bletchley Park the traffic from these machines was codenamed "Fish" and the machine "Tunny". The work of two men broke the machine sight unseen, due to a mistake made by a German operator. There are various pages on the web that will give you as little or as much information on it as you would like.
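The period-multiplying effect of mutually prime wheel sizes is easy to demonstrate. A toy illustration, with three small wheels and made-up lug patterns rather than the real Lorenz sizes or settings:

```python
from math import lcm

# Three toy "wheels" with pairwise-coprime sizes; each entry is a lug
# bit (1 = lug present). Sizes and lug patterns are invented for
# illustration -- the real Lorenz machine used 12 wheels of sizes
# 23 to 61.
wheels = [
    [1, 0, 1],              # size 3
    [0, 1, 1, 0, 1],        # size 5
    [1, 1, 0, 1, 0, 0, 1],  # size 7
]

def keystream_bit(t):
    """XOR the active lug of every wheel at step t."""
    bit = 0
    for w in wheels:
        bit ^= w[t % len(w)]
    return bit

# The combined stream repeats only after lcm(3, 5, 7) = 105 steps,
# even though each individual wheel repeats after at most 7.
period = lcm(*(len(w) for w in wheels))
stream = [keystream_bit(t) for t in range(2 * period)]
assert stream[:period] == stream[period:]
print(period)  # 105
```

The flip side, as the comment goes on to explain, is that the underlying structure (a few short cycles XORed together) is exactly what made statistical attacks on Tunny possible.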
But what you need to remember is that,
1, The failings of the Lorenz machine are shared by many other machine ciphers not just mechanical ones.
2, Virtually all machine ciphers pre AES have both strong and weak keys with a range in between.
The US field cipher based on the Boris Hagelin coin-counting mechanism suffered from the second issue; in fact it had rather more weak keys than strong. This was not a problem for the US military, as they "issued key schedules centrally": knowing which keys were strong and which were weak, they only ever used the strong keys. The knowledge of weak and strong was, as far as we can tell, worked out by William F. Friedman, and it was deliberately implemented as such by him. That is, the big weakness of any field cipher machine is that the enemy will capture it, and may well end up using it or copying its design to make their own machines (see the history of Enigma-type "rotor" machines to see that in action).
Thus the reasoning was: either the enemy is smart and will know about the strong and weak keys, in which case nothing is won or lost. However, if they do not, and assume all keys are the same, then your cryptanalysis team has just been given a great big bonus to make their lives easier. What was not known then, and is still not widely recognised, was the British invention of Traffic Analysis in all its forms, and the huge card-file database they used with it. This enabled them to identify specific traffic circuits and individual operators without the use of cryptanalysis. Which gave not just vast amounts of "probable plaintext" but also "probable cillies" and other bad operator habits. All of which made the breaking of even strong keys very much easier. Thus traffic under weak keys becomes a lever to put in the cracks of strong keys...
What is also known is that Crypto AG supplied customers not just with the actual crypto machines but with a whole lot of key-generation support... This was in the form of manuals and machines, all of which pushed Crypto AG customers into producing either "weak key schedules" or "known key schedules", but the actual encryption machines worked identically to those of customers who used "secure key schedules" and thus were fully compatible, so no red flags were raised.
The thing that we forget these days is that designing crypto kit is actually a hard process. Whilst it's easy to come up with complex algorithms, they are almost impossible to implement in a mechanical system that is reliable in use. Likewise for their pencil-and-paper analogues. They are also eye-wateringly expensive to make. If you are ever lucky enough to get your hands on just a single Enigma rotor, you will see it is superbly engineered from many parts, each of which requires a great deal of engineering; thus there are hundreds of hours of work in each Enigma machine, even though the outer wooden box might look crude to modern eyes. So only fairly simple algorithms got implemented, based on minor variations to odometer or coin-counting mechanisms.
Until DES came along, nearly all "electronic" cipher machines were based on simple circuits like shift registers and SR latches. In most respects many were just simple copies of mechanical cipher algorithms. So the likes of a Lorenz wheel became a "ring counter with reset" and the lugs were replaced by a "plug board"; the algorithm remained the same, along with all its weaknesses... Even when put in software on 4- and 8-bit CPU systems, or later microcontrollers, those old defective mechanical algorithms came along as "counters mod N" driving "lookup tables"... In part this happened due to "inventory costs": if you've invested a fortune in mechanical cipher systems, you want your new shiny electronic systems to be compatible, and likewise those that are CPU based. It's the same old "legacy issue" that almost always works more for your enemy than it does for your security.
But acoustic side channels are known not to be the only ones. Even theoretically secure One Time Pad/Tape systems are practically insecure when implemented in machine form. The UK high-level super-encipherment machine known as Rockex, used by the Diplomatic Wireless Service (DWS) and designed by Canadian engineer "Pat" Bailey, suffered from this, as I mentioned years ago on this blog. In essence, the Pad/Tape "additive" was applied in a circuit using Post Office Type 600 relays. Even though the open-to-close times could be adjusted, there was always a slight time asymmetry that got out onto the telephone pair used to connect to the telex network. This time asymmetry could be used to determine the "additive", and thus strip it off, leaving the plaintext...
One solution to this is to use a "shift register" or secondary relay that "reclocked" the data signal, so that the time asymmetry seen on the line was not that of the relay doing the encipherment but that of the reclocking relay. In essence, the contacts of the reclocking relay were "open" during the critical period in which the encipherment relay changed state.
Which in theory should have made it secure... But open relay contacts, like open switch contacts, can be "jumped", because in reality they are small-value capacitors. This is what the "infinity device" was all about. It enabled you to put a high-frequency signal on the telephone pair that would see the encryption relay change state through the open contacts of the reclocking relay... So you needed to add extra circuitry to prevent the time-based side channel from the encryption relay being seen on the line. Thus leaving out that extra circuitry made a very secure system nearly totally insecure to anyone with the appropriate device in line, yet it retained total data-level compatibility with its secure counterparts, so again no "red flag" waved.
@Erwin,
"Schweizer Allzweck-Taschenmesser" = The Swiss All-Purpose Pocket-Knife.
I can't stop laughing. Does anyone else get the irony?
The Swiss cryptography firm Crypto AG sold equipment to governments and militaries around the world for decades after World War II. They were owned by the CIA:
But what none of its customers ever knew was that Crypto AG was secretly owned by the CIA in a highly classified partnership with West German intelligence. These spy agencies rigged the company's devices so they could easily break the codes that countries used to send encrypted messages.
This isn't really news. We have long known that Crypto AG was backdooring crypto equipment for the Americans. What is new is the formerly classified documents describing the details:
The decades-long arrangement, among the most closely guarded secrets of the Cold War, is laid bare in a classified, comprehensive CIA history of the operation obtained by The Washington Post and ZDF, a German public broadcaster, in a joint reporting project.
The account identifies the CIA officers who ran the program and the company executives entrusted to execute it. It traces the origin of the venture as well as the internal conflicts that nearly derailed it. It describes how the United States and its allies exploited other nations' gullibility for years, taking their money and stealing their secrets.
The operation, known first by the code name "Thesaurus" and later "Rubicon," ranks among the most audacious in CIA history.
Jim Sanborn, who designed the Kryptos sculpture in a CIA courtyard, has released another clue to the still-unsolved part 4. I think he's getting tired of waiting.
Did we mention Mr. Sanborn is 74?
Holding on to one of the world's most enticing secrets can be stressful. Some would-be codebreakers have appeared at his home.
Many felt they had solved the puzzle, and wanted to check with Mr. Sanborn. Sometimes forcefully. Sometimes, in person.
Mr. Sanborn has set up systems to allow people to check their proposed solutions without having to contact him directly. The most recent incarnation is an email-based process with a fee of $50 to submit a potential solution. He receives regular inquiries, so far none of them successful.
The ongoing process is exhausting, he said, adding "It's not something I thought I would be doing 30 years on."
Attack demoed less than 24 hours after disclosure of a bug breaking certificate validation.
Yesterday's Microsoft Windows patches included a fix for a critical vulnerability in the system's crypto library.
A spoofing vulnerability exists in the way Windows CryptoAPI (Crypt32.dll) validates Elliptic Curve Cryptography (ECC) certificates.
An attacker could exploit the vulnerability by using a spoofed code-signing certificate to sign a malicious executable, making it appear the file was from a trusted, legitimate source. The user would have no way of knowing the file was malicious, because the digital signature would appear to be from a trusted provider.
A successful exploit could also allow the attacker to conduct man-in-the-middle attacks and decrypt confidential information on user connections to the affected software.
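Public analyses of this bug (CVE-2020-0601, "CurveBall") found that the validation matched certificates on the public key alone, without checking that the ECC curve parameters carried in the certificate, including the generator point, were the standard ones. Here's a toy illustration of why that matters, on a deliberately tiny curve. This is my own sketch, not exploit code; real certificates use curves like P-256:

```python
# Toy curve y^2 = x^3 + 2x + 2 over F_17 (a common textbook curve),
# just to show the parameter-substitution trick.
P, A = 17, 2

def ec_add(p1, p2):
    """Point addition on the toy curve (None = point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

G = (5, 1)                        # standard generator everyone agrees on
victim_d = 7                      # victim's secret key
victim_Q = ec_mul(victim_d, G)    # victim's public key

# Attacker: pick d' = 1 and a *custom* generator G' = Q. Then
# d' * G' = Q, so the attacker's public key is identical to the
# victim's -- and the attacker knows the matching private key.
evil_G = victim_Q
evil_d = 1
assert ec_mul(evil_d, evil_G) == victim_Q

# A verifier that compares only public keys, and not the curve
# parameters in the certificate, is fooled. Checking the generator
# against the standard one catches the forgery:
print(evil_G == G)  # False: the substituted generator gives it away
```

The fix in the patch amounts to doing that last comparison: reject certificates whose curve parameters don't match the expected named curve.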
That's really bad, and you should all patch your system right now, before you finish reading this blog post.
This is a zero-day vulnerability, meaning that it was not detected in the wild before the patch was released. It was discovered by security researchers. Interestingly, it was discovered by NSA security researchers, and the NSA security advisory gives a lot more information about it than the Microsoft advisory does.
Exploitation of the vulnerability allows attackers to defeat trusted network connections and deliver executable code while appearing as legitimately trusted entities. Examples where validation of trust may be impacted include:
- HTTPS connections
- Signed files and emails
- Signed executable code launched as user-mode processes
Early yesterday morning, NSA's Cybersecurity Directorate head Anne Neuberger hosted a media call where she talked about the vulnerability and -- to my shock -- took questions from the attendees. According to her, the NSA discovered this vulnerability as part of its security research. (If it found it in some other nation's cyberweapons stash -- my personal favorite theory -- she declined to say.) She did not answer when asked how long ago the NSA discovered the vulnerability. She said that this is not the first time it sent Microsoft a vulnerability to fix, but it was the first time it has publicly taken credit for the discovery. The reason is that it is trying to rebuild trust with the security community, and this disclosure is a result of its new initiative to share findings more quickly and more often.
Barring any other information, I would take the NSA at its word here. So, good for it.
None of us who favor strong encryption is saying that child exploitation isn't a serious crime, or a worldwide problem. We're not saying that about kidnapping, international drug cartels, money laundering, or terrorism. We are saying three things. One, that strong encryption is necessary for personal and national security. Two, that weakening encryption does more harm than good. And three, law enforcement has other avenues for criminal investigation than eavesdropping on communications and stored devices. This is one example, where people unraveled a dark-web website and arrested hundreds by analyzing Bitcoin transactions. This is another, where police arrested members of a WhatsApp group.
So let's have reasoned policy debates about encryption -- debates that are informed by technology. And let's stop it with the scare stories.
EDITED TO ADD (12/13): The DoD just said that strong encryption is essential for national security.
All DoD issued unclassified mobile devices are required to be password protected using strong passwords. The Department also requires that data-in-transit, on DoD issued mobile devices, be encrypted (e.g. VPN) to protect DoD information and resources. The importance of strong encryption and VPNs for our mobile workforce is imperative. Last October, the Department outlined its layered cybersecurity approach to protect DoD information and resources, including service men and women, when using mobile communications capabilities.
[...]
As the use of mobile devices continues to expand, it is imperative that innovative security techniques, such as advanced encryption algorithms, are constantly maintained and improved to protect DoD information and resources. The Department believes maintaining a domestic climate for state of the art security and encryption is critical to the protection of our national security.
A secure pseudorandom number generator
Designed by Niels Ferguson and Bruce Schneier
About Fortuna
What's a PRNG? It's a mechanism for generating random numbers on a computer. They're called pseudorandom, because you can't get truly random numbers from a completely non-random thing like a computer. In theory, true random numbers only come from truly random sources: atmospheric noise, radioactive decay, political press announcements. If a computer generates the number, another computer can reproduce the process.
A PRNG is the unsexy part of a cryptographic system. People don't think much about them, but they're used just about everywhere in cryptography. Random numbers are used in session keys, initialization vectors, public-key generation, and many other places. If the random numbers are insecure, then the entire application is insecure. Algorithms and protocols can't cover for bad random numbers. When a couple of Berkeley students broke the security on Netscape Navigator, it was the PRNG they broke. (See attacks on PRNGs.)
Fortuna is a PRNG; it generates cryptographically secure pseudorandom numbers on a computer. It can also be used as a real random number generator, accepting random inputs from analog random sources. We wrote Fortuna because after analyzing existing PRNGs and breaking our share of them, we wanted to build something secure.
Fortuna is superior to the past ad hoc PRNGs that have been easily compromised. We are releasing Fortuna copyright-free, at no charge, in the public domain for general business use.
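To make the design concrete, here is a structural sketch of Fortuna's generator component: a keyed keystream plus the trick of rekeying after every request, so that a later compromise of the generator's state cannot reveal earlier output. For a self-contained example, SHA-256 of (key || counter) stands in for the AES-256-CTR keystream the real design specifies, and the class and method names are mine:

```python
import hashlib

class FortunaStyleGenerator:
    """Structural sketch of Fortuna's generator component only; the
    full design also specifies 32 entropy pools feeding reseeds on a
    schedule. SHA-256(key || counter) substitutes for AES-256-CTR."""

    def __init__(self):
        self.key = b"\x00" * 32   # 256-bit key; all-zero means "unseeded"
        self.counter = 0          # a 128-bit counter in the real design

    def reseed(self, entropy: bytes) -> None:
        # The new key is a hash of the old key and fresh entropy.
        self.key = hashlib.sha256(self.key + entropy).digest()
        self.counter += 1

    def _block(self) -> bytes:
        block = hashlib.sha256(self.key + self.counter.to_bytes(16, "big")).digest()
        self.counter += 1
        return block

    def random_data(self, n: int) -> bytes:
        assert self.counter != 0, "generator must be seeded before use"
        out = b""
        while len(out) < n:
            out += self._block()
        # Forward secrecy: rekey from the keystream immediately, so the
        # key that produced this output no longer exists. (Real Fortuna
        # generates two extra 16-byte AES blocks to form the new key.)
        self.key = self._block()
        return out[:n]

gen = FortunaStyleGenerator()
gen.reseed(b"entropy from OS events, interrupt timings, etc.")
print(gen.random_data(24).hex())
```

The rekey-after-request step is the part worth noticing: it is what keeps an attacker who captures the generator's state at one moment from winding it backward to recover keys handed out earlier.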