A bounty of $12,288 has been announced for the first person to crack the NIST elliptic curve seeds and discover the original phrases that were hashed to generate them.
A fun, free platform for learning modern cryptography
Learn about modern cryptography by solving a series of interactive puzzles and challenges. Get to know the ciphers and protocols that secure the digital world by breaking them.
You play an Uplink Agent who makes a living by performing jobs for major corporations. Your tasks involve hacking into rival computer systems, stealing research data, sabotaging other companies, laundering money, erasing evidence, or framing innocent people. You use the money you earn to upgrade...
Zero knowledge refers to policies and architecture that eliminate the possibility of a password manager accessing your passwords.
Seems that there is a deliberate backdoor in the twenty-year-old TErrestrial Trunked RAdio (TETRA) standard used by police forces around the world. //
Looks like the encryption algorithm was intentionally weakened by intelligence agencies to facilitate easy eavesdropping. //
And I would like to point out that that’s the very definition of a backdoor.
Why aren’t we done with secret, proprietary cryptography? It’s just not a good idea. //
Clive Robinson • July 26, 2023 11:51 AM
@ Bruce, ALL,
Re : It started in WWII.
“Why aren’t we done with secret, proprietary cryptography? It’s just not a good idea.”
Remember, this actually goes back well into the last century; that is, it's more than 20 years old.
johnwalker
When I wrote “The Digital Imprimatur” almost twenty years ago (published on 2003-09-13), I was motivated by the push for mandated digital rights management with hardware enforcement, attacks on anonymity on the Internet, the ability to track individuals’ use of the Internet, and mandated back-doors that defeated encryption and other means of preserving privacy against government and corporate surveillance. //
This time it's called "Web Environment Integrity" (WEI), and it comes, not from Microsoft but from the company that traded in their original slogan of "Don't be evil" for "What the Hell, evil pays a lot better!"—Google.
So, what is WEI? Let’s start with a popular overview from Ars Technica.
The cryptomine's operator was likely motivated to maximize cryptocurrency gains by negating the cost of operating the mine. Cryptomines notoriously run off an excess of electricity, with all the world's cryptomines requiring more energy than the entire country of Australia, the White House reported last year. In the Boston area, where Cohasset is located, "electricity costs have exceeded the national average" by at least 48 percent over the past five years, the US Bureau of Labor Statistics reported.
In summary: Secure messaging wasn't pervasive at all, and the existing options were either overly technical, had a bad user experience or never made it out of beta. That's why Manuel wrote the first version of Threema for himself and his friends, and released it for iOS in December 2012. //
On the protocol side: In 2012, TLS was in a bit of a bad state. Mobile operating systems commonly offered no modern ciphersuites at all and were sometimes plagued with bad random number generators. (For example: Android 4.0.4 didn't even support TLS 1.1 yet.)
Last August, LastPass reported a security breach, saying that no customer information—or passwords—were compromised. Turns out the full story is worse: //
To date, we have determined that once the cloud storage access key and dual storage container decryption keys were obtained, the threat actor copied information from backup that contained basic customer account information and related metadata including company names, end-user names, billing addresses, email addresses, telephone numbers, and the IP addresses from which customers were accessing the LastPass service.
The threat actor was also able to copy a backup of customer vault data from the encrypted storage container which is stored in a proprietary binary format that contains both unencrypted data, such as website URLs, as well as fully-encrypted sensitive fields such as website usernames and passwords, secure notes, and form-filled data.
That’s bad. It’s not an epic disaster, though.
These encrypted fields remain secured with 256-bit AES encryption and can only be decrypted with a unique encryption key derived from each user’s master password using our Zero Knowledge architecture. As a reminder, the master password is never known to LastPass and is not stored or maintained by LastPass. //
John Thurston • December 26, 2022 1:31 PM
“I think the question of why everything in the credentials store was not encrypted is interesting. What possible advantage is there of not just encrypting the whole thing under your master password.”
Because this is how Lastpass is able to offer to supply uid:pwd values when you have not unlocked your vault. If this information was kept encrypted, then the browser extensions would not know when to prompt you to unlock to supply the creds.
I’ve never liked this ‘feature’, but there’s nothing I can do about it. //
Wladimir Palant • December 27, 2022 6:56 AM
It would have been less problematic had LastPass not messed up. They:
- Failed to upgrade many accounts from 5,000 to 100,100 iterations.
- Didn’t keep up with cracking hardware improvements (100k iterations are really on the lower end today).
- Didn't bother enforcing their new password complexity rules for existing accounts.
- Didn’t bother encrypting URLs despite being warned about it continuously, allowing attackers to determine which accounts are worth the effort to decrypt.
Their statement is misleading, they downplay the issues. I’ve summed it up on my blog here: https://palant.info/2022/12/26/whats-in-a-pr-statement-lastpass-breach-explained/ //
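For a sense of what those iteration counts mean: LastPass derives the vault key with PBKDF2-SHA256, and each iteration multiplies an offline cracker's per-guess cost (the password and salt below are illustrative, not LastPass's actual format):

```python
import hashlib

# Same password and salt, the two iteration counts from the list above.
# The 100,100-iteration derivation costs roughly 20x more per cracking
# attempt, and the two outputs are unrelated 32-byte keys.
password = b"correct horse battery staple"   # illustrative master password
salt = b"user@example.com"                   # illustrative salt

weak = hashlib.pbkdf2_hmac("sha256", password, salt, 5_000)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 100_100)
```

An attacker who stole the encrypted vaults must repeat the full derivation for every password guess, which is why a low iteration count directly weakens every account that was never upgraded.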
Diaconis says that he will have "seven shuffles suffice" carved on his tombstone. He is referring to his most famous realisation: that it takes seven "riffle shuffles" to sufficiently randomise a deck of cards. The riffle shuffle is the familiar technique, used by casinos and serious card players, in which the deck is cut in two and then thumbed together with a satisfying zip, often ending with a bridge finish that gathers the cards together into a neat pile.
The riffle shuffle is the unruly twin of the perfect shuffle. Instead of perfectly interleaving the two halves of the decks, the halves are mixed together in disorderly clumps, planting a seed of randomness that progressively mixes the cards with each shuffle.
After one or two riffle shuffles, some cards will remain in their original sequence. Even after four or five shuffles – far more than most casinos typically use – the deck will retain some trace of order. But once you shuffle the deck seven times, the cards become truly mixed, at least as far as most statistical tests can prove. Beyond that point, further mixing will not do much. "It's just as close to random as can be," Diaconis says.
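The riffle shuffle Diaconis analysed is usually formalised as the Gilbert-Shannon-Reeds model: cut the deck binomially, then drop cards from whichever half is larger with proportional probability. A minimal simulation (a sketch; the actual "seven shuffles" result measures total variation distance, which is not computed here):

```python
import random

def riffle(deck, rng):
    """One GSR riffle shuffle: binomial cut, then interleave in clumps."""
    cut = sum(rng.random() < 0.5 for _ in deck)   # binomial cut point
    left, right = deck[:cut], deck[cut:]
    out = []
    while left or right:
        # Drop from a half with probability proportional to its size.
        if right and rng.random() >= len(left) / (len(left) + len(right)):
            out.append(right.pop(0))
        else:
            out.append(left.pop(0))
    return out

rng = random.Random(7)
deck = list(range(52))
for _ in range(7):          # Diaconis's magic number
    deck = riffle(deck, rng)
```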
Time Stamp Authority
freeTSA.org provides a free Time Stamp Authority. Adding a trusted timestamp to code or to an electronic signature provides a digital seal of data integrity and a trusted date and time of when the transaction took place.
IdenTrust Timestamping Authority Server is a service that binds the digital certificate used to sign a digital file to the data being signed, creating a unique sequence of characters or encoded information known as a hash, and also identifies when a certain event occurred. The result is a trusted, accurate digital date and time stamp seal embedded within the digital file that contains X.509 digital signatures. Any change in the timestamped file will break the timestamp seal, alerting the user that the file is no longer in its original state. //
Users who wish to add timestamping to PDF files that are signed with IdenTrust personal or business digital certificates must add 'http://timestamp.identrust.com' as the Timestamping Server to their local Adobe® Acrobat or Adobe® Reader configuration. Our article How to Add IdenTrust Timestamping Authority Server to Adobe will guide you through this process.
Users who wish to add IdenTrust Timestamping Authority Server to Microsoft® MS-Office® digitally signed documents can do it following the instructions in our article Apply IdenTrust Timestamping Authority to Microsoft Office Digitally Signed Documents.
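Stripped of the token format and the TSA's signature, the integrity property both services rely on is just a hash comparison (a sketch of the principle, not IdenTrust's or freeTSA's actual verification code):

```python
import hashlib

# An RFC 3161 timestamp token embeds a hash of the file as it existed at
# signing time, plus a signed time value. Verification rehashes the
# current file and compares. (Signature checking omitted here.)
original = b"contract, version as timestamped"
digest_at_signing = hashlib.sha256(original).hexdigest()

# Later: the file has been quietly edited.
tampered = b"contract, version quietly edited"
seal_intact = hashlib.sha256(tampered).hexdigest() == digest_at_signing
```

Any single-bit change in the file changes the hash, so `seal_intact` is `False` and the timestamp no longer vouches for the modified content.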
Clive Robinson • September 15, 2022 3:03 AM
@ Winter,
Re : Blockchain efficiency
“Blockchains are transparent, robust, and fast.”
No they are not.
To be transparent they need to be “public” and few people actually want every financial move they make being made naked and open to all.
Whilst they look like they are robust, they are not, either as data structures or as systems. The only robustness they bring to the table over existing systems is the public duplication. Which is problematical, as who is going to pay for infrastructure many times that of Google's current setup just to implement one such? Remember you would need a minimum of four such systems[1] and all the high-security communications to support them, which would make the NSA envious.
As for fast, the current systems, due to the moronic "Proof of XXX" attached, are so slow that transactions run at best at just a handful a second. Even without the "Proof of XXX", the number of global transactions at any one time runs up in the tens of millions a second, something most do not realise.
But people do not appreciate the combined,
- Gate Keeper Effect
- Ripple across Effect.
This will create a significant time delay, which has consequences: high-speed transactions can be done and completed long before the blockchain gets updated, thus "High Frequency Fraud" will be a result. This will require "back-out" mechanisms that don't exist, because they destroy the blockchain security model.
Then of course any system with locked in time delay and capacity issues, is a “Sitting Duck” target for extortion by “Denial of Service”(DoS) attack.
To be honest I’m surprised there has been no real concerted effort to Ransom one of the crypto-coin blockchains by a DDoS…
As has been pointed out, the idea of a global blockchain is a "Crypto-Anarchist's" dream and everyone else's very real nightmare.
Because like it or not, it will become not some kind of libertarian freedom, but a tool of near total oppression as it will have all the failings of hierarchical systems[2] that certain entities will lust after to control. We actually see this with blockchain gate keepers already.
[1] There is a problem with blockchains in that if someone gets more than 50% control they can "own it". This means you need three at all times sharing effectively equitably. Add in the fact that "at all times" needs 100% availability, and that no single system has 100% reliability, and you need an absolute minimum of four, preferably more.
[2] Mankind has known many of the failings of single and hierarchical systems for as long as there has been any kind of social structure. War is just one obvious side effect, slavery or forced servitude yet another; the list of hierarchical system failings is both long and grievous. For centuries at the very least, people have sought out ways to robustly maintain the desirable effects of social cohesion yet get rid of hierarchies, or at least their many undesirable side effects, and so far the failure to do this is effectively 100%…
At a high level of abstraction, here's how any blockchain works: Someone on the network proposes a block containing a list of recent transactions. Then other network participants verify that the block follows the network's rules. If a sufficient number of other network participants accept the block, it becomes the "official" next block in the chain. As long as most network participants are honest, users can have confidence that transactions accepted by a majority of the network won't be removed or modified later.
The big challenge for any blockchain project is preventing a malicious party from creating many sock puppet accounts to "stuff the ballot box," outvote the honest participants and thereby tamper with past transactions. Bitcoin's pseudonymous founder Satoshi Nakamoto's big insight—the one that made bitcoin possible—was that this problem could be solved using the principle of "one hash, one vote." On the bitcoin network, whoever has the most computing power—specifically, the capacity to compute SHA-256 hashes—has the most influence over which blocks get added to the blockchain. As long as honest miners have more hash power than malicious miners, users can be confident in the integrity of the blockchain—and hence in the integrity of payments made using the bitcoin network. (Check out our in-depth bitcoin explainer for details on how this works.)
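The "one hash, one vote" lottery can be sketched in a few lines (a toy model; difficulty and encoding here are illustrative, not Bitcoin's actual block format):

```python
import hashlib

# Toy proof-of-work: find a nonce whose SHA-256 hash falls below a target.
# Lowering the target (raising DIFFICULTY_BITS) means more hashes per
# valid block, so influence is proportional to hash power.
DIFFICULTY_BITS = 16
TARGET = 1 << (256 - DIFFICULTY_BITS)

def mine(block_data: bytes) -> int:
    """Return a nonce such that SHA-256(block_data + nonce) < TARGET."""
    nonce = 0
    while True:
        h = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < TARGET:
            return nonce
        nonce += 1

nonce = mine(b"block: Alice pays Bob 1 coin")
```

Verifying the result takes one hash, while finding it takes on average 2^16 hashes at this toy difficulty; that asymmetry is what makes ballot-stuffing expensive.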
When Vitalik Buterin launched Ethereum in 2015, he used a variant of Nakamoto's scheme. By that point, bitcoin mining was already dominated by specialized silicon optimized for computing huge numbers of SHA-256 hashes, locking ordinary bitcoiners out of the mining game. So Buterin developed a new mining algorithm designed to be "memory-hard"—and therefore difficult to accelerate with custom hardware. As a result, Ethereum mining is still largely performed using off-the-shelf graphics cards, allowing ordinary Ethereum users to participate. //
Buterin has long recognized the environmental downsides of proof-of-work mining. Several years ago, he announced plans to transition Ethereum to proof-of-stake, which has been pioneered by several lesser-known cryptocurrencies.
While proof-of-work mining operates on the principle of "one hash, one vote," proof-of-stake is based on "one coin, one vote." Anyone who wants to participate in Ethereum's validation process must post ether as collateral, a process known as "staking." The more ether someone stakes, the more influence they have over which blocks get added to the Ethereum blockchain.
Every 12 seconds, a pseudorandom number generator selects a subset of stakers to form a committee to decide on the next block. One of them is designated to propose the next block, while the rest, called validators, verify that the new block follows all the rules of the Ethereum network. For example, if the block contains a payment transaction, the validators check that the source address has the required funds, that the transaction has the correct digital signatures, and so forth. If two-thirds of validators approve a block, it becomes part of the official blockchain.
Validators that faithfully follow these rules earn additional ether as a reward for their efforts, with the size of their reward proportional to the ether they've staked. On the other hand, if a validator tries to cheat—for example, by validating two different, incompatible blocks for the same blockchain "slot"—they will face financial penalties. If another validator posts evidence of such a cheating attempt, some of the cheater's collateral will be destroyed ("slashed," in Ethereum jargon), and the whistleblower will get a reward.
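The "one coin, one vote" selection described above can be sketched as a stake-weighted lottery (illustrative only: real Ethereum uses fixed 32-ETH validators and RANDAO-derived randomness, not Python's PRNG, and the names here are made up):

```python
import random

# Stake amounts are illustrative.
stakes = {"alice": 96.0, "bob": 32.0, "carol": 32.0}

def pick_proposer(stakes: dict, seed: int) -> str:
    """Pick a block proposer with probability proportional to stake."""
    rng = random.Random(seed)   # stand-in for the shared beacon randomness
    names = sorted(stakes)
    weights = [stakes[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

proposer = pick_proposer(stakes, seed=0)
```

Because every node seeds the same pseudorandom choice from shared chain data, all honest participants agree on who was entitled to propose each slot.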
The moral is the need for cryptographic agility. It’s not enough to implement a single standard; it’s vital that our systems be able to easily swap in new algorithms when required. We’ve learned the hard way how algorithms can get so entrenched in systems that it can take many years to update them: in the transition from DES to AES, and the transition from MD4 and MD5 to SHA, SHA-1, and then SHA-3.
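One common shape for that agility is an algorithm registry with tagged outputs (my own sketch, not any particular library's API; the names are illustrative):

```python
import hashlib

# Callers name the algorithm; outputs carry that name; retiring a broken
# primitive means changing one registry entry, not every call site.
HASHES = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,
}
CURRENT = "sha3_256"   # swap here when an algorithm must be retired

def tagged_digest(data: bytes, alg: str = CURRENT) -> str:
    return f"{alg}:{HASHES[alg](data).hexdigest()}"
```

Tagging each output with its algorithm name is what lets records made under the old default still be verified after the default changes.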
We need to do better. In the coming years we’ll be facing a double uncertainty. The first is quantum computing. When and if quantum computing becomes a practical reality, we will learn a lot about its strengths and limitations. It took a couple of decades to fully understand von Neumann computer architecture; expect the same learning curve with quantum computing. Our current understanding of quantum computing architecture will change, and that could easily result in new cryptanalytic techniques.
The second uncertainty is in the algorithms themselves. As the new cryptanalytic results demonstrate, we’re still learning a lot about how to turn hard mathematical problems into public-key cryptosystems. We have too much math and an inability to add more muddle, and that results in algorithms that are vulnerable to advances in mathematics. More cryptanalytic results are coming, and more algorithms are going to be broken.
We can’t stop the development of quantum computing. Maybe the engineering challenges will turn out to be impossible, but it’s not the way to bet. In the face of all that uncertainty, agility is the only way to maintain security.
SIKE Broken
SIKE is one of the new algorithms that NIST recently added to the post-quantum cryptography competition.
It was just broken, really badly.
We present an efficient key recovery attack on the Supersingular Isogeny Diffie-Hellman protocol (SIDH), based on a “glue-and-split” theorem due to Kani. Our attack exploits the existence of a small non-scalar endomorphism on the starting curve, and it also relies on the auxiliary torsion point information that Alice and Bob share during the protocol. Our Magma implementation breaks the instantiation SIKEp434, which aims at security level 1 of the Post-Quantum Cryptography standardization process currently run by NIST, in about one hour on a single core.
Clive Robinson • August 4, 2022 10:27 AM
@ Peter Galbavy, ALL,
“while I haven’t the first clue about the underlying math it read to me like someone who’s installed an amazingly fancy and highly secure”
The fundamental protocol for SIKE is “Supersingular Isogeny Diffie-Hellman”(SIDH). Which is a tad difficult to fully explain, even by mathematicians…
The problem is that even mathematicians can be unaware of the more rarefied parts of their art…
So it turns out it was not “secure”, and it had been known by some mathematicians how to attack it since the late 1990s…
Which is a decade or two and some before anyone decided to use SIKE as a one-way function for crypto. Which kind of makes it really “face palm” embarrassing… //
Clive Robinson • August 4, 2022 5:09 PM
@ SpaceLifeForm, ALL,
Re : Move to ECC with a SafeCurve.
Err, the maths behind this attack on SIKE is also being looked at for breaking ECC (read the Ars article for more details).
So far the search for an ECC attack of use is “ongoing”. This attack on SIKE is almost certainly going to renew attempts to break ECC. Doing so would give the successful person(s) a “Golden Ticket” C.V.[1]
Let’s put it this way: I suspect ECC now has a very short shelf life… Maybe half a decade at most would be my advice to the cautious. So looking for a replacement should begin right away.
It could be a heck of a sight less, as there is a chance a successful attack may already have been discovered but not yet recognised. So it might come fast, very fast.
Hence my earlier comments about thinking on how to replace the current asymmetric “Key Exchange” and “Signing” systems.
Because if we lose them before we replace them with a non-QC algorithm, then it’s going to be brutal very fast…
Think no secure online,
1, Banking / Finance.
2, Online Shopping/Commerce.
3, Software Patching.
4, Secure Communications.
5, Privacy.
You might remember I’ve been talking off and on about replacing privacy-wrecking CA hierarchies for years, as well as private/secure “Rendezvous Protocols”. Because it’s been kind of obvious that Asymmetric Crypto has a serious flaw in the “Secure Trap Door” assumption, which was always weak. Worse, the statements of David Deutsch[2] in the mid 1980s about “Quantum Computing”(QC), and his proof that it was going to be significantly more capable than “Classical Computing”(CC), were a large “Red Flag”…
Even our host @Bruce Schneier some time ago (around the time the AES competition ended) made the point that we really needed to stop playing with crypto algorithms and get on with the far harder task of “Key Management”(KeyMan). Which even today is mostly not done (seen by some as either impossible or a career killer).
The best we have for private/secure key transfer in emergencies is the provably secure yet both fragile and awkward-to-scale “One Time Pad”.
Every system so far thought up that does not use asymmetric crypto needs a “Secure Side Channel” to, at the very minimum, set up a “Root of Trust”.
A two-party “Root of Trust” transfer without OWFs with Secure Trap Door functions is currently assumed not to scale[3]. The alternative three-or-more-party systems are provably not secure under the usual assumptions (it’s why we spend time talking about “End To End Encryption”(E2EE)).
[1] As some know, the sort of mathematician who is most likely to do this is of PhD research age, so up to their middle-to-late 30s. So a “Golden Ticket” is their most likely route to “secure academic employment” for the rest of their life.
[2] Basic bio of David Deutsch,
https://quantumzeitgeist.com/david-deutsch-the-father-of-quantum-computing-but-who-is-he/
[3] I’m of a different opinion. Think of three impartial entities that each randomly select some “Number Used Once”[nonce] and send them to the two parties that wish to communicate. The two parties then use those points as being on the circumference of a circle, and use the radius or centre they calculate as a symmetric key. This shows it is possible to have a system where none of the impartial third parties can know what the key is. But also that the scaling problem can be significantly reduced, to the point where an OTP can be used by individuals to the impartial third parties.
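One illustrative reading of that construction (the mapping of nonces to plane points and the key derivation are my assumptions, not part of the comment): each impartial party contributes one point, no single party sees the other two, and both communicating parties recover the same circle centre as shared key material:

```python
# Circumcentre of three non-collinear points via perpendicular bisectors.
# Each point would come from one impartial third party; knowing only your
# own point tells you nothing about the resulting centre.
def circumcentre(p1, p2, p3):
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy

# Three nonce points on the unit circle: centre comes out at the origin.
centre = circumcentre((0.0, 1.0), (1.0, 0.0), (-1.0, 0.0))
```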
PaperBack is a free application that allows you to back up your precious files on the ordinary paper in the form of the oversized bitmaps. If you have a good laser printer with the 600 dpi resolution, you can save up to 500,000 bytes of uncompressed data on the single A4/Letter sheet. Integrated packer allows for much better data density - up to 3,000,000+ (three megabytes) of C code per page.
You may ask - why? Why, for heaven's sake, do I need to make paper backups, if there are so many alternative possibilities like CD-R's, DVD±R's, memory sticks, flash cards, hard disks, streamer tapes, ZIP drives, network storages, magnetooptical cartridges, and even 8-inch double-sided floppy disks formatted for DEC PDP-11? (I still have some). The answer is simple: you don't. However, by looking on CD or magnetic tape, you are not able to tell whether your data is readable or not. You must insert your medium into the drive (if you have one!) and try to read it.
Paper is different. Do you remember the punched cards? EBCDIC and all this stuff. For years, cards were the main storage medium for the source code. I agree that 100K+ programs were... unhandy, but hey, only real programmers dared to write applications of this size. And used cards were good as notepads, too. Punched tapes were also common. And even the most weird codings, like CDC or EBCDIC, were readable by humans (I mean, by real programmers).
Of course, bitmaps produced by PaperBack are also human-readable (with the small help of any decent microscope). I'm joking. What you need is a scanner attached to a PC. The actual version is for Windows only, but it's free and open source, and there is nothing that prevents you from porting PaperBack to Linux or Mac, and the chances are good that it still will work under Windows XXXP or Trillenium Edition. And, of course, you can mail your printouts to the recipients anywhere in the world, even if they have no Internet access or live in the countries where such access is restricted by the regime.
Oh yes, a scanner. For 600 dpi printer you will need a scanner with at least 900 dpi physical (let me emphasize, physical, not interpolated) resolution.
Have I already mentioned that PaperBack is free? I release it under the GNU General Public License, version 3. This means that you pay nothing for the program, that the sources are freely available, and that you are allowed - in fact, encouraged - to modify and improve this application.
In the not-too-distant future—as little as a decade, perhaps, nobody knows exactly how long—the cryptography protecting your bank transactions, chat messages, and medical records from prying eyes is going to break spectacularly with the advent of quantum computing. On Tuesday, a US government agency named four replacement encryption schemes to head off this cryptopocalypse.
Some of the most widely used public-key encryption systems—including those using the RSA, Diffie-Hellman, and elliptic curve Diffie-Hellman algorithms—rely on mathematics to protect sensitive data. These mathematical problems include (1) factoring a key's large composite number (usually denoted as N) to derive its two factors (usually denoted as P and Q) and (2) computing the discrete logarithm that key is based on.
The security of these cryptosystems depends entirely on how difficult it is for classical computers to solve these problems. While it's easy to generate keys that can encrypt and decrypt data at will, it's impossible from a practical standpoint for an adversary to calculate the numbers that make them work.
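A toy version of problem (1) makes the asymmetry concrete (tiny primes for illustration; real moduli are hundreds of digits long):

```python
from math import isqrt

# Multiplying P and Q is instant; recovering them from N by search is not.
# At 2048-bit sizes, even algorithms vastly better than this trial
# division (such as the Number Field Sieve) remain classically infeasible,
# and that gap is the entire security assumption.
def factor(n):
    """Return a nontrivial factorisation of n by trial division."""
    for c in range(2, isqrt(n) + 1):
        if n % c == 0:
            return c, n // c

p, q = 1009, 1013          # illustrative "P" and "Q"
n = p * q                  # the public modulus "N"
```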
In 2019, a team of researchers factored a 795-bit RSA key, making it the biggest key size ever to be solved. The same team also computed a discrete logarithm of a different key of the same size.
The researchers estimated that the sum of the computation time for both of the new records was about 4,000 core-years using Intel Xeon Gold 6130 CPUs (running at 2.1 GHz). Like previous records, these were accomplished using a complex algorithm called the Number Field Sieve, which can be used to perform both integer factoring and finite field discrete logarithms.
Quantum computing is still in the experimental phase, but the results have already made it clear it can solve the same mathematical problems instantaneously. Increasing the size of the keys won't help, either, since Shor's algorithm, a quantum-computing technique developed in 1994 by American mathematician Peter Shor, works orders of magnitude faster in solving integer factorization and discrete logarithmic problems. //
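The reason Shor's algorithm is so devastating can be sketched classically: the only hard step, finding the period r of a^x mod N, is exactly what the quantum part accelerates. Below, the period is found by brute force on tiny numbers; Shor's insight is that a quantum computer finds r exponentially faster, after which factoring reduces to two gcd computations:

```python
from math import gcd

N, a = 21, 2            # tiny illustrative modulus and base
r = 1
while pow(a, r, N) != 1:
    r += 1              # brute-force period finding; the quantum step

# When r is even and a**(r//2) is not -1 mod N, the reduction applies:
half = pow(a, r // 2)
f1 = gcd(half - 1, N)   # a nontrivial factor of N
f2 = gcd(half + 1, N)   # the other factor
```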
Researchers have long known that these algorithms are vulnerable and have been cautioning the world to prepare for the day when all data that has been encrypted using them can be unscrambled. Chief among the proponents is the US Department of Commerce's National Institute of Standards and Technology (NIST), which is leading a drive for post-quantum cryptography (PQC).
On Tuesday, NIST said it selected four candidate PQC algorithms to replace those that are expected to be felled by quantum computing. They are: CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON, and SPHINCS+. //
pabs
plectrum wrote:
jhodge wrote:
Is this intended to displace AES? AES is hardware accelerated pretty much everywhere, so replacing it will mean a whole lot of hardware is about to become obsolete. Not right away, but possibly by enough to shorten the lifecycle of equipment being sold now unless the transition period is long.

AES is symmetric crypto. These algorithms relate to public key aka asymmetric crypto. AES is not affected by factorisation or discrete logs being broken.
To elaborate on this comment a bit more. Symmetric encryption and cryptographic hashes will also be affected by quantum computers (see Grover's algorithm), but this can be generally* addressed by doubling the size of the key or digest, respectively.
The security of asymmetric encryption relies on the assumption that the discrete logarithm problem and integer factorization problem do not have polynomial-time solutions, and Shor's algorithm breaks that assumption.
* Technically there is still an issue with the birthday bound for block ciphers like AES with a small block size, but that can be addressed without requiring an entirely new approach.
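The "double the key or digest size" rule is just Grover's quadratic speedup written in the exponent (a back-of-the-envelope sketch that ignores constant factors and quantum circuit overheads):

```python
# Grover's algorithm searches an unstructured space of 2**k keys in
# roughly 2**(k/2) quantum steps. Doubling k restores the original
# classical security margin.
def grover_cost_bits(key_bits: int) -> int:
    """Approximate attack cost, in bits of work, under Grover."""
    return key_bits // 2

aes128_under_grover = grover_cost_bits(128)   # drops to ~2**64 work
aes256_under_grover = grover_cost_bits(256)   # back to ~2**128 work
```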
How Apple, Google, and Microsoft will kill passwords and phishing in one stroke
You've heard for years that easier, more secure logins are imminent. That day is here.
Let’s Encrypt is a free, automated, and open certificate authority (CA) run for the public’s benefit, and it has been a huge change for the whole industry. Now that everyone has adopted the idea of free SSL certificates, the logical next evolutionary step is at hand: managed certificates. What are the options across the major cloud providers?