Appendix A of NIST 800-63B addresses this well. Avoiding password patterns is specifically why it recommends against arbitrary password complexity requirements and arbitrary password expiration times. Passwords should only be force-expired in response to specific events (such as evidence of compromise). Timed expiration and complexity rules encourage patterns that make the password weaker.
Levenshtein distance as a proxy for password strength is extremely limited, for the reasons that schroeder has outlined. And the user experience will probably be poor.
But the question is still academically interesting, and may be useful as a component for some use cases - so it still deserves a thorough answer. :D
Generally, the minimum Levenshtein distance between a previous password and a new one should be at least as large as the minimum length of a randomly-generated password that would resist brute force (adjusted to the attacker's threat model). But it's important to note that even this minimum is often inadequate.
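As a rough sketch of what such a check might look like in Python (the 13-character threshold here is an assumption borrowed from the worst-case figure further down; tune it to your own threat model):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

# Assumed threshold: the length of a randomly generated password
# that would itself resist offline brute force.
MIN_DISTANCE = 13

def change_is_large_enough(old: str, new: str) -> bool:
    """Reject password changes that differ too little from the old password."""
    return levenshtein(old, new) >= MIN_DISTANCE
```

Note that this check only ever makes sense as a rejection criterion; passing it says nothing about the new password's actual strength, for the reasons below.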
Best case: the passwords are entirely randomly generated, both initially and later. In this case, Levenshtein distance isn't as relevant (or rather, measuring the Levenshtein distance between two randomly generated passwords is just a proxy for measuring the quality of their randomness).
If we assume all worst cases (human-generated password, poor password-selection strategy, etc.), the answer should be clear: the amount of change in the password should itself be enough to withstand attack on its own, totally independent of the previous password (as if the first password were empty), and as if it were stored using a very fast hash like MD5. This reduces to how quickly we can reasonably expect a randomly generated password's keyspace to be exhausted for a given threat model, which is covered well elsewhere (but often hovers around 13-15 random characters).
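To illustrate the arithmetic behind that figure, here is a back-of-the-envelope sketch (the guess rate is an assumed value for an offline attacker against a fast hash, not a measured benchmark, and real attack rigs vary widely):

```python
# Rough keyspace arithmetic behind the "13-15 random characters" figure.
# Assumptions: 95 printable ASCII characters, an offline attacker
# hashing something fast like MD5 at ~1e12 guesses/second.
GUESSES_PER_SECOND = 1e12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(length: int, charset: int = 95) -> float:
    """Years to try every password of the given length at the assumed rate."""
    return charset ** length / GUESSES_PER_SECOND / SECONDS_PER_YEAR

for n in (10, 13, 15):
    print(f"{n} chars: ~{years_to_exhaust(n):.3g} years")
```

Under these assumptions, 10 random characters fall in a couple of years, while 13 already take on the order of a million years, which is why the minimum lands roughly where it does.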
In other words ... just like Shannon entropy, Levenshtein distance is really only useful as a measure of password strength in cases where the passwords are already randomly generated. And if you're doing that, you already have the information necessary to keep them strong, without trying to measure the difference between two of them.
Put yet another way ... very low Levenshtein distances are an OK proxy for password-change weakness, but high Levenshtein distances are a very poor proxy for password-change strength. So my answer tries to find a balance between the two.