
This feels like the modern day equivalent of the secret to alchemy being found in some musty trunk in the attic. Does any of this actually advance the field?



That's a good question, and I'm not entirely sure of the answer. I'm pretty sure my uncle believed it did, but I'm not sure how in touch he was with the state of the art, as he didn't use the internet. His idea of research was photocopying articles (and whole books; in particular, I remember Vol. 2 of Knuth's TAOCP) from the British Library.

My plan is to get some of these algorithms running and compare their running times with those of the other open-source implementations available.


While perhaps not an advance, the algorithms implemented are state-of-the-art (GNFS, Pollard's p-1 method, Williams' p+1 method, a cyclotomic polynomial method). But whatever the running times, it is an astounding artifact.
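For anyone curious, the p-1 idea fits in a few lines. A toy Python sketch (my own illustration, nothing to do with the uncle's Fortran): pick a smoothness bound B, build up 2^(B!) mod n by repeated powering, and take a gcd; if n has a prime factor p with p-1 made only of small primes, the gcd reveals p.

    from math import gcd

    def pollard_p_minus_1(n, bound):
        # Toy sketch of the p-1 idea: if n has a prime factor p such that
        # p - 1 divides bound! (i.e. p - 1 is built from small primes),
        # then 2^(bound!) == 1 (mod p) by Fermat's little theorem,
        # so gcd(2^(bound!) - 1, n) picks up p.
        a = 2
        for k in range(2, bound + 1):
            a = pow(a, k, n)           # exponent grows to k! = 2 * 3 * ... * k
            d = gcd(a - 1, n)
            if 1 < d < n:              # nontrivial factor found
                return d
        return None

    # 2002 = 2 * 7 * 11 * 13 is very smooth, 2026 = 2 * 1013 is not,
    # so the factor 2003 of 2003 * 2027 falls out quickly.
    print(pollard_p_minus_1(2003 * 2027, bound=20))   # -> 2003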


There is a year or two of work on top of the documentation I have included. Also, I think it is likely that there are some incremental improvements to the existing algorithms that my uncle put in place without necessarily documenting. Whilst these might not be theoretically interesting, their combined effect could be significant. There may also be some new maths in there too, as he was undoubtedly a brilliant mathematician.


Do you have any kind of bio on him? Even a picture?


I'm hoping to get both of these. It needs some collaboration with my family to do it properly.


It's remarkable that we refer to algorithms invented as "late" as the '90s as state-of-the-art. But sadly, it is so.


Even more remarkable is that they build directly upon the nearly 400-year-old method of Fermat (http://en.wikipedia.org/wiki/Fermat%27s_factorization_method).
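Fermat's method itself is only a few lines: search upward from a = ceil(sqrt(n)) until a^2 - n is a perfect square b^2, and then n = (a - b)(a + b). A rough Python sketch of that idea (just an illustration, not anything from the repo):

    from math import isqrt

    def fermat_factor(n):
        # Fermat's method: find a, b with n = a*a - b*b = (a - b) * (a + b).
        # Fast when n is odd and its two factors are close to sqrt(n).
        assert n % 2 == 1, "pull out factors of 2 first"
        a = isqrt(n)
        if a * a < n:
            a += 1                     # start at ceil(sqrt(n))
        while True:
            b2 = a * a - n
            b = isqrt(b2)
            if b * b == b2:            # a*a - n is a perfect square
                return a - b, a + b
            a += 1

    print(fermat_factor(5959))         # -> (59, 101)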


Well, directly is a bit of a stretch. It's probably fairer to say they're directly based on the Powers-Lehmer-Morrison-Brillhart method (also Kraitchik, for the discrete log), which introduced the concept of factor bases and of combining relations to find a difference of squares.
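The one step all of these keep from Fermat is the last one: once the sieving and linear algebra produce x and y with x^2 ≡ y^2 (mod n) and x not ≡ ±y, a single gcd splits n. A toy Python illustration of just that step (mine, not the repo's):

    from math import gcd

    def split_from_squares(x, y, n):
        # Given x^2 == y^2 (mod n) with x not == +/- y (mod n),
        # gcd(x - y, n) is a proper factor of n.  CFRAC, the quadratic
        # sieve and NFS spend all their effort (factor bases, combining
        # relations) just manufacturing such a pair (x, y).
        assert (x * x - y * y) % n == 0
        d = gcd(x - y, n)
        return d if 1 < d < n else None   # None: the congruence was trivial

    # 10^2 == 3^2 == 9 (mod 91), and 10 is not +/- 3 mod 91, so:
    print(split_from_squares(10, 3, 91))  # -> 7, and 91 = 7 * 13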


The officially published ones. It remains an open question what the NSA knows.

I was going to back up the idea that the NSA might potentially be ahead of the public community of number theorists and cryptographers by citing the relative sizes of the two groups. Unfortunately, those numbers are classified. :(


The NSA recently stated that they are only a couple of years ahead, and by that I take it they mean their computers are bigger. Recall that they are a military outfit; only a small part of what they do is crypto-related. Although they have 40,000 or so employees, one shouldn't read that as meaning they have 40,000 people working on breaking RSA. They do many other things.


Did you find any copies of a data input file called recl.dat, used by (at least some of) the Fortran programs?

EDIT: I am so intrigued by this that I asked the question above without saying thank you for releasing this. So thank you! I don't know much Fortran but if ever there was something to induce me to dive in, it's this. I've compiled his test program using gfortran and am looking into some of the others.


Yes, it seems to contain a lot of prime numbers (36,500 to be precise). I haven't uploaded any data files since their total size is 3 GB.


Can you make a torrent of the data files and put the .torrent file on GitHub?


I've collected what I think are the most important files. They compress very well because they seem to be stored very inefficiently (lots of repetition). The compressed collection is now on GitHub.



