Hashtopolis Forum

Full Version: Giant hashlists
I am interested in using Hashtopolis to crack the haveibeenpwned v4 hashlist. I have already recovered a large portion of it; however, about 220,729,511 hashes remain.

The MySQL server that Hashtopolis currently runs on is halfway decent but not awesome.

When I added 5,000,000 lines, the clients never received a valid hashlist, and with 1,000,000 hashes the client never got past the keyspace calculation, so nothing was attempted.

Would it be possible for me to build a mass-import tool that creates multiple smaller hashlists and inserts them directly into the database, and then manually create a superhashlist and a supertask to handle it?

I have a number of rules to run through and doing it manually is tedious.
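Roughly, I'm picturing something like the sketch below. The table and column names (Hashlist, Hash, hashlistId, hashTypeId, isCracked, and so on) are guesses from a quick look at the database, so this is an idea of the approach rather than a working implementation; everything would need to be checked against the actual schema first:

    # Sketch: split one giant hash file into several smaller hashlists
    # inserted straight into the database. Schema names below are GUESSES;
    # verify them, and note that hashCount on each Hashlist row would still
    # need updating afterwards for the web UI to show sane numbers.
    import mysql.connector

    CHUNK = 1_000_000   # hashes per sub-hashlist
    BATCH = 10_000      # rows per INSERT transaction

    conn = mysql.connector.connect(user="hashtopolis",
                                   password="change-me",
                                   database="hashtopolis")
    cur = conn.cursor()

    def new_hashlist(name):
        # hashTypeId 100 = SHA-1; other NOT NULL columns may need values too
        cur.execute("INSERT INTO Hashlist (hashlistName, hashTypeId, format, "
                    "hashCount, cracked, isSecret, accessGroupId) "
                    "VALUES (%s, 100, 0, 0, 0, 1, 1)", (name,))
        return cur.lastrowid

    def flush(batch):
        cur.executemany("INSERT INTO Hash (hashlistId, hash, isCracked) "
                        "VALUES (%s, %s, 0)", batch)
        conn.commit()
        batch.clear()

    batch, part = [], 0
    hl = new_hashlist("pwned-v4-part-0")
    with open("pwned-v4-remaining.txt") as fh:
        for n, line in enumerate(fh):
            if n and n % CHUNK == 0:
                part += 1
                hl = new_hashlist(f"pwned-v4-part-{part}")
            batch.append((hl, line.strip()))
            if len(batch) >= BATCH:
                flush(batch)
    if batch:
        flush(batch)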
Since Hashtopolis stores all hashes in one table, I doubt that splitting the list into multiple smaller hashlists will make any difference.
MySQL is the bottleneck here and we are aware of that. However, there are currently no plans to address it anytime soon.
What you can do is throw more resources at it and fine-tune your MySQL settings.
To quote a poweruser: "don't get me wrong, I made hashtopolis handle a large list but you really don't want to".
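For reference, the stock InnoDB settings are far too small for tens of millions of rows. A few my.cnf starting points that usually matter for bulk inserts (the values are only illustrative; size them to your RAM and disk):

    [mysqld]
    innodb_buffer_pool_size = 8G        # biggest single win; give it most of your RAM
    innodb_log_file_size = 1G           # a larger redo log smooths heavy insert bursts
    innodb_flush_log_at_trx_commit = 2  # trades a little durability for insert speed
    max_allowed_packet = 256M           # allows large batched INSERT statements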

With that being said, I don't want you to give up cracking the Pwned Passwords. In fact I encourage everyone to have a closer look. You will find... interesting... "passwords".
But Hashtopolis is not the best tool for this job. Have a look at MDXfind. Also keep in mind that all of these hashes were generated from plaintext leaks.
So it's easier to track the leaks down (Google is your friend, don't ask here) and use them as a wordlist than to try cracking the hashes with rules.
Last but not least: all four Pwned Passwords versions can be found on hashes.org with an overall success rate of 99+%.
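As a concrete example of the wordlist approach: the Pwned Passwords lists are unsalted SHA-1, so a plain dictionary run in hashcat (hash mode 100, attack mode 0) against the collected plaintexts is far cheaper than rule-based runs. The file names here are placeholders:

    hashcat -m 100 -a 0 pwned-v4-remaining.txt leaked-plains.txt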
Played around with smaller hashlists yesterday and was still getting the "hashlist is empty" error with as few as 100,000 hashes.

I'll try it out on a local machine that has a better setup for larger MySQL databases and test further.

I was hoping that the way Hashtopolis works with supertasks is that it would treat the whole thing as one hashlist to attack but send smaller chunks of hashes out to the clients, e.g. 100,000 * 1,000 instead of 1,000,000 (or whatever optimal chunk size I come up with).

End goal is to release a lookup table via IPFS. Not a rainbow table, but 16.8M files sharded by the hash's first three hex byte pairs, each only a couple of kilobytes, trivial to transfer and grep for a hash. For example, 75/11/5d.txt would contain "75115dbbb07a5ae772a653329dda3dfe3955ec2e:hashtopolis".
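The sharding step itself is simple. A minimal Python sketch, assuming the cracked results sit in a file named cracked.txt in hash:plaintext format:

    import os

    # Shard "hash:plaintext" lines into 16^6 = 16,777,216 files keyed by the
    # first three hex byte pairs of the hash: 75115dbb... -> 75/11/5d.txt
    # This naive version opens a file per line; for hundreds of millions of
    # lines, sort the input by hash first and write each file in one pass.
    with open("cracked.txt") as fh:
        for line in fh:
            h = line[:6].lower()
            d = os.path.join(h[0:2], h[2:4])
            os.makedirs(d, exist_ok=True)
            with open(os.path.join(d, h[4:6] + ".txt"), "a") as out:
                out.write(line)

Looking a hash up afterwards is then just:

    grep 75115dbbb07a5ae772a653329dda3dfe3955ec2e 75/11/5d.txt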