Hash data too long
#1
I have a password-protected ZIP file.

I generated the hash via John the Ripper:

Code:
/johntheripper_directory/run/zip2john central.zip > central.zip.hash

I cannot attach the hash file here, but it is 465,959 bytes long.

When I try to upload the file as a new hash list, the server returns a 500 Internal Server Error. The Apache2 error log says:

Code:
SQLSTATE[22001]: String data, right truncated: 1406 Data too long for column 'hash' at row 1 in AbstractModelFactory.class.php:367

I checked the DB schema: the column has the TEXT data type, whose maximum length is 65,535 bytes, which is smaller than the given hash.
I think LONGTEXT should be used instead, whose maximum length is 4,294,967,295 bytes.
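A minimal sketch of the schema change, assuming a MySQL/MariaDB backend and that the affected column is `hash` (the table name `Hash` here is an assumption — verify both names against your own schema before running this):

```sql
-- Sketch: widen the column so hashes larger than 65,535 bytes fit.
-- Assumes MySQL/MariaDB; table/column names may differ in your install,
-- and any other column attributes (e.g. NOT NULL) should be preserved.
ALTER TABLE Hash MODIFY COLUMN hash LONGTEXT;
```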


btw, which hash type should I use? It's a ZIP file generated by the "zip" command in macOS Terminal. When running the "zip2john" command, the following line appears:


Code:
ver 2.0 efh 5455 efh 7875 central.zip->floyd.wav PKZIP Encr: 2b chk, TS_chk, cmplen=232921, decmplen=358408, crc=70B1D6B


The candidate Hashcat modes seem to be:

13600 - WinZip
17200 ~ 17230 - PKZIP
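One way to narrow this down is to let Hashcat itself tell you which modes accept the hash. A sketch, assuming `hashcat` is on your PATH and the zip2john output was saved as `central.zip.hash`:

```shell
# Try each candidate PKZIP/WinZip mode; modes that reject the hash will
# print a parser error (e.g. "Token length exception") instead of starting.
for m in 13600 17200 17210 17220 17225 17230; do
    echo "== mode $m =="
    hashcat -m "$m" central.zip.hash --potfile-disable -a 0 /dev/null
done
```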


Attached Files
.zip   central.zip (Size: 227.68 KB / Downloads: 0)
#2
There is a maximum length of 1024 by default for hashes (if I remember correctly). In theory it can be raised in the server config. But Hashcat might refuse to load your PKZIP hash anyway, as it can be too long to attack efficiently on a GPU. John can attack it, since it runs on the CPU only, but with Hashcat you may simply be unable to load such a long hash.

So first make sure that you can even load the hash in Hashcat (with the appropriate 172xx variant for your hash). If it is within Hashcat's limit, you can change the type of the column to LONGTEXT.
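A quick local load check could look like this (a sketch, assuming `hashcat` is on your PATH and the hash file is named `central.zip.hash`; mode 17200 is an assumption until you have confirmed it):

```shell
# Feed the hash to hashcat with an empty wordlist: if the hash parses,
# a session starts and exits cleanly; if it is too long, hashcat prints
# a parser error such as "Token length exception" and loads 0 hashes.
hashcat -m 17200 central.zip.hash --potfile-disable -a 0 /dev/null
```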
#3
Okay, I probably used the wrong tool to generate the hash. So given the ZIP file above, how do I start cracking the password using Hashcat / Hashtopolis?
#4
You used the correct tool. Because your ZIP contains only a single large file, the hash gets that large. As I wrote above, it may be that your hash is not crackable with Hashcat / Hashtopolis due to its length. Take the hash and try it out locally with Hashcat: if it loads there, it will also be possible in Hashtopolis; if not, your way to go is John the Ripper on CPU.
#5
I tried -m 13600 and 17200 - 17230 against the ZIP file using Hashcat directly.

Only 17200 and 17220 reported the hash as okay. However, Hashcat reports the following warning for the NVIDIA GPU:

Code:
* Device #2: Skipping hash-mode 17220 - known CUDA/OpenCL Runtime/Driver issue (not a hashcat issue)
            You can use --force to override, but do not report related errors.

Then I forced Hashcat to use the CPU device with the following command, which works:

Code:
hashcat -a 0 -m 17200 -D 1 --force test.hash wordlist.txt

It seems Hashtopolis would need to support such a long hash in a specific way. Also, the known CUDA / OpenCL runtime issue might need to be handled by the agent.

