hashcat / hashcat-utils

Small utilities that are useful in advanced password cracking

License: MIT License

Makefile 2.36% C 94.46% Perl 3.04% Python 0.15%

hashcat-utils's Introduction

hashcat-utils

Hashcat-utils is a set of small utilities that are useful in advanced password cracking.

Brief description

They are all packed into multiple stand-alone binaries.

All of these utils are designed to execute only one specific function.

Since they all work with STDIN and STDOUT you can group them into chains.

Detailed description

See the hashcat wiki page of hashcat-utils: https://hashcat.net/wiki/doku.php?id=hashcat_utils

Compile

Simply run make

Binary distribution

Binaries for Linux, Windows and OSX: https://github.com/hashcat/hashcat-utils/releases

hashcat-utils's People

Contributors

anthraxx, chrislundquist, chunshengzhao, davidbolvansky, firefart, jamazi, josephshanak, jsteube, matrix, mdawsonuk, mladejir, mou-security, mubix, philsmd, roycewilliams, singe, xanadrel, zerbea


hashcat-utils's Issues

feature: multibyte cutb

When input is UTF-8, cutting on character count - even when it is not aligned with byte count - would be useful. This could be added to cutb as a flag, or could be a separate utility.

This script produces some of the desired behavior:

$ cat multicutb
#!/bin/bash

if [ -z "$1" -o -z "$2" ]; then
    echo "Usage: $0 [offset] [length]"
    echo "(similar to cutb from hashcat utils)"
    exit 1
fi
offset=$1
length=$2

grep -Po "^.{$offset}\K.{$length}"
$ echo Τηεοδ29 | multicutb 1 3
ηεο

(but doesn't cover the negative-offset functionality)

cap2hccapx: What does it require to detect networks?

I have a pcap containing only handshakes (using Wireshark's eapol display filter). When run with this file as input, cap2hccapx prints "Networks detected: 0" and writes nothing. When run with the exact same capture session but no filter (all the packets captured during the session, no eapol filter) it works as expected. What packets other than the handshake does cap2hccapx require to detect networks? And, if you happen to know, what Wireshark filter will include these?

cap2hccapx: Input cap file limitation

When using large cap files for translating into the hccapx format I get errors:

$ cap2hccapx test.cap test.hccapx
test.cap: Value too large for defined data type

test.cap is around 3.3 GB. The binary was compiled from latest source.

[Improvement] Total hashes to compute

Can you add a tool to get the "total hashes" displayed under Progress in hashcat?
Because I distribute work with the -s and -l parameters, and I would like to verify server-side whether the work has been done correctly.

So the tool could work like this: you give the attack type (here brute force), the current mask (and therefore the current length), the skip parameter and the limit parameter, and it displays the first and the last hash number.

Command: "progress -m 0 -a 3 -s 5000 -l 1000000 ?a?a?a?a?a?a?a?a"
Output: "481863613928816/482539223750000"

For example.
Thank you.

"Written 0 WPA Handshakes to ..."

When I try to convert some .pcap files, I am unable to get any of the WPA handshakes I obtained using Pwnagotchi. For some networks it works, but for most of them it reports 0 WPA handshakes. Is it because they're not using the WPA standard and I am using the wrong utility? I have been using the cap2hccapx.exe command.

cap2hccapx - add new feature: ignore replaycount

hashcat has a great new function: --nonce-error-corrections
cap2hccapx doesn't support this new function fully, as the tool checks the replay count and only accepts packets containing the right value.
Please add an option to ignore the replay count check to take advantage of this great new hashcat feature.
Best regards
ZerBea

Cleanup-rules not perfect.. filtering out GOOD rules

This trac ticket #649 was transferred from trac to github:

date: 2015-08-04
reporter: minga
ticket body:
" Cleanup-rules is filtering out some rules oclhashcat thinks are valid.

For example:
$0 $0 $0 $0 $0 $0 $0 (length 20)

It is rejected because it is too long - but in reality, it shrinks to:
$0$0$0$0$0$0$0 which is 14 chars - and DOES work.


some other examples:

$ $0 $3 (append SPACE 0 3)

Notice there are TWO spaces after the first dollar sign. I think this is the
root cause of all the rules that were rejected, but are still valid.

More double spaces:

o8 ]
o8 *01
o8 } *10
o8 *23
o8 $4
o8 ,4 -9
o8 ,5
o8 +6
o8 ] o1e
o8 o6n
o8 o72 Z1
o8 r x32
o8 srR
o8 Y2
o8 y3
o8 Z1 o72

+0 +0 +0 +0 +0 +0 +0 +0 (too long)
$1 $2 $3 $4 $! $@ $# $$ (too long)

Someone's rule generator space-separates all rules, which is causing the problem."

This ticket was accepted by "atom" on the trac ticketing system.

I think there must be a bug that counts each and every space (instead of just the rule functions).
Thanks

BUG report

I was recently using fuzzing to conduct security testing on hashcat-utils, and found a stack-buffer-overflow error in cap2hccapx.bin. The specific information is as follows:

./cap2hccapx.bin poc  /dev/null

=================================================================
==32351==ERROR: AddressSanitizer: stack-buffer-overflow on address 0x7fffffffe18f at pc 0x0000004d1324 bp 0x7ffffffedab0 sp 0x7ffffffedaa8
READ of size 2 at 0x7fffffffe18f thread T0
    #0 0x4d1323 in process_packet /work/autofz/github/hashcat-utils/src/cap2hccapx.c:741:50
    #1 0x4d1323 in main /work/autofz/github/hashcat-utils/src/cap2hccapx.c:1118:5
    #2 0x7ffff6ee583f in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x2083f)
    #3 0x41c068 in _start (/work/autofz/github/hashcat-utils/src/cap2hccapx.bin+0x41c068)

Address 0x7fffffffe18f is located in stack of thread T0 at offset 66511 in frame
    #0 0x4c874f in main /work/autofz/github/hashcat-utils/src/cap2hccapx.c:875

  This frame has 10 object(s):
    [32, 64) 'zero.i.i' (line 593)
    [96, 195) 'auth_packet_orig.i.i' (line 612)
    [240, 290) 'essid.i' (line 747)
    [336, 675) 'excpkt.i' (line 855)
    [752, 758) 'bssid.i' (line 675)
    [784, 834) 'essid' (line 902)
    [880, 904) 'pcap_file_header' (line 924)
    [944, 960) 'header' (line 987)
    [976, 66511) 'packet' (line 1017) <== Memory access at offset 66511 overflows this variable
    [66768, 67161) 'hccapx' (line 1288)
HINT: this may be a false positive if your program uses some custom stack unwind mechanism, swapcontext or vfork
      (longjmp and C++ exceptions *are* supported)
SUMMARY: AddressSanitizer: stack-buffer-overflow /work/autofz/github/hashcat-utils/src/cap2hccapx.c:741:50 in process_packet
Shadow bytes around the buggy address:
  0x10007fff7be0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x10007fff7bf0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x10007fff7c00: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x10007fff7c10: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x10007fff7c20: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
=>0x10007fff7c30: 00[07]f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2
  0x10007fff7c40: f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2 f2
  0x10007fff7c50: f2 f2 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8
  0x10007fff7c60: f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8
  0x10007fff7c70: f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8 f8
  0x10007fff7c80: f8 f8 f8 f8 f3 f3 f3 f3 f3 f3 f3 f3 00 00 00 00
Shadow byte legend (one shadow byte represents 8 application bytes):
  Addressable:           00
  Partially addressable: 01 02 03 04 05 06 07
  Heap left redzone:       fa
  Freed heap region:       fd
  Stack left redzone:      f1
  Stack mid redzone:       f2
  Stack right redzone:     f3
  Stack after return:      f5
  Stack use after scope:   f8
  Global redzone:          f9
  Global init order:       f6
  Poisoned by user:        f7
  Container overflow:      fc
  Array cookie:            ac
  Intra object redzone:    bb
  ASan internal:           fe
  Left alloca redzone:     ca
  Right alloca redzone:    cb
  Shadow gap:              cc
==32351==ABORTING

The poc that triggers the error is as follows: https://github.com/Sunzyuu/seed/blob/main/hashset_cap2hccapx_poc
I hope my report will be of some help to hashcat-utils, thank you!

pull request for makefile for mac?

If I git clone https://github.com/hashcat/hashcat-utils.git && cd hashcat-utils/src/ && make on my Mac, the default Makefile setup does not correctly move the *.bin files to the ./bin folder. I've tried editing the Makefile to fix this on a fork I made of this repo, and it was successful for me. I changed line 7 of the Makefile from

all: clean

to

all: clean native
	mv *.bin ../bin
	cp -a *.pl ../bin

This worked for me. But I'm only just a learner in these matters, and I expect that my edits are probably not the best solution. If what I've done is a solution, should I submit a pull request for it? Otherwise, could you please advise how to correct this behavior on Macs?

Ideally, I'd like to see this repo in homebrew. But I've had a very, very difficult time creating my own formula for this, and I suspect it's because the Makefile behavior here is not working. So, first things first.

Doesn't compile on Mac OS

src badmonstr$ sudo make -v
Password:
GNU Make 3.81
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.

This program built for i386-apple-darwin11.3.0

unable to 'make'

When I run make it just removes all the files in bin and stops. I tried compiling cap2hccapx.c manually but it's still not working.

(I'm fairly new to linux)

combinator and combinator3 character limit warnings

Could we possibly get a warning message printed to STDERR when LEN_MAX is exceeded (see line 161 in combinator)? There has been an issue where this limit is exceeded and combinations are not printed, but no warning message is displayed either. This could also be implemented in combinatorX, where there appears to be a 64-character limit by default.

Thanks so much for the hard work the hashcat team has put into these tools.

Failed to run file

Error:

Failed to execute process 'cap2hccapx.exe'. Reason:
exec: Exec format error
The file 'cap2hccapx.exe' is marked as an executable but could not be run.

Why Combinator is not making all possible combinations?

Hi,

I made three wordlists: 12345 (885 lines/words), qaz (16 lines), wsx (16 lines).

First I combined qaz with wsx (result: qazwsx).

Combining 12345 with qazwsx gives me a wordlist with 226560 lines (correct).

Also, using combinator3 to combine 12345 with qaz and with wsx gives me a wordlist with 226560 lines (correct).

But when I first combine 12345 with qaz (result: 12345qaz, 14160 lines, correct)

and then combine 12345qaz with wsx, I get a wordlist with only 226048 lines (512 lines missing).

Can someone explain to me why I'm getting two different results (they should be exactly the same)?

Is there any restriction on single password length? I noticed that combinator is dropping passwords when they are longer than 36 characters.

I've attached the three base wordlists.

12345.txt
missing lines.txt
qaz.txt
wsx.txt

cap2hccapx - priority to get correct essid and essid_len

cap2hccapx evaluates the beacon and the probe response to get the essid and essid_len. That's not enough:
with hidden SSIDs, malformed packets, and/or packet loss of the probe response, cap2hccapx fails to get the necessary essid.
It's better to get essid and essid_len from the reassociation request or association request, which are sent immediately before the eapol sequence.
cap2hccapx should use the following priority (from high to low) to get essid and essid_len:

  1. re-associationrequest
  2. associationrequest
  3. proberesponse
  4. directed proberequest (proberequest to mac_ap - not to BROADCAST)
  5. beacon

Best regards
ZerBea

add --skip to combinator3 (and combinator2) utility

This feature request was originally reported on the (now obsolete and offline) trac ticketing system of hashcat.net.


Ticket details:
Original reporter (OP): devilsadvocate
Title: Combinator3 feature request
Ticket number: 655
Date reported: 2015-08-30T09:54:46+02:00
Description: [Please a]dd the ability to specify a starting line or word (scanf) for file1. This is useful [for] when the program has to be aborted or killed after running for several days and then restarted. The existing behavior is to start the triple combination all over again from the beginning (line 1).

This feature may also be desirable for [c]ombinator (only 2 files) as well.

Comments:
1. devilsadvocate The workaround is obvious though. "tail output.txt" shows where it left off. File 1 can be copied, renamed, and edited in order to delete the words it has already processed. "./combinator3.bin newfile1.txt file2.txt file3.txt > newoutput.txt" can be used. "cat output.txt newoutput.txt > newtriplecombination.txt" and then "uniq newtriplecombination.txt > newtriplecombination_unique_values.txt" will do the trick.

If changing combinator or combinator3 is too much labor, then this workaround will do in a pinch.

keyspace program is outdated

As brought up in this thread: https://hashcat.net/forum/thread-7437.html

The keyspace program from hashcat-utils gives different results than hashcat itself.
Tested on hashcat 3.6.0; I expect it to be even worse in newer versions of hashcat.

Example:

hashcat64.exe -m 13600 ?l?l?l?l?l?l?l?d?d --keyspace -a 3
30891577600
keyspace.exe -m 13600 ?l?l?l?l?l?l?l?d?d
45697600

Adding @x purge rule

Just a friendly reminder that the @ rule (RULE_OP_MANGLE_PURGECHAR) needs to be implemented in utils.

"Unsupported linktype detected" while using cap2hccapx to convert wireshark pcap to hccapx

I captured a pcap using Wireshark and feeding it into cap2hccapx resulted in "Unsupported linktype detected".

$ cap2hccapx.bin input.pcap output.hccapx <bssid>
input.pcap: Unsupported linktype detected

According to the source:

fprintf (stderr, "%s: Unsupported linktype detected\n", in);

it failed because the linktype in the header is not DLT_IEEE802_11, DLT_IEEE802_11_PRISM, nor DLT_IEEE802_11_RADIO.

Hexdump of the pcap header shows linktype = C0000000 (decimal 192, little endian)

D4C3B2A1 02000400 00000000 00000000 00000400 C0000000
                                             LINKTYPE

http://www.tcpdump.org/linktypes.html shows linktype 192 is DLT_PPI (Per-Packet Information). I have no idea what this is, but is this a bug or a truly unsupported scenario?

undefined reference to `__builtin_bswap16'

Currently the compilation fails in CentOS 6.9 64bit with the following error:

[user@CUDA src]$ make
rm -f ../bin/*
rm -f *.bin *.exe
cc -Wall -W -pipe -O2 -std=c99  -o cap2hccapx.bin cap2hccapx.c
cap2hccapx.c: In function ‘handle_auth’:
cap2hccapx.c:430: warning: implicit declaration of function ‘__builtin_bswap16’
/tmp/ccmTwrAB.o: In function `main':
cap2hccapx.c:(.text+0x325): undefined reference to `__builtin_bswap16'
cap2hccapx.c:(.text+0x33c): undefined reference to `__builtin_bswap16'
cap2hccapx.c:(.text+0x89c): undefined reference to `__builtin_bswap16'
cap2hccapx.c:(.text+0x8b4): undefined reference to `__builtin_bswap16'
cap2hccapx.c:(.text+0x8cb): undefined reference to `__builtin_bswap16'
collect2: ld returned 1 exit status
make: *** [native] Error 1
[user@CUDA src]$ gcc --version
gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-18)
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

[user@CUDA src]$ 

Networks named 💥🖥💥 Ⓟ➃ⓌⓃ🅟❶

💥🖥💥 Ⓟ➃ⓌⓃ🅟❶ is the name of a network in my area. I was curious to see how the special characters affected these tools. I get this message:

tricky-02.cap: Oversized packet detected
Networks detected: 0

when doing

cap2hccapx.exe tricky-02.cap tricky-02.hccapx

on a cap file without a handshake.

I tried the 1.9 version on a file where I am certain there was a handshake and simply got this:
Networks detected: 0
I know that airodump captured the handshake. Could this be an issue with the odd characters of the ssid or am I missing something?

cap2hccapx bad qos manipulation

Hello.

cap2hccapx does not handle QoS packets properly when they contain four addresses instead of three.
I corrected this by changing this portion of code (around line 804):

else if ((frame_control & IEEE80211_FCTL_FTYPE) == IEEE80211_FTYPE_DATA)
  {
    // process header: ieee80211

    //int set = 0;

    //if (frame_control & IEEE80211_FCTL_TODS)   set++;
    //if (frame_control & IEEE80211_FCTL_FROMDS) set++;

    //if (set != 1) return;

    // find offset to llc/snap header

    int llc_offset;

    if ((frame_control & IEEE80211_FCTL_STYPE) == IEEE80211_STYPE_QOS_DATA)
    {
      llc_offset = sizeof (ieee80211_qos_hdr_t);
      // a frame with both ToDS and FromDS set carries a fourth 6-byte address
      u16 tmp = IEEE80211_FCTL_TODS | IEEE80211_FCTL_FROMDS;
      if ((frame_control & tmp) == tmp) llc_offset += 6;
    }
    else
    {
      llc_offset = sizeof (ieee80211_hdr_3addr_t);
    }

    // process header: the llc/snap header

tested and works properly.
Thanks

cap2hccapx should use default "EAPOL-Key Timeout"

For the EAPOL-Key Timeout value, the default is 1 second or 1000 milliseconds.
The available values in version 6.0 are between 200 and 5000 milliseconds, while code prior to 6.0 allows for values between 1 and 5 seconds.
What this means is when it comes time to exchange the EAPOL keys between the AP and client, the AP will send the key and wait up to 1 second by default for the client to respond.
Best regards
ZerBea

cap2hccapx please increase DB_ESSID_MAX

cap2hccapx is limited to a maximum of 1000 ESSIDs.
"State of the art" capturing tools are able to capture more than 1000 networks / ESSIDs in a very short time.
Please increase DB_ESSID_MAX to 50000 (which should be a good value).

Best regards
ZerBea

consider tagging a new release

Hey 😄

it would be awesome if you could consider tagging a new release; not too much was implemented in the meantime, but a release containing the segfault fix would not hurt 🐱
