sniperkid | 07.01.2005 14:48:02
Where can I download a big dictionary (30 MB or more) for free?

CrackX | 07.01.2005 15:10:08
http://www.hackemate.com.ar/wordlists/ - it's free, but I think you have to register.

unknown user | 07.01.2005 15:11:14
http://www.theargon.com/archives/wordlists/theargonlists/ - have phun :>

occasus | 12.01.2005 11:03:56
Hmmm... Let's say I have a 1.2 GB wordlist (file.lst or file.txt). How can I make it work with, for example, ARCHPR? I can't load it into the program, and I can't split it either because my hex editors say there is not enough memory. Does anyone have an idea?

S0410N3 | 12.01.2005 13:10:56
Hi occasus. Here is a little PHP script to split your list file into multiple smaller files. I wrote it quickly, so there is no error checking, but it works; if you have problems, ask me. You need PHP, of course. I assume the words in the file are separated by CR/LF or LF (\r\n or \n). Create a PHP file, put it in the same directory as your list file, then execute it from that directory (php.exe filename.php). Hope this helps. If not, no problem: it at least gives the others an example of manipulating files with PHP.

<?php
// define the number of words per new file created by the split
$maxwords = 1000000;

// change the name to your full word list file
$fp = fopen("fullwordlist.txt", "r");

$nb = 1;
// you can change the beginning name of the new files created
$fp2 = fopen("wordlistsplit" . $nb . ".txt", "w");
$pos = 0;

while (!feof($fp)) {
    $word = trim(fgets($fp, 1024));
    if ($word === "") continue;        // skip blank lines and the final empty read at EOF
    fputs($fp2, $word . "\r\n");       // under Linux you should prefer "\n" alone
    $pos++;
    if ($pos == $maxwords) {           // the current file is full, start the next one
        fclose($fp2);
        $nb++;
        $fp2 = fopen("wordlistsplit" . $nb . ".txt", "w");
        $pos = 0;
    }
}

fclose($fp2);
fclose($fp);
?>

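If the limit you are hitting is on file size rather than word count, the same loop can cut on bytes instead. A minimal sketch along the same lines, assuming a 30 MB per-part target; the threshold and file names here are placeholders to adjust:

<?php
// variant of the split above that cuts on output size instead of
// word count; assumes a ~30 MB per-part target (adjust $maxbytes)
$maxbytes = 30 * 1024 * 1024;
$fp = fopen("fullwordlist.txt", "r");
$nb = 1;
$fp2 = fopen("wordlistpart" . $nb . ".txt", "w");
$written = 0;

while (!feof($fp)) {
    $word = trim(fgets($fp, 1024));
    if ($word === "") continue;        // skip blank lines and the empty read at EOF
    $line = $word . "\r\n";            // under Linux prefer "\n" alone
    fputs($fp2, $line);
    $written += strlen($line);
    if ($written >= $maxbytes) {       // this part is big enough, open the next one
        fclose($fp2);
        $nb++;
        $fp2 = fopen("wordlistpart" . $nb . ".txt", "w");
        $written = 0;
    }
}

fclose($fp2);
fclose($fp);
?>
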
occasus | 12.01.2005 14:45:13
This is really something cool. Many, many thanks, I will try it ASAP.