► How to find hackers' illegal phish-page password files using Google, the very files where all the victims' passwords are stored. As we all know, the vast majority of hackers on the internet are novices or simply script kiddies. They don't understand the underlying concepts; they just follow whatever material is available online to create phishers to hack Facebook, Gmail or other passwords. The result is usually a working phisher, but not a secure one, because nobody teaches the after-phishing steps: how to make phishers undetectable, how to protect the password files where all the hacked passwords are stored, and so on. As most of us know, phishing is the easiest method on the internet to hack Facebook and email account passwords, so most novice hackers opt for it. Some professional hackers use phishing too, but they are a bit more advanced and prefer tabnabbing over plain phishing. Today we will learn something better, something nobody does or tells. What loophole are we digging into today? Any idea? No? So read on.
►LOOPHOLE!!
Most of us who are webmasters, i.e. people who design websites, know the concept of Google indexing, but others might not have a clear idea, so let me explain first. How does any website appear in Google search results? You made the website, but how does Google know about it? All search engines use spider and crawler software to index new websites, and the latest changes to existing ones, in order to give users the freshest results. What gets indexed depends on a file located at the root level of the web host: robots.txt. If that file is not present, Google treats the whole site as indexable. Most people think robots.txt is used to tell Google to index your website, but it is actually used to tell Google which parts of your website you want indexed and which parts you do not. The default robots.txt shipped by most hosts allows full indexing, i.e. all files get indexed, even password and database files.

Woooo! Here lies the loophole. Almost all novice hackers use free web hosting sites to run their phish pages, and those free hosts ship a default robots.txt, which means that when a hacker uploads his phish pages, they get indexed by Google along with the password files sitting next to them. And what do we use to extract smart information from Google? Of course, Google dorks. Now we have to learn how to make our own dorks to extract the hackers' phish-page information.
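To make the idea concrete, here is a minimal sketch of a robots.txt. The file contents and paths are illustrative examples, not taken from any particular host. If a free host serves only the permissive first version, every file on the site, password logs included, is fair game for Google's crawler; the second version shows the kind of Disallow rules that would keep such files out of the index:

# permissive default: every crawler may index everything
User-agent: *
Disallow:

# restrictive version: keep crawlers away from a (hypothetical) logs folder and file
User-agent: *
Disallow: /logs/
Disallow: /passes.txt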
Hackers edit the form action of the login page they have cloned and change the request method from POST to GET so that the submitted credentials arrive in plain text, and then they store them in simple text and HTML files; a quick sketch of this form edit is shown below. Today we are going to learn how to extract those password files using Google. Well, there is not much to learn, because I have already built the dork for you; all you have to do is enter it in Google and you will have access to the hackers' phish-page password files, containing all the passwords they have hacked till now.
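As a rough illustration of that claim, here is what such a tampered login form might look like. Everything here is hypothetical: the file names log.php and passes.txt and the field names are made up for the example. The point is simply that with method="get" the submitted fields travel in plain text in the URL (and so land in server logs), which is why these harvest files end up as plain .txt and .html:

<!-- original form on the real site (sketch) -->
<form action="https://example.com/login" method="post">

<!-- tampered form on the phish page (sketch): the action now points
     at the hacker's logging script and the method is switched to GET -->
<form action="log.php" method="get">
  <input type="text" name="email">
  <input type="password" name="pass">
  <input type="submit" value="Login">
</form>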
► Dork to extract hackers' phish-page password files:
inurl:"passes" OR inurl:"pass" OR inurl:"passwords" OR inurl:"credentials" -search -download -techsupt -git -games -gz -bypass -exe filetype:txt @yahoo.com OR @gmail OR @hotmail OR @rediff
► Just open Google search and enter the above dork; the results will be phish-page password files that Google has indexed from these free hosts.
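You can also tune the dork yourself once you see how it is built: inurl: matches words in the URL path, filetype:txt restricts results to plain-text files, the -word terms throw out common false positives, and the @-domain terms keep only files that actually contain email addresses. For example, to hunt on a single free host (the host name below is just a placeholder, substitute a real one):

site:example-freehost.com inurl:"passes" OR inurl:"pass" filetype:txt @gmail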