Hacker News

maybe google, in their fight against SOPA, can come to the rescue and publicly release a database of all its crawled websites, which could be syndicated somehow via p2p. i mean, they've gotta have this info.


There's no need to ask permission from Google. You can already search the web using fully decentralized search engines like grub.org and yacy.net.


This is a horrible idea. What more could a cracker want than a list of IP addresses that are known web servers? Sure, the big guys are safe. But what about the mom-and-pop shop that pays $20 a month to host their website on a server running a three-year-old version of Apache?

EDIT: Can someone show me specifically what I said that is objectively technically incorrect?

You may disagree with security through obscurity, and I agree with you that obscurity isn't a very strong defense mechanism, but that doesn't change the technical merit of it.


Sorry, what?

What is preventing any 'cracker' from getting a list of IP addresses of web servers? You can VERY easily find out the IP address of anything you please...


There is nothing preventing a cracker from getting a list of IP addresses. There are a number of ways to get the IP address of a specific server: a whois on the domain, for one; even pinging the domain on a *nix OS kicks back an IP address, and a traceroute does the same.

That's not what I'm talking about, though. If Google released a list of all crawled websites and their IP addresses, I could, if so inclined, take those IPs and scan them for known vulnerabilities. Essentially, you're handing an attacker a prequalified list of IP addresses to attack instead of making them work through a whole range.


whois and ping are not good ways to find the IP of a hostname. Use either a DNS library or dig.
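For what it's worth, a forward lookup is a one-liner in most languages. A minimal Python sketch using the system resolver (the programmatic equivalent of `dig +short`, with `localhost` just as an illustrative hostname):

```python
import socket

# Resolve a hostname to an IPv4 address via the system resolver,
# rather than relying on ping/whois side effects.
def resolve(hostname: str) -> str:
    return socket.gethostbyname(hostname)

print(resolve("localhost"))  # typically 127.0.0.1
```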


Thanks. I've never had issues using whois or ping. The point still stands: there are plenty of ways of getting an IP address without needing Google's help.


>If google released a list of all crawled websites and their IP addresses, if I was inclined I could then take those IP's and scan them for known vulnerabilities

You can already do that. You want a list of IPs running websites? Scan port 80. Your port-80 scan will find hosts and add them to the list faster than the vuln scanner can pull them off it.
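The "scan port 80" step is just a TCP connect test, which tools like nmap automate at scale. A minimal sketch of a single check, with the port made a parameter purely for illustration:

```python
import socket

# TCP connect scan: a host "runs a web server" for this purpose
# if something accepts a connection on the given port (80 for HTTP).
def is_open(host: str, port: int = 80, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```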


If your point is that it's not impossible to do, I agree.

My point is that such a list would make this more efficient, not that it's some super-secret 1337 way to hack.


No, my point is that it's already completely trivial. A list from Google wouldn't even be worth having if they did provide it, since you can easily and quickly generate a more extensive one with a single command.


What if I'm a link spammer? Then I'd love to have Google's list. How can your list be more extensive than Google's? That claim is verifiably false. For example, generate a list of 500,000+ IP addresses that have the following characteristics:

1. Host websites that have been indexed by Google with a PR of 2+.

2. Run web servers.

3. Bonus points if you can tell me which web server software runs the most PR 2+ sites.

Now hopefully you can see how useful this list would be! I don't think nmap will do that for you.


You are very confused. Asking google for a list of IPs does not provide you with their page rank. IPs are not domains, nor are they websites.


> You are very confused. Asking google for a list of IPs does not provide you with their page rank. IPs are not domains, nor are they websites

No, you are again correct. But if I know an IP address and the domains associated with it, it's very easy to determine which are PR 2+ websites, and from there to figure out which servers host multiple websites.
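Mapping an IP back to domains is the shakier direction, for what it's worth: reverse DNS returns at most one name (the PTR record), so other virtual hosts sharing the IP won't show up. A quick Python sketch of that limited lookup:

```python
import socket

# Reverse DNS lookup: ask for the PTR record of an IP address.
# Returns at most one hostname, so it misses other virtual hosts
# served from the same IP (passive-DNS data would be needed for those).
def reverse_lookup(ip: str) -> str:
    hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
    return hostname
```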

Will you promise you're not trolling me?


I guess you haven't heard of SHODAN, then... http://www.shodanhq.com/search?q=HTTP



