Writing an HTTP sniffer

I would like to write a program that extracts the URLs of websites visited by a system (an IP address) through packet capture. I think the URL will appear in the data section of the packet (i.e. not in any of the headers: Ethernet / IP / TCP-UDP). (Such programs are sometimes referred to as HTTP sniffers; I am not supposed to use any existing tool.) As a beginner, I have just gone through this basic sniffer program: sniffex.c. Can anyone please tell me in which direction I should proceed?

Best answer

Note: In the info below, assume that GET also includes POST and the other HTTP methods too.

It's definitely going to be a lot more work than looking at one packet, but if you capture the entire stream you should be able to extract the URL from the HTTP headers sent out.

Try looking at the Host header, if it is provided, and also at what is actually requested by the GET. The GET target can be either a full URL or just a path on the server.

Also note that this has nothing to do with getting a domain name from an IP address. If you want the domain name, you have to dig into the data.

Quick example on my machine, from Wireshark:

GET http://www.google.ca HTTP/1.1
Host: www.google.ca
{other headers follow}

Another example, not from a browser, and with only a path in the GET:

GET /ccnet/XmlStatusReport.aspx HTTP/1.1
Host: example.com

In the second example, the actual URL is http://example.com/ccnet/XmlStatusReport.aspx

Other answers

No, the IP address alone is not enough information. A single IP can correspond to any number of domain names, and each of those domains can serve an effectively unlimited number of URLs.

However, look at gethostbyaddr(3) to see how to do a reverse DNS lookup on the IP, to at least get the canonical name for that address.

Update: as you've edited the question, @aehiilrs has a much better answer.

What you might want is a reverse DNS lookup. Call gethostbyaddr for that.

If you are using Linux, you can add a rule in iptables that looks for packets containing HTTP GET requests, from which you can retrieve the URL.

So the rule will look something like this:

For each packet going to port 80 from localhost -> check whether the packet contains a GET request -> retrieve the URL and save it
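As a config sketch of the matching step (assuming the iptables `string` match extension is available and you have root; the log prefix is just an illustrative label):

```shell
# Log outgoing packets to port 80 whose payload contains "GET ".
# Extraction of the full URL still has to happen in userspace,
# e.g. by reading the logged packets or capturing in parallel.
iptables -A OUTPUT -p tcp --dport 80 \
    -m string --string "GET " --algo bm \
    -j LOG --log-prefix "HTTP-GET: "
```

Note that `-m string` matches per packet, so a GET split across TCP segments can be missed; a capture-and-reassemble approach is more reliable.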

Note that this approach will not work for HTTPS: TLS encrypts the entire request, including the headers, so the URL cannot be read from the packets.

Have a look at PasTmon. http://pastmon.sourceforge.net

I was researching something similar and came across this. It could be a good start if you are using Linux: justniffer.

http://justniffer.sourceforge.net/

There is also a nice Python script for grabbing HTTP traffic that would help if you are looking to extract information from HTTP requests.





Related questions
How to set response filename without forcing "save as" dialog

I am returning a stream in some response, setting the appropriate Content-Type header. The behavior I'm looking for is this: if the browser is able to render content of the given content type then it ...

Which Http redirects status code to use?

friendfeed.com uses 302. bit.ly uses 301. I had decided to use 303. Do they behave differently in terms of support by browsers?

Does HttpWebRequest send 200 OK automatically?

Background: I am implementing a PayPal IPN handler. This great article on PayPal states that I am required to send a 200 OK back to PayPal after I read the response. The processing of the IPN request is ...

Java HTTPAUTH

I am trying to connect a desktop application I am writing in Java to the del.icio.us API @ Delicious, simply providing my username and password and asking it to send my bookmarks back ...

Finding out where curl was redirected

I'm using curl to make PHP send an HTTP request to some website somewhere and have set CURLOPT_FOLLOWLOCATION to 1 so that it follows redirects. How, then, can I find out where it was eventually ...
