When the Internet was young and called the ARPAnet, mapping computer names to their IP addresses wasn’t a big deal. Because the network consisted of only a few hundred hosts, a single file called HOSTS.TXT was sufficient. This file contained the name-to-address mapping of every computer on the ARPAnet. Unix computers adapted the HOSTS.TXT format and stored their own version in /etc/hosts – all was fine and dandy.
The HOSTS.TXT file was maintained by the Network Information Center (NIC) and distributed from a single host. Any client would pick up a fresh copy every few days to see if any new hosts had been added to the network. As the network got bigger, problems slowly appeared – here are some of the biggest:
- Traffic – the toll on the SRI-NIC (the computer which held the master copy of HOSTS.TXT) became unbearable. Network traffic and CPU utilization were overloading the host.
- Name Collisions – No two hosts on a network can share the same name. There was no system to enforce this uniqueness of host names – duplicates started to appear in the host list as it grew.
- Consistency – making sure that everyone had the correct version of HOSTS.TXT became extremely difficult. Machines on the far edges of the network would take so long to get an update that the file was often out of date before it even arrived.
It didn’t work: name resolution started to cause havoc on the network as it grew, and mail servers fell over as duplicate names appeared. Hundreds of competing versions of the HOSTS.TXT file caused loads of issues and the reliability of the network plummeted. A new system was needed, and it was needed fast. That system was delivered by a chap called Paul Mockapetris, who released two RFCs – 882 and 883 – which were the first definition of the Domain Name System, or as we mostly know it, DNS. These RFCs have since been superseded many times as security, administration and implementation problems have been identified and rectified.
The Internet as we know it relies not on some huge text file but on the name resolution delivered by the Domain Name System. DNS is simply a huge distributed database: local control of the data is allowed, yet the data is accessible across the whole network through a client/server setup. Now this is where the history lesson finishes – I don’t want to start talking about name servers, resolvers or caching, as you can find that stuff in other places.
Here on theninjaproxy.org we like our information a little more practical – so let’s have a look at a little legacy of the HOSTS.TXT file that is used as a first step of resolution by Windows TCP/IP.
There’s the little fellow – a text file called hosts (on Windows it lives at C:\Windows\System32\drivers\etc\hosts) which is your computer’s first port of call in name resolution, before it falls back on methods like DNS.
It can be used to block or filter websites; hackers use it to infect clients with viruses and trojans by redirecting them to nasty sites. Plenty of places also still use it to make web-based applications work properly or to redirect clients to specific computers.
It’s quite simple to use – here’s a brief illustration. We are going to redirect a web site to a different place using the hosts file –
Let’s redirect our web surfer to somewhere pleasing to the eye – playboy.com. First we find the IP address of the site by pinging it – 184.108.40.206. Next we need to make some simple modifications to our hosts file – you’ll usually need administrator access to alter this file.
You can see we have added a line telling the computer that the site www.google.com can be found at the address 220.127.116.11 (oh no it can’t!).
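In plain text, an entry of that sort is just an IP address, some whitespace, then the host name – one mapping per line (the address here is the one from the example above):

```
220.127.116.11    www.google.com
```

Save the file and the redirect takes effect for that machine only – the hosts file is consulted before any DNS lookup happens.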
Of course you’ve guessed what will happen when anyone tries to visit Google on this computer!
This sometimes doesn’t work so well on bigger sites that rotate their IPs across lots of servers, and you may have to clear your DNS cache with CCleaner beforehand. But you get the idea. Another slight variation is that you can use the hosts file to block access to sites too. Instead of redirecting a site to a different IP address, you can just redirect it to your local computer using 127.0.0.1.
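As an aside, on Windows you don’t strictly need a third-party cleaner for the cache step – the local DNS resolver cache can be flushed from a command prompt (run as administrator):

```
ipconfig /flushdns
```

After that, the next lookup will hit the hosts file and DNS afresh rather than any cached answer.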
For example, perhaps you are getting pissed off about all the adverts served on websites from ad.doubleclick.net – simply add this line to your hosts file.
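The line in question points the ad server at your own machine, so requests for it go nowhere:

```
127.0.0.1    ad.doubleclick.net
```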
This will have the effect of blocking access to that website (and blocking its adverts). It’s a crude but reasonably effective way of blocking access to specific websites on a particular computer. Many companies and schools use this method on public-facing or ‘kiosk’ machines.
Unfortunately hackers use this method too: viruses modify your hosts file to redirect your machine to malicious websites instead of popular sites like Facebook. So it’s always worth checking your hosts file occasionally to see that all is in order.
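If you’d rather not eyeball the file by hand, a short script can do the check for you. This is only a sketch: the watchlist of domains and the file path are assumptions for illustration – adjust both for your own machine.

```python
# Sketch of a hosts-file sanity check: flag any entry that maps a
# well-known domain to an address other than localhost. The watchlist
# and Windows path below are illustrative assumptions.
from pathlib import Path

# Domains that malware commonly hijacks (illustrative watchlist)
WATCHLIST = {"www.facebook.com", "facebook.com", "www.google.com", "google.com"}
LOCALHOST = {"127.0.0.1", "::1", "0.0.0.0"}

def suspicious_entries(hosts_text: str) -> list[tuple[str, str]]:
    """Return (ip, hostname) pairs that redirect a watched domain off-box."""
    flagged = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        parts = line.split()
        ip, names = parts[0], parts[1:]
        for name in names:
            if name.lower() in WATCHLIST and ip not in LOCALHOST:
                flagged.append((ip, name))
    return flagged

if __name__ == "__main__":
    # Typical Windows location; on Unix it's /etc/hosts
    path = Path(r"C:\Windows\System32\drivers\etc\hosts")
    if path.exists():
        for ip, name in suspicious_entries(path.read_text()):
            print(f"suspicious: {name} -> {ip}")
```

Anything it prints deserves a closer look – a legitimate hosts file on most home machines contains little beyond the localhost entries.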