
Article posted on: 2021-08-01 22:10
Last edited on: 2021-08-13 20:57
Written by: Sylvain Gauthier

hosts: boost your productivity with this simple trick!

The Internet is a fantastic tool but it’s also the worst time sink ever created by humans. The last 25 years have seen web devs iterating tirelessly over their designs, with strong financial incentives, to make their CSS/JavaScript-bloated, resource-hungry, obese web pages even more addictive to scroll through.

The red notification badge, the infamous infinite scroll, the notification bell: all of those mechanisms have literally been evolved over thousands of iterations to be as eye-catching and as attention-hungry as possible to our poor, overloaded brains.

Some people mitigate this productivity disaster by devising silly routines like “20 minutes work, 5 minutes twitter/reddit/facebook/4chan/whatever”. Others just gave up during the ubiquitous lockdowns and now spend their days on YouTube, their only brain power going into carefully phrasing their status message of the day to make their colleagues think they’re getting stuff done while retaining plausible deniability.

I say fuck that.

The best way to stop wasting time on all those useless, garbage, time-consuming websites is to deny yourself access to them completely. With the hosts file.

What’s the hosts file

You ask? On Linux, it’s usually /etc/hosts. It’s a simple text file containing redirection information at the domain name level. It’s basically a way to override DNS queries.

When you refresh your Daily Programming Thread on 4chan.org/g/, your browser will issue a DNS query for 4chan.org to get the IP address of the server. Except that first, it’s going to check whether 4chan.org is in the hosts file.

And if it is, it’s going to use the IP address indicated next to it instead of asking the DNS. So yeah, if that IP address happens to be a non-routable one like 0.0.0.0, the connection is just gonna fail.

So basically, you put all those evil little domains in this file, like:

0.0.0.0 4chan.org
0.0.0.0 facebook.com
0.0.0.0 reddit.com
0.0.0.0 twitter.com
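
By the way, you can check that an entry actually takes effect: getent resolves names the same way most programs do, hosts file first (at least on a typical Linux setup where nsswitch.conf lists files before dns):

$ getent hosts facebook.com
0.0.0.0         facebook.com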

But there are so many of those websites, how do I get all of them?

That’s why some people decided it was more efficient to generate this file, which lets you block thousands of domains in the hosts easily.
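
In its crudest form, assuming someone publishes a list that is already in hosts format at some hypothetical URL, “generating” the file is a one-liner:

$ curl -sL https://example.com/blocklist.txt | sudo tee -a /etc/hosts > /dev/null

tee -a appends as root without the redirection-needs-root headache. The URL is obviously a placeholder, substitute whichever list you trust.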

Furthermore, you can put the many thousands of malware/tracking/ad servers in there for much improved security. Some good lads are maintaining lists of such garbage domains, some of them even updated regularly.

There are plenty of those lists around. Some people thought it would be more efficient to merge them all into some sort of super hosts file to rule them all.

Now, this abomination happens to be the most popular of such projects on GitHub, mind you.

Some of its ideas are good, like selecting categories of websites you want to ban and so on. But – call me a reactionary, I don’t really care – I do not see any reasonable explanation as to why you would need thousands of lines of Python, a FUCKING DOCKERFILE, a Code of Conduct (“be nice to each other <3 <3”), a clusterfuck of JSON/cfg configuration files, a freaking continuous integration system – like what the actual fuck are you even testing, mate – to simply concatenate a bunch of text files together, even if you need to download them, reformat them and remove doubles. A hosts file is literally the simplest format possible, `<IP> <domain>`, one entry per line.

Ah yes, and with all that fancy stuff, the repository ends up actually containing all the hosts files it’s supposed to download, and even ships pre-generated hosts files for all the possible combinations of banned categories, like “gambling-social”, “gambling-social-porn”, “social-porn”, etc.

What. The. Fuck.

My solution

So anyway, after a few minutes of alternating between cringing very hard, shaking my fists at clouds, asking God if Dockerfiles everywhere down to the most trivial projects are really part of His plan, and laughing hysterically, I wrote a tiny Makefile and 10 lines of bash that do exactly the same thing but much better, faster and cleaner. The result is here and it’s beautiful.
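
For the curious, the whole thing boils down to something like this. It’s a sketch, not the actual contents of the repo, and the two list URLs are placeholders for whichever blocklists you trust:

#!/bin/sh
# gen.sh -- concatenate blocklists, normalize to "0.0.0.0 domain",
# remove doubles. Writes the result to stdout.
set -e

lists="
https://example.com/ads.txt
https://example.com/tracking.txt
"

printf '127.0.0.1 localhost\n'
for url in $lists; do
    curl -sL "$url"
done |
sed 's/#.*$//' |                 # strip comments
awk 'NF >= 2 && $2 != "localhost" { print "0.0.0.0", $2 }' |
sort -u                          # remove doubles

And the Makefile on top of it:

hosts: gen.sh
	./gen.sh > hosts

install: hosts
	cp hosts /etc/hosts

Run make as a normal user to build, make install as root to overwrite /etc/hosts. That’s it. No Dockerfile.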

After a few months of having blacklisted all those time sinks (I mean, relapse is always lurking one root login + rm /etc/hosts away), I can totally recommend it. It’s a bit frustrating at the beginning but in the long run it’s a massive productivity gain.

Cheers and see you in the next post for the release of a cool little project.