
List Of $1 Hosting Websites - 2015


http://www.aicheapwebhosting.com/
http://www.dollarseohosting.com/
http://www.3ix.org/
http://123systems.net/
http://hostripples.com/
https://www.hobohost.com/
http://www.mytruehost.com/
http://www.linkalone.com/
http://www.hostica.com/
http://www.hostso.com/
http://www.vpshostingdeal.com/
http://hostbandit.com/
http://www.zettahost.com/1-dollar-web-hosting
http://www.hostbudget.com/1dollar-web-hosting.html
http://www.1dollar-webhosting.com/
http://www.hostingdude.com/
http://www.levelhosting.ca/hosting.html
http://www.stablehost.com/
https://www.geekstorage.com/order/web-hosting.html
http://www.cirtexhosting.com/
http://www.hawkhost.com/
http://www.eleven2.com/
http://www.alwayswebhosting.com/
http://www.ninjalion.com/shared.php

How to split a huge log file and only get what you want in Linux?

Today I encountered the following task:

I needed to extract, from a 7.3 GB Apache access log, ONLY the entries starting from 30 August and up to today.

It goes without saying that opening a 7.3 GB file in a text editor is a non-starter; even vi under Linux eats a huge amount of RAM, and it just doesn't work that way.

So to solve this problem, here are the steps I took, thanks to the following reference:

http://stackoverflow.com/questions/3066948/how-to-file-split-at-a-line-number


1. Find the first occurrence of the target date string in the log file and print the first 5 matching lines. Actually one line would be enough, but I just like 5:

   grep -n "30/Aug/2015" access_log | head -n 5

   The returned line will be:

   61445828:203.129.95.51 - - [30/Aug/2015:00:00:01 +0800] "GET <somewebsite>/index.htm HTTP/1.1" 200 10824

   The first item, 61445828, is the line number.

2. Count the total number of lines in the access log:

   wc -l access_log

   The return is: 64328208 access_log.old

3. Now,…
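
The post is cut off at this point, but with the starting line number (61445828) and the total line count (64328208) in hand, one way to finish the extraction is simply to print everything from that line to the end of the file. This is a minimal sketch under that assumption, not necessarily the exact commands from the original post, and the output file name access_log_from_30Aug is just an illustrative choice:

   # keep everything from line 61445828 through the end of the file
   tail -n +61445828 access_log > access_log_from_30Aug

   # equivalently with sed, printing lines 61445828 through the last line
   sed -n '61445828,$p' access_log > access_log_from_30Aug

Since 64328208 - 61445828 + 1 = 2882381, running "tail -n 2882381 access_log" would produce the same range of lines.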