Darwin Biler has a nice tutorial about parsing and sending Laravel logs to Elasticsearch using Logstash.


When your laravel.log file is far too big for you to analyze by hand, it can help to load it into Elasticsearch and analyze it there. This article goes over the basics of how you can do that using Logstash.

Laravel's error and logging features allow us to record application-specific events that can prove useful in analyzing our application's behaviour. But problems arise when, say, you have a 1 GB log file, or 10-15 different Laravel applications each producing an enormous amount of log data. Suddenly, answering questions like the following becomes very difficult:

How many PDOException-related errors occurred last weekend?

How does the number of Log::warning entries generated this month compare to last month's?

Sort the list of Laravel applications by the number of Log::critical entries recorded between March 1 and March 15, 2016, in descending order.
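To make the first question above concrete: once the log entries are indexed, it becomes a single Elasticsearch query. This is only a sketch; the index name (`laravel-logs`) and the field names (`severity`, `log_message`, `@timestamp`) are assumptions that depend on how your Logstash pipeline names the parsed fields.

```json
POST /laravel-logs/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        { "match": { "log_message": "PDOException" } },
        { "match": { "severity": "ERROR" } },
        { "range": { "@timestamp": { "gte": "2016-03-12", "lte": "2016-03-13" } } }
      ]
    }
  }
}
```

The `"size": 0` asks Elasticsearch to return only the hit count, which is all this question needs.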

Answering these kinds of questions is nearly impossible without some sort of tooling. Some people actually try to record these events explicitly in the business logic of the Laravel app itself, but that is bad practice, since it weighs down the application's performance.

It is better to accumulate that data in a simple log file and forward it to a background processing server that can turn the information into a useful form. That way, your application focuses entirely on what it is supposed to do: serve HTTP requests as quickly and efficiently as possible.

Logstash allows us to process those gigantic log files and break them down into manageable parts. It can also monitor the log files for new entries and process them automatically.
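A minimal pipeline along these lines could do the job. This is a sketch, not a production config: the log path, Elasticsearch host, and index name are placeholders you will need to adapt. The grok pattern assumes the default Laravel/Monolog line format, e.g. `[2016-03-01 12:00:00] production.ERROR: Something broke`.

```
# Sketch of a Logstash pipeline for laravel.log (paths and hosts are placeholders).
input {
  file {
    path => "/var/www/app/storage/logs/laravel.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Splits "[timestamp] env.LEVEL: message" into separate fields.
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{LOGLEVEL:severity}: %{GREEDYDATA:log_message}" }
  }
  date {
    # Use the timestamp from the log line as the event time, not ingest time.
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "laravel-logs"
  }
}
```

With the `file` input tailing the log, new entries written by the application are picked up and indexed without any change to the application itself.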