Their credentials in the SEO tools market are beyond reproach, which made the (relatively) recent release of their Log File Analyzer – now up to version 2.0 – an exciting prospect. Log file analysis is an increasingly well-understood component of a technical SEO audit, helping to characterise how Googlebot and other user agents are discovering, crawling, and caching content and pages on a site.
Importing Data
The LFA lets you import the raw .log file exactly as downloaded from the server. There's no need to worry about converting or formatting the data first: simply drag and drop, or use the browse function.
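The tool does all of this parsing for you, but for a sense of what "raw" means here, this is a minimal sketch of reading one line of a server access log. It assumes the common Apache/Nginx "combined" log format; the field names and sample line are my own, not anything from Screaming Frog:

```python
import re

# Regex for the Apache/Nginx "combined" log format (an assumption --
# servers can be configured with custom formats).
COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields for one log line, or None if it doesn't match."""
    m = COMBINED.match(line)
    return m.groupdict() if m else None

# Hypothetical sample line for illustration.
sample = ('66.249.66.1 - - [10/Mar/2019:13:55:36 +0000] '
          '"GET /blog/ HTTP/1.1" 200 5316 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = parse_line(sample)
```

Every downstream view in a log analyser (response codes, user agents, referers, IPs) is built from fields like these.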
The software also allows the import of URL lists in CSV or xls/xlsx format (Excel files pre- and post-2007) – these can then be used to perform matched comparisons against the data from the log file – particularly useful if you have a subset of important URLs that you're looking to investigate.
Analysing Data
The “Overview” tab is an excellent dashboard containing all the key metrics you're likely to track while performing a log file analysis. It also includes helpful line charts that break down the response codes, events, and URLs accessed over the period of the log. These can be filtered by time span and also by the specific bot whose behaviour you're interested in.
The “URLs” view contains the following:
- Last Response Code
- Time of Last Response
- Average Bytes
- Average Response Time (in milliseconds)
- Number of Bot Requests (further broken down into separate columns for commonly analysed bots: Googlebot (and its mobile/smartphone variants), Bingbot, Yandex, and Baidu)

Ease of Use
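Each of those per-URL columns is, conceptually, a simple aggregation over the log entries. A minimal sketch of how such a rollup might be computed – the field and function names are my own, not the tool's:

```python
from collections import defaultdict

def summarise_urls(entries):
    """Aggregate parsed log entries (assumed sorted by timestamp) into
    per-URL stats: last response code, time of last response, average
    bytes, average response time (ms), and bot request count."""
    acc = defaultdict(lambda: {"bytes": [], "resp_ms": [], "bot_hits": 0,
                               "last_code": None, "last_time": None})
    for e in entries:
        s = acc[e["url"]]
        s["last_code"] = e["status"]          # overwritten, so last wins
        s["last_time"] = e["time"]
        s["bytes"].append(e["bytes"])
        s["resp_ms"].append(e["resp_ms"])
        if "bot" in e["user_agent"].lower():  # crude bot check for the sketch
            s["bot_hits"] += 1
    return {
        url: {
            "last_response_code": s["last_code"],
            "time_of_last_response": s["last_time"],
            "average_bytes": sum(s["bytes"]) / len(s["bytes"]),
            "average_response_time_ms": sum(s["resp_ms"]) / len(s["resp_ms"]),
            "bot_requests": s["bot_hits"],
        }
        for url, s in acc.items()
    }
```

The real tool identifies bots by full user-agent matching (and per-bot columns), but the shape of the computation is the same.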
For more specific analysis, the tool's other, more focused tabs offer expanded data:
Response Codes: examine the response codes for each requested URL over the period of the log file – for example, has a page that was requested 96 times returned 96 200 OK responses, or has it been an unpredictable mix of 200 OK and 404 Not Found? The simple true/false ‘Inconsistent’ flag helps you track these pages down for further investigation.
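The logic behind an "inconsistent" flag like this is easy to reason about: a URL is inconsistent whenever it has returned more than one distinct status code over the log period. A minimal sketch (the function name is mine, not Screaming Frog's):

```python
def inconsistent_urls(entries):
    """Return the set of URLs that returned more than one distinct
    response code over the period of the log."""
    codes_by_url = {}
    for e in entries:
        codes_by_url.setdefault(e["url"], set()).add(e["status"])
    return {url for url, codes in codes_by_url.items() if len(codes) > 1}
```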
User Agents: split your data primarily by user agent rather than by individual URL – find out which agents are getting the fastest responses and which header responses are being returned, and in what quantities.
Referers: find the page that provided the link that generated the request, how quickly your site responded, and the number of errors. Useful for discovering where your referral traffic is coming from.
IPs: rather than relying on the user agent to identify the requester, this breaks the data down further to individual IP addresses. Unusual data here could (but doesn't always!) indicate negative SEO attacks or other malicious intent.
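A standard follow-up check for suspicious entries claiming to be Googlebot is Google's documented reverse-then-forward DNS verification: the IP's PTR record should resolve to a googlebot.com or google.com hostname, and that hostname should resolve back to the same IP. A sketch of that check – note this requires network access and is not a feature of the LFA itself:

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname):
    """True if a reverse-DNS hostname belongs to Google's crawler domains."""
    return hostname.endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse-then-forward DNS check, per Google's published guidance.
    Returns False if either lookup fails or doesn't round-trip."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)           # reverse lookup
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]   # forward lookup
    except (socket.herror, socket.gaierror):
        return False
```

An IP that claims a Googlebot user agent but fails this round trip is a genuine candidate for blocking.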
Events: the full and rather intimidating list of events across the time frame of the log, identified by timestamp, method, response code, and user agent.
Imported URL data:
As mentioned earlier, this allows the import of a more curated URL list, which can then be used as comparison data in any of the other tabs via the ‘View’ dropdown – to discover URLs that are missing from either dataset, or to combine the two sources.
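The comparison underlying those views is a straightforward set operation between the imported URL list and the URLs seen in the log. A minimal sketch, with my own naming for the three resulting buckets:

```python
def compare_url_sets(log_urls, imported_urls):
    """Split two URL collections into: URLs in both, URLs only in the
    log, and URLs only in the imported list."""
    log_set, imported_set = set(log_urls), set(imported_urls)
    return {
        "matched": log_set & imported_set,
        "log_only": log_set - imported_set,       # requested, but not in your list
        "imported_only": imported_set - log_set,  # in your list, but never requested
    }
```

The "imported_only" bucket is often the interesting one for SEO work: important URLs that bots never requested over the log period.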
Each site you analyse is stored in a separate project file, but chances are you'll want to export sections of your data and, thankfully, this is straightforward and flexible in the Screaming Frog LFA tool.
Overall, I found Screaming Frog's Log File Analyzer an excellent way to perform an SEO-focused log file analysis. The fact that it's built specifically with a technical SEO user in mind was, for me, a huge boon, and I'd recommend it to anyone looking to attempt this kind of analysis.