I work in a hosted environment where we have multiple servers in a load-balancing farm, each server hosting nearly 40 sites. I am looking for a way to attach a sensor that behaves like the HTTP sensor but only checks a single server-specific instance of a site. From what I can tell, the sensor only accepts the full URL of a site, which is troublesome when just one instance of a site goes down on one of the servers.
Any ideas or input would be appreciated!
Article Comments
Is there a possibility of having an HTTP sensor with custom GET requests? We have another monitoring product that can run an HTTP check against the IP of one of the load-balanced servers, with the GET request targeting the host header of the site.
Example:
HTTP://10.10.10.10 GET / HTTP/1.0^Host:www.mytestsite.com^^
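The same trick can be reproduced with a few lines of Python's standard http.client, in case that helps illustrate what the check above does (a sketch only; the IP and host name are just the example values from this post):

```python
import http.client

def vhost_status(ip, host, path="/", port=80, timeout=10):
    """Open a TCP connection to one specific farm member (by IP) and send
    the Host header of the load-balanced site, mirroring the
    'GET / HTTP/1.0 ... Host:www.mytestsite.com' check above.
    Returns the HTTP status code of the response."""
    conn = http.client.HTTPConnection(ip, port, timeout=timeout)
    try:
        conn.request("GET", path, headers={"Host": host})
        return conn.getresponse().status
    finally:
        conn.close()

# Example call with the values from the post above:
#   vhost_status("10.10.10.10", "www.mytestsite.com")
```

Because the connection goes to the IP while the Host header carries the site name, only that one server's instance of the site is tested, regardless of what the load balancer would do.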
Apr, 2012 - Permalink
In the case of the HTTP Advanced Sensor, I don't see a method of customizing a GET request method (I currently use HTTP Advanced Sensor for most of our HTTP monitoring). Would this be something that could be added to a future release?
Apr, 2012 - Permalink
Dear Sam,
Using customized GET requests is not possible with the sensors that come with PRTG. For such special requests, however, there is the PRTG API, which you can use to write your own executable or script. PRTG can run your file and display the returned results in a sensor of the type "EXE/Script".
For detailed information about custom sensors, see PRTG Manual: Custom Sensors.
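To give an idea of the shape of such a custom sensor: an EXE/Script sensor prints "value:message" on standard output and signals its state through the exit code (0 for OK, 2 for an error, as the curl-based scripts further down this thread also do). A minimal sketch in Python, with a hypothetical command line of the form `sensor.py <url>`:

```python
#!/usr/bin/env python3
"""Sketch of a PRTG EXE/Script custom sensor (the argument layout here
is an assumption, not part of the original thread)."""
import sys
import urllib.request
import urllib.error

def main(url):
    """Fetch the URL and report in PRTG's 'value:message' format.
    Returns the exit code the sensor should use (0 = OK, 2 = error)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            code = resp.status
    except urllib.error.URLError:
        code = 0  # connection failed or non-2xx response
    if code == 200:
        print("0:OK")
        return 0
    print(f"2:NOT OK ({code}) {url}")
    return 2

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(main(sys.argv[1]))
```

The placeholders in the sensor's parameters field (such as %host) can be used to pass the server address into the script.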
Apr, 2012 - Permalink
I resolved this by downloading a Windows port of curl and putting together the PowerShell script below.
Then create each sensor based on the “EXE/Script” custom sensor and pass in two parameters:
'%host' pageurl, e.g. '%host' /index.html
PowerShell code:
$hostIP = $args[0]
$url = "http://$hostIP" + $args[1]   # expect the 2nd argument to have a leading slash
$output = & "C:\curl-7.46.0-win64-mingw\bin\curl.exe" -i -k $url -H "Host: www.luton.gov.uk" -s
$status = $output | Select-String "HTTP/1.1"
if ($status -like "*200 OK*") { $retval = 0 } else { $retval = 2 }
Write-Host ("{0}:{1}" -f $retval, $status)
Hope this is helpful to someone
Jan, 2016 - Permalink
I have changed the code of
frankthomas
(thanks for your first input!!!). Now you can use the same script for every URL. You have to add the URL (www.mysite.nu) in the parameters field.
The parameters field should look like this:
%host www.mysite.nu /index.html (or leave blank after the "/")
$hostIP = $args[0]
$H = '"Host: ' + $args[1] + '"'
$url = "http://$hostIP" + $args[2]   # expect the path argument to have a leading slash
$output = & "C:\curl-7.53.1-win64-mingw\bin\curl.exe" -i -k $url -H $H -s
$status = $output | Select-String "HTTP/1.1"
if ($status -like "*200 OK*") { $retval = 0 } else { $retval = 2 }
if ($retval -eq 0) {
    Write-Host ("{0}:OK" -f $retval)
} else {
    Write-Host ("{0}: NOT OK: {1} [{2}]" -f $retval, $args[1], $hostIP)
}
exit $retval
Apr, 2017 - Permalink
In this case, the content of your individual web servers must additionally be accessible via a direct URL, which you can then monitor with PRTG using the HTTP sensors.
Of course, monitoring the individual web servers directly is also a good idea (for example, via WMI or SSH) to make sure all performance measurements are in a desired range.
Feb, 2012 - Permalink