Checking A List of Sites...


Checking A List of Sites...

Marc Pinnell-3
Hey all,

I have a script that runs through the sites I host, checking response time, whether they are up and operating properly (by verifying that a string exists), and so on. Results are written to a file for later retrieval. I've been wondering whether there is a better, more efficient way to do this. I've also noticed that the "timer" results this script reports often do not match the load times I get with, say, Chrome.

My code is below, feel free to pick it apart! :)

//NOW CHECK THE SITE
protect => {
    local(
        sTime = _date_msec,
        lc = 1
    )

    $urlReturn = (include_url('http://' + #oneSite + '/',
        -sendmimeheaders = array('User-Agent' = 'Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6'),
        -timeout = 3,
        -retrieveMimeHeaders = 'site_headers',
        -options = array(CURLOPT_FOLLOWLOCATION = 1),
        -string = TRUE
    ));

    //CHECK FOR STRING
    with oneString in $matchstrings do {
        if($urlReturn >> #oneString) => {
            $sverified = 'Y'
            loop_abort
        }
    }

    //PROCESS RESPONSE HEADER (pulls the status code, e.g. "200", from the first header line)
    $scode = string_extract($site_headers->split('\r\n')->first->asString, -startposition=10, -endposition=12)

    //ELAPSED TIME (measured after the string check, so it includes that work as well)
    $ltime = decimal(_date_msec - #sTime)
}


marc


Marc Pinnell
1027 Design
PO Box 990872
Redding, CA 96099-0872
530.941.4706
fax: 866.232.5300
www.1027Design.com




Re: Checking A List of Sites...

stevepiercy
Load time is different from response time.  Are you comparing
the exact same things, including any network overhead?

--steve



-- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
Steve Piercy              Website Builder              Soquel, CA
<[hidden email]>               <http://www.StevePiercy.com/>



Re: Checking A List of Sites...

Bil Corry-3
Not to mention that Chrome is loading sub-resources (CSS, JS, images, etc.) and executing them.

- Bil


Re: Checking A List of Sites...

jasonhuck
Have you considered a service such as Pingdom?

https://www.pingdom.com/




Re: Checking A List of Sites...

Marc Pinnell-3
Yes, I have. I actually use it as a basic check on Lasso 8. However, to have it monitor all the sites on the server would be a bit steep at $200/month.

marc



Re: Checking A List of Sites...

Marc Pinnell-3
In reply to this post by stevepiercy
I'm not sure I'm really comparing apples to apples. I'm under the impression that include_url downloads the entire page, including graphics. I should have been a little clearer in my statement about Chrome: in a number of cases my load times (from clicking the URL to the spinner stopping) are SHORTER than the include_url reported time. Not always (probably not even often), but at times, so those could be flukes.

My overall question is/was more about the use of include_url to perform the check for a specific string. Are there better ways to get the same result?

Marc



Re: Checking A List of Sites...

stevepiercy
include_url makes a single request for a single URL; it does not make
subrequests for assets.  It's a wrapper around libcurl with basic features.

For similar comparisons, in Chrome developer tools:

     Network > [X] Disable cache
     Network > select the requested resource > Timing

     For http://www.lassosoft.com/
     592ms

Use curl from the command line to get timing, or adapt this to
Lasso 9's curl methods.
http://curl.haxx.se/docs/httpscripting.html#See_the_Timing

     $ curl --trace-ascii d.txt --trace-time http://www.lassosoft.com/
     $ cat d.txt
     30.079817 - 29.288454 = 791 ms
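
curl's --write-out option can also print the timing variables directly,
without parsing a trace file. A minimal sketch (same idea; -o /dev/null
discards the body so only the status code and timings are shown):

     $ curl -s -o /dev/null -L --max-time 3 \
         -w 'code: %{http_code}  connect: %{time_connect}s  total: %{time_total}s\n' \
         http://www.lassosoft.com/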

You could, if you want, use curl to get all subrequests and
timings, but that's probably not what you want.

Back to the original request, yes, include_url can be used as
part of checking for the existence of a specific string in the response.
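
For comparison, a rough equivalent of that check from the shell, using
plain curl and grep; the URL, temp file, and marker string below are
placeholders, not anything from the script above:

     $ curl -s -L --max-time 3 -o /tmp/site_body.html \
         -w '%{http_code} %{time_total}\n' http://www.example.com/
     $ grep -q 'some expected text' /tmp/site_body.html && echo 'string found' || echo 'string missing'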

I give a single URL to monitoring services (like Pingdom) where
I need to ensure that both Lasso is running and Lasso's database
connectors can reach data sources.
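
As a sketch of that monitoring pattern, a cron job could fetch such a URL
and complain when the expected marker is missing; the URL, marker text,
and address here are hypothetical:

     $ curl -sf -L --max-time 10 http://www.example.com/healthcheck.lasso \
         | grep -q 'DATASOURCE OK' \
         || echo 'health check failed for www.example.com' | mail -s 'site check' admin@example.com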

--steve



Re: Checking A List of Sites...

stevepiercy
In reply to this post by Marc Pinnell-3
master.com.  One custom URL per site.  Free.

--steve



Re: Checking A List of Sites...

Brian K. Middendorf-2
In reply to this post by Marc Pinnell-3

> On Feb 26, 2015, at 3:25 PM, Marc Pinnell <[hidden email]> wrote:
>
> Yes, I have. I actually use it as a basic check on Lasso 8. However, to have it monitor all the sites on the server would be a bit steep at $200/month.

You may want to contact Marc Pope.  I know he is working on a similar but lower-cost service with some innovative features and functionality.

Some of the features I recognize and would be of immediate use to me; others are focused on power users.

I have no doubt he would appreciate your input.

-brian.



