PHP Developers Network

A community of PHP developers offering assistance, advice, discussion, and friendship.
 
PostPosted: Fri Oct 07, 2016 5:55 pm by bowlesj (Forum Contributor)
I submitted a post just before this one with the actual F12 network messages. This is my latest simple try, based on the statement that Chrome only uses this one meta tag. I am wondering if I can also put it in the body a second time. I am also questioning whether I should bother at all, or just go back to using the JavaScript but have it bypass the refresh when running on localhost, since there I know when the data has changed and if I don't see the change I just press F5.

<html>
<head>
<meta http-equiv="cache-control" content="no-store">
</head>
<body>
lots of stuff :-)
<meta http-equiv="cache-control" content="no-store">
</body>
<head>
<meta http-equiv="cache-control" content="no-store">
</head>
</html>


PostPosted: Fri Oct 07, 2016 6:13 pm by bowlesj (Forum Contributor)
So I just did a bunch of quick edits to the static web page using Notepad++, moving it up to the webhost each time. There is a pattern.
When I first change it and refresh the page, I get a 200 OK.
From then on I get a 304 Not Modified.
So it is clearly checking whether the static web page has changed on the webhost. That would be okay if it always worked, but I have seen it fail to pick up new changes to the static page.
I do not know how to recreate the occasional "200 (from cache)" which I get. My reading suggests this means it checked an expiry date/time, found it had not expired, and so used the cache.
I just tried something and I finally got the "200 (from cache)" message. I closed down the browser, then changed the static page on the webhost, but it appears to have gotten the page from cache.

So after a few tests of (closing the browser, changing the static page and moving it to the webhost, restarting the browser and calling the page) I am consistently getting "200 OK (from cache)" and yet it is picking up the change I made. So it is not telling the truth: it is actually getting the data from the webhost. So maybe the message means "next time, the page will come from cache if no change is detected" rather than "the page actually came from cache", which it clearly did not.

So I just saw this "Cache-Control:max-age=0" and I did a few more tests trying to bump this value up using <meta http-equiv="cache-control" content="max-age=500" />. Google Chrome ignores this; it keeps it at zero. So it seems to want to check the file every time, and if there is no change it uses the cache. The problem is that sometimes it fails (maybe once a week) and it is not smart enough to load the changed page.

So I have decided to put the JavaScript code back in, and I sent Google a message. Hopefully they are interested enough to get back to me for more details.


PostPosted: Sat Oct 08, 2016 3:04 am by requinix
Please remove all the <meta> things you added.

I should be more specific:
1. What do the 200 and 304 responses look like when you have not updated the file? Refresh a few times and grab a copy of the headers for both types.
2. What do the 200 and 304 responses look like when you have updated the file? Refresh a few times as well.

In both cases, what "should" be happening is you get a 200 (uncached) the first time you see the file after starting the browser, then either 200 (uncached) or 304 every time after. When you update the file and keep the browser open you should get a 200 (uncached) the next time you refresh, then again a mix of 200s and 304s.

More technically,
1. You load the page, browser requests the page normally, server responds with a 200 and a Last-Modified date.
2a. You refresh, browser requests the page normally, server responds with a 200 and a Last-Modified.
2b. You refresh, browser requests the page with an If-Modified-Since header, server responds with (a) a 304 and a Last-Modified or (b) another 200 with a Last-Modified.
3. That keeps happening until you update the file.
4. You refresh and get a 200 and a Last-Modified (#1).
5. You refresh and get a 200 or 304 (#2a or #2b).

Because contrary to popular belief, you should be using caching. However, the type of caching in use should rely on 304s: that's when the browser requests the page from the server and is told that nothing has changed since last time. A 200 "from cache" is not good because the browser is not contacting the server and thus cannot know if the file has changed; like I think you saw, that's triggered mostly by Expires, while the good caching is done using the If-Modified-Since request and Last-Modified response headers.

If you're seeing a 200 "from cache" then that specifically needs to be addressed, and it is likely the source of your caching problem. However, without an Expires header in the response or one in your HTML (double-check that), I'm not quite sure what would cause it.

(There's another way to do good caching and that's with If-None-Match/ETag headers, but they don't seem to be in use here so don't worry about them. The same basic 200/304 principles apply to those too.)
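
For illustration only (your pages are static HTML, so Apache does this part for you automatically), here is roughly what that Last-Modified/If-Modified-Since handshake would look like if you were serving the page through a PHP script instead. The file name is made up:

Code:
<?php
// Hypothetical sketch: serve a page with Last-Modified and honor If-Modified-Since.
$file         = __DIR__ . '/songlist.html';   // made-up example path
$lastModified = filemtime($file);              // when the file last changed on the server

// Always tell the browser when the content was last modified.
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

// If the browser sent If-Modified-Since and nothing is newer, answer 304 with no body;
// the browser then redisplays its cached copy (the "good" caching described above).
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    $since = strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']);
    if ($since !== false && $since >= $lastModified) {
        http_response_code(304);
        exit;
    }
}

// Otherwise send a normal 200 with the full page.
header('Content-Type: text/html');
readfile($file);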


PostPosted: Sat Oct 08, 2016 10:33 am by bowlesj (Forum Contributor)
Thanks requinix, I will send the three full examples then read the rest of your email and send another response. These examples are consistently what happens, except that over the past few weeks, as I made test changes to my pages and shipped them up, Chrome would about once a week or so fail to detect that the page had changed and thus grab the old info from cache. I think Chrome has an intermittent bug.

First I removed all the extra meta statements. They do nothing. Even the one that is supposed to be used by Chrome does nothing. I think this is a Chrome bug too.

This is what I get when I first open the browser after changing the file in Notepad++ and moving it to the webhost. Closing and reopening the browser always adds "(from cache)": "Status Code:200 OK (from cache)"
Quote:
Request Method:GET
Status Code:200 OK (from cache)
Remote Address:23.111.72.10:80
Response Headers
Accept-Ranges:bytes
Content-Length:51188
Content-Type:text/html
Date:Sat, 08 Oct 2016 15:19:26 GMT
Last-Modified:Sat, 08 Oct 2016 15:17:58 GMT
Server:Apache
Request Headers
Provisional headers are shown
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


I then reload the page with no change to the file (not to be confused with F5 refresh). "Status Code:304 Not Modified"
Quote:
Request Method:GET
Status Code:304 Not Modified
Remote Address:23.111.72.10:80
Response Headers
view source
Connection:Keep-Alive
Date:Sat, 08 Oct 2016 15:22:41 GMT
Keep-Alive:timeout=5, max=100
Server:Apache
Request Headers
view source
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:yourcommonsongjams.com
If-Modified-Since:Sat, 08 Oct 2016 15:17:58 GMT
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


Now I change the file without closing down the browser. I just add a blank line somewhere then press the refresh button. "Status Code:200 OK"
Quote:
Request Method:GET
Status Code:200 OK
Remote Address:23.111.72.10:80
Response Headers
view source
Accept-Ranges:bytes
Connection:Keep-Alive
Content-Length:51189
Content-Type:text/html
Date:Sat, 08 Oct 2016 15:24:38 GMT
Keep-Alive:timeout=5, max=100
Last-Modified:Sat, 08 Oct 2016 15:23:43 GMT
Server:Apache
Request Headers
view source
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:yourcommonsongjams.com
If-Modified-Since:Sat, 08 Oct 2016 15:17:58 GMT
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36



Now I press F5
Quote:
Request Method:GET
Status Code:304 Not Modified
Remote Address:23.111.72.10:80
Response Headers
view source
Connection:Keep-Alive
Date:Sat, 08 Oct 2016 15:32:20 GMT
Keep-Alive:timeout=5, max=100
Server:Apache
Request Headers
view source
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:yourcommonsongjams.com
If-Modified-Since:Sat, 08 Oct 2016 15:23:43 GMT
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


PostPosted: Sat Oct 08, 2016 10:55 am by bowlesj (Forum Contributor)
I read the rest of your response but I need to read it a few times to try and understand it fully.

So Chrome seems to be simply going to the webhost and checking the last-modified date. Based upon my tests, here is the summary.
1/ Status Code:200 OK (from cache) - It gives me this when I close and open the browser again. I double checked this and it just got the actual webpage changes????
2/ Status Code:304 Not Modified - It gives me this if it detects that the file date has not been modified. I assume it is using cache and not getting the page.
3/ Status Code:200 OK - It gives me this if it detects that the file date has been modified. Obviously it is going to get the page here.
4/ Status Code:304 Not Modified - If I press F5 it must be forced to read the file again (not use cache) but it still gives this message if the file has not been modified.

The problem is real as pressing F5 proves:
Every so often chrome thinks there is no change to the file and does not display the latest updates. So I have been pressing F5 when this happens and it then gets my updated page. That is why I put in the javascript code to basically force an F5. I suspect chrome is not detecting a change in the file when I first open the browser but I am not sure. Of course when this occurs I never have the F12 developer tools network information available to me. I tried that last night and got no cigar on that idea :-)

What might be interesting to try would be to leave the JavaScript code out and change the pages daily, hoping to see the bug again; then, when it finally happens, instead of pressing F5 I would click the refresh button. I can do that because the website is not actually live yet. However, it might be live by the following week if I am lucky, and at that point I really need to have the JavaScript code in so that the 21 or so other people looking at the real data do not get tripped up by this bug.


PostPosted: Sat Oct 08, 2016 11:15 am by bowlesj (Forum Contributor)
Okay I am studying your message more closely now.

Quote:
A 200 "from cache" is not good because the browser is not contacting the server and thus cannot know if the file has changed;

But my extra test in the summary (item 1, where I double-checked) shows that it did pick up the changed web page (it is only once in a while that the bug shows up and it fails).


Quote:
However, without an Expires header in the response or one in your HTML (double-check that), I'm not quite sure what would cause it.

It sounds like you are saying I need an expires header in my HTML. I am not sure what this means. I am doing Google searches now to try and learn about this.
Reading this page now: https://gtmetrix.com/add-expires-headers.html and also http://stackoverflow.com/questions/1397 ... es-headers
So I read the first page and part of the second one. I don't have an .htaccess file in the root directory of the webhost. I only have 8 static pages. This website will only get about 15 to 20 people going in once a day to check the main home page's last-update dates; if those have changed since the last time they looked, they will click the updates page to see who registered to play, and if they are interested in joining in they will look at the schedules page to see if they can join in. So the use is very light, maybe 100 to 300 hits per week if that. If it gets more popular with more users then the following summer I may put it out on a VPS requiring them to log in like I was doing before. If I do that, this time I will be registering the users myself rather than having them register themselves.


PostPosted: Sat Oct 08, 2016 12:33 pm by requinix
bowlesj wrote:
Thanks requinix, I will send the three full examples then read the rest of your email and send another response...


So you got: 200 from cache, 304, 200 no cache, 304.

The first 200 is likely ("should be") due to the old file having various caching settings and such. You would need to refresh until you stopped getting that - it means the browser forgot what it had before and reached out to the server again. The next 304 is because the browser reached out, saw no changes, and reused the cached version. That's good.
After you made the change, the next 200 is the correct behavior: browser reached out again, saw changes, and used the new version. The subsequent 304 is the same as before, where the browser was told there were no changes and served the cached copy.

Without adding any caching things, you should always get an uncached 200 or a 304 for every request afterwards - regardless of whether you restart the browser. If that is not the case (ie, you get a 200 from cache) then there is something still applying caching settings... and in those example request/responses you posted, I'm not seeing it.

bowlesj wrote:
1/ Status Code:200 OK (from cache) - It gives me this when I close and open the browser again. I double checked this and it just got the actual webpage changes????

That is what should be happening. You should never see a 200 from cache - always a regular 200 or a 304.

bowlesj wrote:
2/ Status Code:304 Not Modified - It gives me this if it detects that the file date has not been modified. I assume it is using cache and not getting the page.

Meaning: The browser cached the page from the last time you visited, and noted the Last-Modified date that the server reported. The next time it requested the page it included the modification date and gave the server the option of indicating that the page hasn't changed without returning the page's contents again. 304 means there was no change and the browser should redisplay the cached version from before.

bowlesj wrote:
3/ Status Code:200 OK - It gives me this if it detects that the file date has been modified. Obviously it is going to get the page here.

Yes, either there's no cached version at all or the server responded with an updated copy.

bowlesj wrote:
4/ Status Code:304 Not Modified - If I press F5 it must be forced to read the file again (not use cache) but it still gives this message if the file has not been modified.

I'm not sure that hitting F5 always forces an uncached page, but it does tend to disregard the cache more often when you do.

bowlesj wrote:
The problem is real as pressing F5 proves:
Every so often chrome thinks there is no change to the file and does not display the latest updates.

This is the problem we're trying to solve: why Chrome reuses a cached copy when it should not. I promise you that there is not a bug in Chrome regarding caching - rather there is something else going on we haven't identified yet.

The only thing I can think of is that you're using Javascript to alter the page in some way. Is that the case? I don't know the details on this exact behavior but sometimes the cached version will be the original unaltered page and sometimes it is the altered page. If it is caching the altered page then that could account for what you're seeing.

Question: Doing the F5/update the file/F5 stuff from before, what is the request and response for when you see that the page does not appear to have updated?

bowlesj wrote:
So I have been pressing F5 when this happens and it then gets my updated page. That is why I put in the javascript code to basically force an F5. I suspect chrome is not detecting a change in the file when I first open the browser but I am not sure.

Quick correction: Chrome is not detecting any changes at all. It is only reported by the server: it sends the Last-Modified date that it has, then Chrome stores and uses that. The server is what decides whether to return a 200 or a 304. When I've said that it didn't "see" changes, I'm simplifying: it didn't see that the server said that there were changes.

bowlesj wrote:
It sounds like you are saying I need an expires header in my HTML.

Expires is bad. Don't use it. As far as I've been able to determine with what you've said, Last-Modified is perfectly adequate for your usage.


PostPosted: Sat Oct 08, 2016 3:38 pm by bowlesj (Forum Contributor)
Okay, so let me repeat my understanding based upon what you wrote. The browser sends the page name and the last date it has for the page to the server. The server gets these two pieces of info and sees that the latest page it has on its disk has not changed since the date the browser sent, so the server sends the 304 message to the browser (it does not bother sending the page). The browser uses this message to know it should put the cached version of the page on the screen for the user to see. On the other hand, if the server compared the browser's date with the server's file date and saw a difference, then it would not only send a regular "200 OK" but would also send the whole page for the browser to display.

Quote:
If it is caching the altered page then that could account for what you're seeing.

When you say "what I am seeing" I assume you mean my seeing "Status Code:200 OK (from cache)". So maybe here the browser is storing the page in cache first and then displaying it, while with "Status Code:200 OK" it is displaying it first and then storing it in cache???

Quote:
Question: Doing the F5/update the file/F5 stuff from before, what is the request and response for when you see that the page does not appear to have updated?

I have never seen the F12 developer tools network page at the moment I got an old page from cache rather than my updated page from the server. I think what I will do this week is this: I will disable all the JavaScript code that forces a refresh. I will also remove all the meta stuff from all pages. I will set up a new test location on my website (you can see what I mean by going to my home page). I will make many changes during the day and see if I can reproduce this problem again. However, every single time I check I will press F12 first. If I get the old data I will immediately capture the info I pasted in the post above and add it to this thread. Not only that, instead of pressing F5 I will next press the refresh button, see what message I get, and post it here (and of course take note of whether this brings the proper latest data to the screen). Lastly, maybe I will also press F5 and see what the message is.

Quote:
The only thing I can think of is that you're using Javascript to alter the page in some way. Is that the case?

I just checked the static pages that I copy up to the webhost. The JavaScript I showed in the first post was added after I detected the problem. Other than that, the only JavaScript that changes the page is one that unhides and hides a table listing the people attending the event. The others are functions that open another page. I needed to do that because I was having directory problems with storing the static pages in a different directory. In theory I could show you the actual static web pages, except this website does not like the html extension :-) I could paste in snippets of stuff I don't understand that well (mostly HTML stuff). Actually that stuff is in the code below. I basically get it, but I am not sure if it could have an impact. I am using an old version of "WYSIWYG Web Builder" which I like. It serves me well. I think it serves me well :-) I insert PHP code and JavaScript into it where needed.

Syntax: [ Download ] [ Hide ]
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!-- saved from url=(0058)http://localhost/MJN/frmKaraoke_TableLayout.php?TimeSlot=3 -->
<html><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8">

<title>Getting Started Picture</title>
<meta name="generator" content="WYSIWYG Web Builder 9 - http://www.wysiwygwebbuilder.com">


PostPosted: Sat Oct 08, 2016 5:58 pm by bowlesj (Forum Contributor)
I managed to get it reproduced a few times tonight. I have no idea how it does it. It seems to be totally random and it happens on different pages. About the 4th time I finally managed to get the F12 network debug info. Here is something I discovered after writing the stuff below. I thought there may be a time problem on the file as shown by WinSCP versus the times showing in the info below. In one instance if I add 4 hours (central time) to the file time to get the GMT times in the info below I would get the cache time larger/newer than the file time. However in the very last bad one this was not the case. So that theory is out the window.

This is what it gave me when it failed to give the latest update. I used WinSCP to double-check that the properly updated page file was on the webhost server, while at the same time checking the page source to confirm that the page displayed in my browser did not match. This occurred over and over: I had clicked out to a different page in the same browser tab, then used the browser back button to come back to this page. It was not until I hit the refresh button that it corrected itself and got the proper page (see below for the F12 debug info).
Quote:
Request URL:http://yourcommonsongjams.com/
Request Method:GET
Status Code:200 OK (from cache)
Remote Address:23.111.72.10:80
Response Headers
Accept-Ranges:bytes
Content-Length:51586
Content-Type:text/html
Date:Sun, 09 Oct 2016 01:51:38 GMT
Last-Modified:Sun, 09 Oct 2016 01:50:17 GMT
Server:Apache
Request Headers
Provisional headers are shown
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


So then I did a page refresh (not an F5). This time it got the proper update and gave this info.
Quote:
Request URL:http://yourcommonsongjams.com/
Request Method:GET
Status Code:200 OK
Remote Address:23.111.72.10:80
Response Headers
view source
Accept-Ranges:bytes
Connection:Keep-Alive
Content-Length:51587
Content-Type:text/html
Date:Sun, 09 Oct 2016 01:56:38 GMT
Keep-Alive:timeout=5, max=100
Last-Modified:Sun, 09 Oct 2016 01:51:21 GMT
Server:Apache
Request Headers
view source
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:yourcommonsongjams.com
If-Modified-Since:Sun, 09 Oct 2016 01:50:17 GMT
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


Here is another incorrect one.
Quote:
Request URL:http://yourcommonsongjams.com/index.html
Request Method:GET
Status Code:200 OK (from cache)
Remote Address:23.111.72.10:80
Response Headers
Accept-Ranges:bytes
Content-Length:51587
Content-Type:text/html
Date:Sun, 09 Oct 2016 02:16:34 GMT
Last-Modified:Sun, 09 Oct 2016 01:51:21 GMT
Server:Apache
Request Headers
Provisional headers are shown
Referer:http://yourcommonsongjams.com/frmKaraoke_TestPage.html
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


And then a corrected one by pressing the refresh button.
Quote:
Request URL:http://yourcommonsongjams.com/index.html
Request Method:GET
Status Code:200 OK
Remote Address:23.111.72.10:80
Response Headers
view source
Accept-Ranges:bytes
Connection:Keep-Alive
Content-Length:51586
Content-Type:text/html
Date:Sun, 09 Oct 2016 02:19:16 GMT
Keep-Alive:timeout=5, max=100
Last-Modified:Sun, 09 Oct 2016 02:17:52 GMT
Server:Apache
Request Headers
view source
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:yourcommonsongjams.com
If-Modified-Since:Sun, 09 Oct 2016 01:51:21 GMT
Referer:http://yourcommonsongjams.com/frmKaraoke_TestPage.html
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


One more bad one. I am finished for the night.
Quote:
Request URL:http://yourcommonsongjams.com/index.html
Request Method:GET
Status Code:200 OK (from cache)
Remote Address:23.111.72.10:80
Response Headers
Accept-Ranges:bytes
Content-Length:51586
Content-Type:text/html
Date:Sun, 09 Oct 2016 02:25:20 GMT
Last-Modified:Sun, 09 Oct 2016 02:17:52 GMT
Server:Apache
Request Headers
Provisional headers are shown
Referer:http://yourcommonsongjams.com/frmKaraoke_TestPage.html
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36


PostPosted: Sat Oct 08, 2016 11:35 pm by requinix
bowlesj wrote:
Okay, so let me repeat my understanding based upon what you wrote. The browser sends the page name and the last date it has for the page to the server. The server gets these two pieces of info and sees that the latest page it has on its disk has not changed since the date the browser sent, so the server sends the 304 message to the browser (it does not bother sending the page). The browser uses this message to know it should put the cached version of the page on the screen for the user to see. On the other hand, if the server compared the browser's date with the server's file date and saw a difference, then it would not only send a regular "200 OK" but would also send the whole page for the browser to display.

Yes.


I did a bit of digging. Turns out that Chrome (and other browsers) will make their own decisions about caching if all they have to work with is a Last-Modified and no Expires or Cache-Control. Chrome's decision (which mirrors other browsers') is to not revalidate until some time has passed.

You mentioned .htaccess at one point? Can you use one? Try adding to it
Code:
Header set Cache-Control no-cache

You were using "no-store" earlier: the difference is that no-store will not allow the browser to cache anything at all and thus always re-download the content, while no-cache allows it to cache the response but still check for updates with the server. Since the content doesn't change often, some degree of caching is good to have.

But still remove the Javascript refresh and the additional <meta>s you had before. Keep the configuration as minimal as possible so there's no risk of one caching directive overriding or conflicting with another.
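
And just for completeness, if any of those pages were PHP scripts rather than static HTML, the equivalent of that .htaccess directive could be sent from the script itself (illustration only, not something your static pages need):

Code:
<?php
// Cache the response, but always revalidate with the server before reusing it.
header('Cache-Control: no-cache');

// The stricter no-store (what the earlier <meta> used) would forbid caching entirely:
// header('Cache-Control: no-store');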


PostPosted: Sun Oct 09, 2016 4:37 am by bowlesj (Forum Contributor)
Okay, thanks requinix. I will try those things. I take it you did not see anything special in the messages nor in my initial HTML code.

Quote:
no-store will not allow the browser to cache anything at all

I assume you mean for the specific page involved. I am wondering what the difference between "no-cache" and the default is?

Quote:
But still remove the Javascript refresh and the additional <meta>s you had before. Keep the configuration as minimal as possible so there's no risk of one caching directive overriding or conflicting with another.

I am wondering if anyone has actually proven this is a problem with heavy research. Logically Chrome should simply dump anything it does not understand (essentially treating it as a comment). Therefore it should be taking the last one it understands. So as long as you do not duplicate what it understands in the wrong order you should be okay. Not only that, how can one control multiple browsers without including multiple statements? By rights, the statements should be grouped by browser, helping to keep it all clear, with a comment before and after as I often do. For example: "Start Chrome commands", then the commands, then "End Chrome commands"; after that, "Start Firefox commands", then the commands, then "End Firefox commands", and so on. Actually my very first post in this thread has two examples of what I mean regarding comments. Having said that, if the same command has the opposite effect in two different browsers, this creates a problem and there needs to be an if statement to first detect which browser is running (does the problem exist, and can HTML have if statements?).


Last night I got a string of them all of a sudden, whereas the day before I could not get any at all. Last night I did not get 100% failures however (I still got a lot of immediate detection of a change to the file). So what was the difference between the two nights, with 0% bad the night before and roughly 10% bad last night? The difference was that I had NetBeans running and all calls were made from NetBeans. So maybe NetBeans is adding a random factor. I am 62 and I have been programming since age 26, so I know all too well that significant changes to software greatly increase the chances of new bugs. I also know that NetBeans is probably very complex and large (it locks up a fair bit). So this is my best guess as to what the problem might be. Because it is only a guess, today with fresh energy I am going to run 10 or 20 changes with NetBeans and then follow that with 10 or 20 without. The next day I will reverse the order (NetBeans second). I am going to do it with different pages, but I will keep the page tests in exactly the same order to try and isolate NetBeans as the differing factor as best I can. If I consistently get 100% success without NetBeans and maybe 10% failures with NetBeans, then I will assume NetBeans is causing it. I might also repeat the tests with your changes. Once it is all done I will post the results.


PostPosted: Sun Oct 09, 2016 5:33 am by requinix
bowlesj wrote:
Okay, thanks requinix. I will try those things. I take it you did not see anything special in the messages nor in my initial HTML code.

Right: there was nothing in the headers telling Chrome to cache results, which meant it was deciding to do so by itself.

bowlesj wrote:
I assume you mean for the specific page involved. I am wondering what the difference between "no-cache" and the default is?

The default is... the browser does whatever it wants. Dynamic pages (eg, PHP) generally don't include any caching instructions so the browser picks some reasonable default behavior. Static pages generally don't include caching instructions either, but they do include things like Last-Modified dates or ETag identifiers and the browser can make decisions according to those.

bowlesj wrote:
I am wondering if anyone has actually proven this is a problem with heavy research. Logically Chrome should simply dump anything it does not understand (essentially treating it as a comment). Therefore it should be taking the last one it understands.

Maybe. Or maybe it takes the first one and ignores the rest. Or maybe it tries to find a common denominator between the various settings.

I don't know of any test about that behavior. Wouldn't be surprised if there was. quirksmode might have one.

bowlesj wrote:
So as long as you do not duplicate what it understands in the wrong order you should be okay.

I really don't know. My advice would be to pick one caching strategy (eg, using modification dates) and don't implement the others.

bowlesj wrote:
Not only that, how can one control multiple browsers without including multiple statements?

You don't. If you're explicit about caching then all the browsers will behave the same way, because there's clearly defined behavior they're supposed to follow. When you aren't explicit then they don't have any guidelines and have to make educated guesses about what they should do.


PostPosted: Sun Oct 09, 2016 8:49 am by bowlesj (Forum Contributor)
Well, NetBeans and XAMPP are eliminated as possible causes of the problem. I did a fresh reboot, no other websites being run, as few programs as possible. The results look totally random as far as I can see, meaning it does not care which page it is. I will repeat the same tests later today with some of your suggestions, and without NetBeans and without XAMPP. If I can't get it to work I will have no choice but to put the JavaScript code in place. It is not as if these are huge pages, nor as if there is major user interaction (must think practical). The bottom line is the users have to get the last update every single time without fail. If it fails I might try Firefox just out of curiosity (an initial pass to set the cache, then a second with all the changes). Chrome is preferred because of one very important feature, however.

Basic test rules:
To change the pages open all pages with notepad++ in the order indicated below so you can do this fast.
Change the page before you open it every single time.
If it fails, first double-check to make sure you actually moved the page up using WinSCP.
If you moved it up then double verify by clicking refresh to see if the change comes through.
It must display all page changes perfectly first time to obtain a pass (including all 9 changes to the test page which is a very simple page).

Home page
schedule 1
schedule 2 (you got an error without netbeans)
updates 1
updates 2
layout 1
layout 2
no-stage instructions.
test page 1 (it failed without netbeans and without xampp)
test page 2
test page 3
test page 4
test page 5
test page 6
test page 7
test page 8
test page 9
go out to the test page and come back to the home page with the previous page button but change the home page.
Do this 5 times with 1,2,3,4,5 and set it back.

Ongoing Updates:
Firefox does the same thing.
Reading an HTTP caching tutorial straight from the source (Google): https://developers.google.com/web/funda ... tp-caching
After reading this from the above tutorial
Quote:
The only thing left to do is to ensure that the server is providing the necessary ETag tokens. Check your server documentation for the necessary configuration flags.

It seems that maybe the webhosting company does not have these configuration flags set up properly, since neither Chrome nor Firefox passes the test?
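
From what I understand of that tutorial, the ETag check works the same way as the Last-Modified check, just with a content fingerprint instead of a date. My rough sketch of the idea in PHP (illustration only; for my static pages the webhost's Apache would have to be the one doing this, and the file name here is made up):

Code:
<?php
// Hypothetical sketch of ETag / If-None-Match revalidation.
$file = __DIR__ . '/index.html';        // made-up example path
$etag = '"' . md5_file($file) . '"';    // fingerprint of the current file contents

header('ETag: ' . $etag);

// If the browser sends back the same fingerprint, nothing changed: answer 304.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    http_response_code(304);
    exit;
}

// Otherwise send the full page as a normal 200.
readfile($file);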


PostPosted: Sun Oct 09, 2016 7:41 pm by bowlesj (Forum Contributor)
Finally I got the test to work. That is a total of 26 page changes made very quickly and all picked up by Chrome. The meta code is below, inserted in the main header only.

Syntax: [ Download ] [ Hide ]
<!-- START: HTML code to make sure your page is not stored in cache so that you do not need to tell them to use F5 -->
<!-- https://developers.google.com/web/funda ... tp-caching    NOTE: this is a good tutorial -->
<!-- http://www.i18nguy.com/markup/metatags.html   NOTE: this has a good explanation of all the commands and why they are needed -->
<!-- http://www.htmlgoodies.com/beyond/refer ... he-Huh.htm NOTE: Microsoft IE only; you may need these commands again. -->
<!-- http://stackoverflow.com/questions/4954 ... l-browsers   NOTE: this gives a more complete set of commands to use -->
<meta http-equiv="cache-control" content="no-cache, public, max-age=86400">
<meta http-equiv="expires" content="0">
<meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT">
<meta http-equiv="pragma" content="no-cache">
<!-- END: HTML code to make sure your page is not stored in cache so that you do not need to tell them to use F5 -->
 


However, I did message my webhost support asking them to check that the flags were turned on for cache control. They never got back to me to indicate they did anything so I assume they have not done anything. If that is the case then I think the order of these commands may have been incorrect.

I may run this test a few days in a row including with firefox.


Last edited by bowlesj on Mon Oct 10, 2016 5:26 pm, edited 1 time in total.

PostPosted: Mon Oct 10, 2016 11:56 am by bowlesj (Forum Contributor)
For a second day in a row, Chrome passed the test shown above with the meta statements in place (another 26 tests). So I am assuming the problem is solved for Chrome. Firefox failed immediately on the first item in the list (even though I reset all the cache by going through all the pages and pressing the refresh button), so I will be going directly to the source for Firefox as well to try and figure out why. Maybe the page below is the best one; I will look at it later.
https://developer.mozilla.org/en-US/doc ... TP/Caching


Last edited by bowlesj on Mon Oct 10, 2016 5:26 pm, edited 1 time in total.
