Going back to the previous page refreshes the whole page
When browsing certain pages (specifically Reddit.com), going back to the previous page refreshes the whole page, which re-expands the comments I've collapsed, so I can no longer find the comment I was looking at a moment ago.
Currently I do not have any add-ons activated, yet if I start Firefox in safe-mode the problem goes away.
I have tried to refresh Firefox, but this did not solve my problem.
I have also cleared my cache and this did not solve my problem either.
This is a problem that seems to come and go from time to time, for no reason I can identify.
All Replies (20)
I get around this by opening a New Tab to the link I want to check.
FredMcD said
I get around this by opening a New Tab to the link I want to check.
That's not how I want to browse, though, and it doesn't really fix the problem; it just accepts it and works around it. I know it can work the way I want, I just don't know how to make it so! :)
SuiTobi said
When browsing certain pages (specifically Reddit.com), going back to the previous page refreshes the whole page, which re-expands the comments I've collapsed, so I can no longer find the comment I was looking at a moment ago. Currently I do not have any add-ons activated, yet if I start Firefox in safe-mode the problem goes away. I have tried to refresh Firefox, but this did not solve my problem. I have also cleared my cache and this did not solve my problem either. This is a problem that seems to come and go from time to time, for no reason I can identify.
Going back does not refresh the page if it's in a pinned tab, though.
I'm sometimes puzzled by what Firefox caches and what it doesn't. When I open a comments page, it is served with:
Cache-Control: private, max-age=0, must-revalidate, no-cache
But Firefox stores numerous pages in a "fast back-forward cache" even when the server specifies no-cache, unless it also specifies no-store. So that might explain why I can view images from a reddit page and go back without a reload.
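To see what a given Reddit URL is actually served with, a quick check outside the browser can help. This is only a minimal sketch, assuming Node 18+ with a global fetch; the no-store test just encodes the behaviour described above (no-store keeps a page out of the fast back-forward cache, no-cache alone does not), and the URL is an arbitrary example.

```ts
// Minimal sketch (assumes Node 18+ with a global fetch): print a URL's
// Cache-Control header and flag whether it contains no-store, which, per the
// description above, is what actually keeps a page out of Firefox's fast
// back-forward cache (no-cache alone does not).
async function checkCacheControl(url: string): Promise<void> {
  const response = await fetch(url, { redirect: "follow" });
  const cacheControl =
    response.headers.get("cache-control") ?? "(no Cache-Control header)";
  console.log(`${url}\n  Cache-Control: ${cacheControl}`);
  const hasNoStore = /\bno-store\b/i.test(cacheControl);
  console.log(`  contains no-store (would block the fast back-forward cache): ${hasNoStore}`);
}

checkCacheControl("https://www.reddit.com/r/firefox/").catch(console.error);
```

Keep in mind the headers a script sees can differ from what the browser gets (cookies, user agent), so treat the output as a rough indication only.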
I wonder why yours behaves differently in regular mode and Safe Mode? Could you check the Extensions list? It is Extensions that are disabled in Safe Mode. Either:
- Ctrl+Shift+a
- "3-bar" menu button (or Tools menu) > Add-ons
In the left column, click Extensions. Then on the right side, is there anything that seems as though it might affect caching?
I like a refreshed page when I go back and forth between pinned tabs
jscher2000 said
I'm sometimes puzzled by what Firefox caches and what it doesn't. When I open a comments page, it is served with:
Cache-Control: private, max-age=0, must-revalidate, no-cache
But Firefox stores numerous pages in a "fast back-forward cache" even when the server specifies no-cache, unless it also specifies no-store. So that might explain why I can view images from a reddit page and go back without a reload. I wonder why yours behaves differently in regular mode and Safe Mode? Could you check the Extensions list? It is Extensions that are disabled in Safe Mode. Either:
- Ctrl+Shift+a
- "3-bar" menu button (or Tools menu) > Add-ons
In the left column, click Extensions. Then on the right side, is there anything that seems as though it might affect caching?
As mentioned, even without ANY add-ons installed, I still have the problem. That is why I'm really confused as to why it works in safe-mode, since that usually means it's an add-on problem.
Safe Mode affects a range of other things, but they just don't seem very relevant... http://kb.mozillazine.org/Safe_Mode
jscher2000 said
Safe Mode affects a range of other things, but they just don't seem very relevant... http://kb.mozillazine.org/Safe_Mode
Yeah.. That's why I hoped someone else would have a brilliant solution :D
Odd. Now safe-mode doesn't work anymore. I am out of ideas.
I have found a temporary solution to the problem by following this guide.
It's not a safe solution as far as I understand, since it makes Firefox somewhat vulnerable, but it does fix my problem. I will leave the support thread unsolved for now, in case anyone finds a proper solution.
Oh, you didn't mention using HTTPS, so apparently my testing did not match your scenario.
I think what that rule does is okay in this case. You wouldn't want to override anti-caching on sites that contain personal/confidential information. But the pages you generally find on Reddit are not very sensitive, so if they linger on disk, it's not a big deal, is it?
jscher2000 said
Oh, you didn't mention using HTTPS, so apparently my testing did not match your scenario. I think what that rule does is okay in this case. You wouldn't want to override anti-caching on sites that contain personal/confidential information. But the pages you generally find on Reddit are not very sensitive, so if they linger on disk, it's not a big deal, is it?
Most sites use HTTPS nowadays, don't they? I figured *that* wasn't the direct problem, especially since Reddit has been using HTTPS for a long while (and my problem only started recently, after a long period where it was gone).
Reddit is basically a forum for threads, posts and discussions, which involves hardly any personal or confidential information. As far as I understand, this fix only affects Reddit, so it should definitely not be a big deal :)
Reddit now exhibits the same problem as before, even though I'm still using the fix mentioned previously in this thread. Hope someone with some wits can help me out!
Other people are mentioning it here: https://www.reddit.com/r/firefox/comments/2vdd68/improve_reddits_performance_in_firefox/
If you use the Web Console to look at the caching headers, has the add-on stopped working, i.e., stopped modifying the cache-control header? While on a Reddit page, you can open the Web Console by pressing Ctrl+Shift+k or using the Developer menu. Then reload the page and various URLs should be listed in the console. (If URLs don't appear, click the Net button above the console once or twice.) If you scroll back up to the first URL and click it, you should see the Response Headers including the caching line.
Mine shows this:
Cache-Control: max-age=0, must-revalidate
I'm not sure why I am not getting no-cache today.
I didn't check comments pages, which were mentioned in the above thread.
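If clicking through the Net panel is fiddly, a rough alternative is to ask for the header from the console itself. This is just a sketch; it fires a fresh request, so the header it reports may not exactly match the one on the original document response (the Net listing is still the authoritative view).

```ts
// Paste into the Web Console while on the Reddit page. Note this issues a
// *new* request for the current URL, so the Cache-Control value it prints can
// differ from the one received when the page was originally loaded.
fetch(location.href, { credentials: "include" })
  .then((response) => {
    console.log("Cache-Control:", response.headers.get("cache-control"));
  })
  .catch((error) => console.error("Header check failed:", error));
```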
jscher2000 said
Other people are mentioning it here: https://www.reddit.com/r/firefox/comments/2vdd68/improve_reddits_performance_in_firefox/
If you use the Web Console to look at the caching headers, has the add-on stopped working, i.e., stopped modifying the cache-control header? While on a Reddit page, you can open the Web Console by pressing Ctrl+Shift+k or using the Developer menu. Then reload the page and various URLs should be listed in the console. (If URLs don't appear, click the Net button above the console once or twice.) If you scroll back up to the first URL and click it, you should see the Response Headers including the caching line.
Mine shows this:
Cache-Control: max-age=0, must-revalidate
I'm not sure why I am not getting no-cache today. I didn't check comments pages, which were mentioned in the above thread.
Thanks for helping out once again :)
Mine shows this:
Cache-Control: private, s-maxage=0, max-age=0, must-revalidate, max-age=0, must-revalidate
Well, it seems that setting it to null (blank) is not working. Is there any change in the URL? The rule only applies to www.reddit.com (exactly) and not, for example, reddit.com or abc.reddit.com.
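To make the scope of that rule concrete, here is a small illustration of exact-host matching (just a sketch, not the add-on's actual matching code):

```ts
// Illustration only: a rule keyed to the exact host "www.reddit.com" never
// fires for the bare domain or for other subdomains.
const ruleHost = "www.reddit.com";

function ruleApplies(url: string): boolean {
  return new URL(url).hostname === ruleHost;
}

console.log(ruleApplies("https://www.reddit.com/r/firefox/")); // true
console.log(ruleApplies("https://reddit.com/"));               // false
console.log(ruleApplies("https://abc.reddit.com/"));           // false
```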
jscher2000 said
Well, it seems that setting it to null (blank) is not working. Is there any change in the URL? The rule only applies to www.reddit.com (exactly) and not, for example, reddit.com or abc.reddit.com.
Not that I can see, but I'm not entirely sure I understand what you're asking.
Someone in the discussion thread on Reddit mentioned that some parts of the site are now on np.reddit.com instead of www.reddit.com, and you may need to add a second rule as a result. But this is all second-hand; I haven't tested it myself.
jscher2000 said
Someone in the discussion thread on Reddit mentioned that some parts of the site are now on np.reddit.com instead of www.reddit.com, and you may need to add a second rule as a result. But this is all second-hand; I haven't tested it myself.
It's actually not many parts that use np, and my problem is on the regular www anyway.
Bump for help :(