How to force browsers to fetch new files instead of cached copies (images, CSS, JS, etc.)

I recently ran into a problem with a webpage whose functionality was largely powered by JavaScript. The JS file was modified to restrict access to a few buttons on the front end; however, users could still reach them because their browsers were loading the older, cached version of the file.

There is an easy workaround: append a GET parameter to your file paths.

For example, say you were linking to this file:

http://aliishaq.net/js/example.js

Simply add a random string as a GET parameter:

http://aliishaq.net/js/example.js?v=123132132

Because the browser now sees this as a new URL, it will request the file again, and the updated file will be fetched instead of the cached copy.
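The pattern above can be sketched as a small helper; the function name `withVersion` is my own, and it uses the standard `URL` class available in modern browsers and Node.js:

```javascript
// Hypothetical helper (name is illustrative): append a version
// parameter to an asset URL so browsers treat it as a new resource.
function withVersion(url, version) {
  const u = new URL(url);
  u.searchParams.set('v', String(version));
  return u.toString();
}

withVersion('http://aliishaq.net/js/example.js', '123132132');
// → 'http://aliishaq.net/js/example.js?v=123132132'
```

The server ignores the extra parameter and serves the same file; only the browser's cache key changes.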

If you want the browser to always fetch a fresh copy, you can use PHP to append a random string to the path, producing a new URL on every page load, like so:

http://aliishaq.net/js/example.js?v=<?php echo rand(); ?>
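A client-side equivalent of this PHP trick (a sketch; the function name is my own) uses a timestamp instead of a random number:

```javascript
// Sketch of the same idea in JavaScript: a timestamp-based
// cache-buster, so every call produces a URL the browser has
// never requested before and therefore cannot serve from cache.
function freshUrl(url) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + 'v=' + Date.now();
}
```

For example, `freshUrl('http://aliishaq.net/js/example.js')` yields something like `http://aliishaq.net/js/example.js?v=` followed by the current timestamp.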

A better approach, though, is to assign versions to your files, since a random value on every page load means the browser re-downloads the file every time and you lose the benefit of caching entirely. For example, your initial file can be

http://aliishaq.net/js/example.js?v=1

and whenever you change the file, you bump the version to something like

http://aliishaq.net/js/example.js?v=2

I hope this is helpful and saves you from the kind of problem I faced.