Ok, I have a website where users post summary information about an article along with a link to the article itself. Anyone can then search these posts. Some of the links invariably go dead, so I would like to save a copy of each linked page on the server; then, if a link dies, a user can click something like 'View Archived Content' and still see the original article.
PDFs are no problem; HTML is. I tried using PHP to download the HTML file the link points to, but that way you don't get the images. What I'd like is something similar to what a browser does when you save a web page together with all of its images, but I obviously need to do it in PHP.
Any suggestions would be greatly appreciated, thanks!
Saving a cached copy of a link
You could use file_get_contents() to fetch the HTML, then parse the string for image references. You can then call file_get_contents() again for each image and save the files to your server.
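A minimal sketch of that approach, using DOMDocument (bundled with PHP) to find the img tags instead of hand-rolled string parsing. The function names and the relative-URL handling here are illustrative assumptions, not something from this thread — real pages may need proper base-path resolution, error handling, and a user-agent header:

```php
<?php
// Collect the src attributes of all <img> tags in a chunk of HTML.
// Relative paths are naively glued onto $baseUrl; a production version
// should resolve them against the page's real base path.
function extractImageUrls(string $html, string $baseUrl): array
{
    $doc = new DOMDocument();
    // Suppress warnings that real-world (malformed) HTML tends to trigger.
    @$doc->loadHTML($html);

    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $src = $img->getAttribute('src');
        if ($src === '') {
            continue;
        }
        if (!preg_match('#^https?://#i', $src)) {
            $src = rtrim($baseUrl, '/') . '/' . ltrim($src, '/');
        }
        $urls[] = $src;
    }
    return $urls;
}

// Save the page HTML plus its images into $dir (e.g. "archive/1234/").
function archivePage(string $url, string $dir): void
{
    $html = @file_get_contents($url);
    if ($html === false) {
        return; // link is already dead, nothing to archive
    }
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true);
    }
    file_put_contents($dir . '/index.html', $html);

    foreach (extractImageUrls($html, $url) as $imgUrl) {
        $data = @file_get_contents($imgUrl);
        if ($data !== false) {
            $name = basename(parse_url($imgUrl, PHP_URL_PATH));
            file_put_contents($dir . '/' . $name, $data);
        }
    }
}
```

Note that the saved index.html will still point at the original image URLs; to make the archived copy fully self-contained you would also rewrite each src attribute to the local filename before saving.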
Keep in mind that if you're pulling this content from an external site, you need to make sure you have permission to store the data (including the images) on your own site, so as to avoid any copyright violations.