PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!
Moderator: General Moderators
dappa
Forum Newbie
Posts: 5 Joined: Sun Feb 04, 2007 1:36 pm
by dappa » Sun Jul 01, 2007 1:51 pm
How can I fetch a whole web page and then echo only the text and links that the page has, excluding all pictures and fancy layout?
So what happens for the user is: he inputs some URL and then sees only the text and links that the page contains.
alex.barylski
DevNet Evangelist
Posts: 6267 Joined: Tue Dec 21, 2004 5:00 pm
Location: Winnipeg
by alex.barylski » Sun Jul 01, 2007 2:01 pm
fopen() w/ allow_url_fopen enabled and some regex.
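Something like this might do it. Untested sketch: `fetch_url()` and the demo file are made up for illustration, and for a real page you'd pass an http:// URL (which needs allow_url_fopen = On in php.ini):

```php
<?php
// Read a whole resource through fopen()/fread().
// With allow_url_fopen enabled this also works on http:// URLs,
// e.g. fetch_url('http://example.com/').
function fetch_url($url)
{
    $handle = @fopen($url, 'r');
    if ($handle === false) {
        return false;
    }
    $contents = '';
    while (!feof($handle)) {
        $contents .= fread($handle, 8192);
    }
    fclose($handle);
    return $contents;
}

// Demo on a local file so the sketch runs anywhere; swap in a real
// URL once allow_url_fopen is on.
$tmp = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($tmp, '<p>Hello <b>world</b></p>');
echo strip_tags(fetch_url($tmp)), "\n"; // prints: Hello world
unlink($tmp);
?>
```

strip_tags() already throws away all the markup for you, so you only need regex for the parts you want to keep (like the links).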
s.dot
Tranquility In Moderation
Posts: 5001 Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana
by s.dot » Sun Jul 01, 2007 2:01 pm
I don't see a valid reason for doing this.
dappa
Forum Newbie
Posts: 5 Joined: Sun Feb 04, 2007 1:36 pm
by dappa » Sun Jul 01, 2007 2:12 pm
scottayy wrote: I don't see a valid reason for doing this.
Doing this allows me to browse websites on my cell phone, which has a simple WAP function but no real web browser.
s.dot
Tranquility In Moderation
Posts: 5001 Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana
by s.dot » Sun Jul 01, 2007 2:15 pm
Oh.
file_get_contents() and a regex matching <a></a> tags would work.
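Rough sketch of that idea. `extract_links()` is a made-up helper, and the pattern is deliberately simplistic (it will miss unquoted or unusual markup), but it shows the shape of it:

```php
<?php
// Pull <a href="...">...</a> pairs out of an HTML string with a regex.
// Returns an array of array('href' => ..., 'text' => ...) entries.
function extract_links($html)
{
    $links = array();
    preg_match_all('#<a\s[^>]*href=["\']([^"\']+)["\'][^>]*>(.*?)</a>#is',
                   $html, $matches, PREG_SET_ORDER);
    foreach ($matches as $m) {
        // strip_tags() cleans nested markup out of the link text.
        $links[] = array('href' => $m[1], 'text' => strip_tags($m[2]));
    }
    return $links;
}

// Demo on an inline string; for a live page you'd feed it
// file_get_contents('http://example.com/') instead.
$html = '<h1>Demo</h1><a href="page.html">A link</a> <img src="x.gif">';
foreach (extract_links($html) as $link) {
    echo $link['text'], ' -> ', $link['href'], "\n";
}
// prints: A link -> page.html
?>
```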
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098 Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia
by Chris Corbyn » Sun Jul 01, 2007 2:47 pm
Check out wget. You can recursively download whole websites with it. It's a command-line tool, though.