$_GET sometimes not working; incorrect url encode method ?

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

Post Reply
superdez
Forum Commoner
Posts: 33
Joined: Tue Jul 03, 2007 1:36 pm

$_GET sometimes not working; incorrect url encode method ?

Post by superdez »

1.
Sometimes clicking the link below passes the information to pageb, but sometimes it does not. The link holds static information which is passed to pageb. I can't find a reason why clicking the link works most of the time, yet at odd times it takes you to pageb correctly but the information that should have been passed arrives empty. I click it again and it works... This happens in both IE7 and Firefox 2. Can anybody tell me why? (I can't use POST, only GET through a link. I am also using XHTML and UTF-8.)

2.
Can anybody tell me the correct way to URL-encode?
Which one should I use: urlencode() or rawurlencode()? What is the standard? W3C recommends ";" as a parameter separator, but it seems most browsers don't use it; W3C also says to write "&" as "&amp;" instead. And what about spaces ("+" or "%20")? Hyphens do not need to be encoded...
What is the maximum number of characters I can put in a URL?
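For reference, this is what each function produces for the name used in the link below (PHP 5 assumed):

```php
<?php
// urlencode() produces application/x-www-form-urlencoded style: spaces become "+".
// rawurlencode() produces RFC 3986 style: spaces become "%20".
// Hyphens are left alone by both.
$name = "John Doe - Kansas";

print urlencode($name) . "\n";    // John+Doe+-+Kansas
print rawurlencode($name) . "\n"; // John%20Doe%20-%20Kansas
```

Either form decodes the same way on pageb, since PHP decodes query-string values automatically before filling $_GET; the visible difference is only how spaces appear in the URL.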


pagea: send information to pageb using link/GET:

Code: Select all

<a href="../folder/pageb.php?function=pass&amp;accesscode=<?php print dechex(crc32(md5("John Doe - Kansas" . "50" . "suburb" . "toyota"))); ?>&amp;name=<?php print urlencode("John Doe - Kansas"); ?>&amp;age=50&amp;location=suburb&amp;car=toyota">Submit</a>

pageb: receives information passed through by above link. The following is a hash check, if correct it will allow page to load.

Code: Select all

if ($_GET["function"] == "pass") {
    $expected = dechex(crc32(md5($_GET["name"] . $_GET["age"] . $_GET["location"] . $_GET["car"])));
    if ($_GET["accesscode"] != $expected) {
        exit;
    }
}
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Why is this information being passed in the first place? What's wrong with using session data?
superdez
Forum Commoner
Posts: 33
Joined: Tue Jul 03, 2007 1:36 pm

Post by superdez »

A session is created after clicking on the link (when pageb is accessed) which stores all the values in a cookie. I can't create a session on pagea.
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Why can't a session be created on page A? Storing all those values in a cookie is not a session, it's a cookie. Two different things.
User avatar
s.dot
Tranquility In Moderation
Posts: 5001
Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana

Post by s.dot »

This seems like a really awkward validation method. 8O

Why not go with a standard form to post values, and set them in the session on submission of the form?
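A rough sketch of that idea (the function name and redirect target here are just illustrative, not anything from this thread):

```php
<?php
// Sketch: copy submitted form fields into an array suitable for the
// session, instead of passing them through the URL. The field names
// mirror the link in the first post; everything else is an assumption.
function store_submission(array $post) {
    $keys   = array("name", "age", "location", "car");
    $stored = array();
    foreach ($keys as $key) {
        if (isset($post[$key])) {
            $stored[$key] = $post[$key];
        }
    }
    return $stored;
}

// On submission (pagea), after session_start():
//   $_SESSION["submission"] = store_submission($_POST);
//   header("Location: pageb.php");
//   exit;
```

pageb then reads the values out of $_SESSION after its own session_start(), and nothing sensitive ever appears in the URL.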
Set Search Time - A google chrome extension. When you search only results from the past year (or set time period) are displayed. Helps tremendously when using new technologies to avoid outdated results.
superdez
Forum Commoner
Posts: 33
Joined: Tue Jul 03, 2007 1:36 pm

Post by superdez »

I don't find it that weird :?

Right, I meant session file, not cookie.

pagea doesn't create a session because I want it to be as SEO-friendly as possible. I just want it to pass information to pageb. Pageb is not available to spiders.

I don't want to use a form because the text on the button won't be "seen" by the web crawlers. The words in the link will be, and they are important for my site.
User avatar
s.dot
Tranquility In Moderation
Posts: 5001
Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana

Post by s.dot »

Why not put the words in h1 tags right above the form? In my experience h1 is more powerful for SEO than link text. Sessions won't affect your ranking whatsoever :)

All the crawler sees is the web site much as a human would see it.

EDIT| Why isn't pageb available to spiders? If the spider can follow a link to pageb, then it is definitely available. However, spiders cannot POST data.
superdez
Forum Commoner
Posts: 33
Joined: Tue Jul 03, 2007 1:36 pm

Post by superdez »

The page is quite cluttered already, so it wouldn't make much sense to put words on top of it, and I would have to place words on the form button anyway even though they can't be crawled.

pageb is blocked to spiders with robots.txt, as it contains information irrelevant to them.

I realize that having many of these links (which carry SEO keywords) on pagea while banning pageb to spiders sounds quite contradictory, but I think it is better in SEO terms than using a form (which can't be crawled at all).




Could anybody answer the questions in my first post? Is it possible that GET sometimes fails because too many characters are used?
User avatar
s.dot
Tranquility In Moderation
Posts: 5001
Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana

Post by s.dot »

Well, robots do not have to follow robots.txt files (and lots of them don't).

But to answer your original question, yes, there is a limit to the amount of data that can be passed through $_GET. A simple md5(), sha1(), or hash('sha256', ...) value should be short enough.
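For example, sha1() always yields 40 hex characters regardless of how long the input values grow, so a fixed-length access code never pushes the URL toward the length limit (the concatenation below just mirrors the values from the first post):

```php
<?php
// A fixed-length digest keeps the access code short no matter how long
// the underlying values are. The data mirrors the link in the first post.
$data = "John Doe - Kansas" . "50" . "suburb" . "toyota";

$code = sha1($data);        // always 40 hex characters
print strlen($code) . "\n"; // 40
```

The rest of the URL still has to fit, but at least the hash contributes a constant, predictable amount.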
Post Reply