var date1 = document.adder.dateStart.value;
var date0 = new Date(date1);
date0.setDate(date0.getDate() - 1); // back up one day
var month0 = date0.getMonth() + 1;  // getMonth() is 0-based
alert(month0);
var day0 = date0.getDate();
alert(day0);
var year0 = date0.getYear();
alert(year0);
I added the alerts to track what was going on and found that Safari returns 105 for the year, whereas Firefox (and apparently all PC browsers) returns 5. I'm 100% certain Safari didn't do this when I first wrote this script about a year ago. Does anyone have a clue what's going on here?
Thanks in advance!
Last edited by charp on Wed Dec 21, 2005 7:31 pm, edited 1 time in total.
Well, I finally solved my own problem. Since no one replied to my post, I'm not sure if my problem stumped everyone or if my question was so lame that you all knew I'd figure it out on my own.
At any rate, here's the link where I found my answer:
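For anyone who finds this thread later, the short version: getYear() is a legacy, deprecated method. Spec-compliant engines return the year minus 1900 from it (so 105 for 2005), but engines have historically disagreed about what it returns for dates after 1999, which is why different browsers can report different values for the same date. getFullYear() returns the full four-digit year everywhere and is the portable replacement. A minimal sketch (the specific date below is just for illustration):

```javascript
// getYear() per the ECMAScript spec returns the year minus 1900,
// so 2005 shows up as 105; older engines varied for post-1999 dates.
// getFullYear() returns the full four-digit year in every engine.
var d = new Date(2005, 11, 21); // 21 Dec 2005 (months are 0-based)

var legacy = d.getYear();     // 105 in spec-compliant engines
var full = d.getFullYear();   // 2005 everywhere

console.log(legacy, full);
```

So in the script above, swapping date0.getYear() for date0.getFullYear() makes the year come out the same in every browser.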