I've talked to a few people in various IRC channels, and the responses range from 'updating that much data in one go is the issue' to 'it sounds like what you're trying to do is a UX nightmare' to 'you should try making a temp table or a non-MySQL message queue'. While I do think updating this much data in one go could give me problems, I'm unsure how it's a UX nightmare. A UX nightmare would be having 10 rows times 10 zip codes and having to update each one individually, when at the end of the day synopsis, temp_f, and the other fields are going to be identical to the fields in this screenshot.
Where I stand currently is... not too far into this. I'm pulling all the zipcodes I need to update into an array and array_slice'ing that down to these 10 rows (again, each zipcode contains 10 rows in the database). The only two things I believe I need to figure out to do it the way I'm looking for are: a) taking the form's submitted fields and pushing the "synopsis" and "temp_f" data from the 10 submitted rows onto the other zipcodes in the array, and b) iterating over THAT $_POSTed array to do the insert statement. I actually don't think that's possible without 3 page loads: the original form page, then the $_POSTed serialized array (which would then need the affected zipcodes to know what to push into them), and then another serialized $_POSTed array to do the actual foreach looping of the INSERT ... ON DUPLICATE KEY UPDATE, no?
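For what it's worth, I think steps a) and b) can happen in a single POST handler rather than three page loads: expand the 10 submitted rows to one copy per zipcode in memory, then loop the upsert. A rough sketch of that idea (the table name `forecast` and the column names `zipcode`, `day`, `synopsis`, `temp_f` are my assumptions, not necessarily the real schema):

```php
<?php
// Expand the 10 submitted rows so there is one copy per target zipcode.
// $submittedRows would come from $_POST; $zipcodes is the sliced array.
function replicateRows(array $submittedRows, array $zipcodes): array
{
    $all = [];
    foreach ($zipcodes as $zip) {
        foreach ($submittedRows as $row) {
            $row['zipcode'] = $zip;   // same synopsis/temp_f, different zip
            $all[] = $row;
        }
    }
    return $all;
}

// One pass over the expanded rows with a prepared statement --
// no extra page loads or intermediate serialized arrays needed.
// $pdo is an already-open PDO connection; "forecast" is hypothetical.
function upsertRows(PDO $pdo, array $rows): void
{
    $stmt = $pdo->prepare(
        'INSERT INTO forecast (zipcode, day, synopsis, temp_f)
         VALUES (:zipcode, :day, :synopsis, :temp_f)
         ON DUPLICATE KEY UPDATE
           synopsis = VALUES(synopsis),
           temp_f   = VALUES(temp_f)'
    );
    foreach ($rows as $row) {
        $stmt->execute([
            ':zipcode'  => $row['zipcode'],
            ':day'      => $row['day'],
            ':synopsis' => $row['synopsis'],
            ':temp_f'   => $row['temp_f'],
        ]);
    }
}
```

The key point is that the fan-out to the other zipcodes happens server-side in the one POST, so the browser never needs to see or resubmit the expanded array.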
So ultimately, is what I'm looking to do (hopefully my second paragraph got the idea across correctly) "bad practice" or "a nightmare" from a coding perspective? I can definitely get behind the latter, but I'm unsure about the former. Keep in mind that this could potentially be updating 10 rows x 50-100 zipcodes at one time... so I'm really looking for advice/possible solutions.
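On the scale question: 10 rows x 50-100 zipcodes is only 500-1,000 rows, which MySQL handles fine if you batch the upserts instead of firing one statement per row. A possible sketch, chunking the rows into multi-row statements (again assuming the hypothetical `forecast` table and column names from above):

```php
<?php
// Build the SQL for one multi-row INSERT ... ON DUPLICATE KEY UPDATE
// covering $rowCount rows, with positional placeholders.
function buildUpsertSql(int $rowCount): string
{
    $placeholders = implode(', ', array_fill(0, $rowCount, '(?, ?, ?, ?)'));
    return 'INSERT INTO forecast (zipcode, day, synopsis, temp_f) '
         . 'VALUES ' . $placeholders
         . ' ON DUPLICATE KEY UPDATE'
         . ' synopsis = VALUES(synopsis), temp_f = VALUES(temp_f)';
}

// Send the expanded rows in chunks, so ~1,000 rows becomes a handful of
// round trips to MySQL instead of 1,000 single-row statements.
function upsertChunked(PDO $pdo, array $rows, int $chunkSize = 200): void
{
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $params = [];
        foreach ($chunk as $row) {
            array_push($params, $row['zipcode'], $row['day'],
                       $row['synopsis'], $row['temp_f']);
        }
        $pdo->prepare(buildUpsertSql(count($chunk)))->execute($params);
    }
}
```

At this size I don't think you need a temp table or a message queue; those suggestions start to make sense when you're into tens of thousands of rows or the writes have to happen outside the request/response cycle.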
