Preventing Failed Database Connections
Posted: Tue May 19, 2009 1:21 pm
I have a problem with random failed connections to a MySQL database located on a separate server. Any suggestions on how I can prevent these failed connections, or how best to handle them when they happen, would be appreciated.

My program is order-fulfillment software written in PHP, JavaScript (Dojo), and MySQL. The program sits on the company server and downloads orders collected on a remote server that runs the Web site.

Each step of the program is called via AJAX. The response is sent back and reported to the user, so the user can see how the download is progressing. This means that each time the database needs to be read from or written to, a new connection is made and then destroyed.
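The per-step pattern described above might look roughly like this (a minimal sketch only; the host, credentials, and the `$connect` injection seam are placeholders I made up, not taken from the actual program):

```php
<?php
// One AJAX step, roughly as described above: open a connection, do this
// step's work, close the connection, and report progress back to the browser.
// $connect is a hypothetical seam so the connection logic can be swapped out;
// by default it attempts a real mysqli connection with placeholder credentials.
function run_step($stepNumber, $connect = null)
{
    if ($connect === null) {
        $connect = function () {
            // Placeholder host/credentials -- substitute the real ones.
            return @mysqli_connect('db.example.com', 'user', 'pass', 'orders');
        };
    }

    $link = $connect();
    if ($link === false) {
        // A failed connection is reported back instead of throwing.
        return array('ok' => false, 'step' => $stepNumber, 'error' => 'connect failed');
    }

    // ... read or write one order via $link here ...

    if ($link instanceof mysqli) {
        mysqli_close($link);
    }
    return array('ok' => true, 'step' => $stepNumber);
}

// The AJAX endpoint would then emit something like:
// echo json_encode(run_step((int)$_GET['step']));
```

The down side of this shape, of course, is exactly the symptom described: one fresh connection per step, so a large batch hammers the remote server with connect/disconnect cycles.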

So if the user performs a download and there are 20 orders sitting on the server waiting to be downloaded, 41 connections will be made and destroyed. Small batches (fewer than 20 orders) do not usually experience any failed connections, but batches over 20 orders usually see 1 or 2. On average this program will need to handle batches of 100 to 200 orders at a time. One of the main problems with a failed connection is that it slows down the download process considerably. I also do not want to stress the remote server too much, because that server runs the Web site and I do not want to cause any downtime.
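For reference, the kind of "deal with it" approach I am imagining is retrying a failed connection a couple of times with a short back-off before reporting it as failed. A sketch (the attempt count and delays are arbitrary guesses, not tuned values):

```php
<?php
// Retry a connection attempt a few times with growing delays before giving up.
// $connect is any callable that returns a connection handle on success or
// false on failure (e.g. a thin wrapper around mysqli_connect).
function connect_with_retry($connect, $maxAttempts = 3, $baseDelayUs = 250000)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $link = $connect();
        if ($link !== false) {
            return $link;                    // connected on this attempt
        }
        if ($attempt < $maxAttempts) {
            usleep($baseDelayUs * $attempt); // back off: 0.25s, 0.5s, ...
        }
    }
    return false;                            // every attempt failed
}
```

The back-off also gives the remote server a moment to breathe between attempts, which matters to me since that machine serves the Web site.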
Does anyone have any ideas on a better way of going about this program?
Thank you,
Paul
