- A client contacts the web host with an XMLHttpRequest to a PHP script.
- The script sets up a dialog with a remote serial server.
- The script sends and receives data as necessary and logs everything.
- The script closes the remote session and returns the XML package to the client.
- The client receives the data, processes it, and may or may not restart from the beginning.
- Any additional client that logs in will ask for data via the same XMLHttpRequest.
- The remote host is a serial server. It doesn't understand the concept of multiple sessions.
- The PHP script that returns the XML must be intelligent enough to know whether a session is already in progress.
- This would be blindingly easy if I could run the script perpetually. For now, let's assume I don't have that option.
- Once the request comes in, the script locks a key file.
- When the dialog with the remote serial server is finished, the script unlocks the key file and writes a timestamp to a list index file.
- Any concurrent instance of the script will try to lock the file, but fail.
- It will then read the last timestamp from the list index file.
- It will access that log, read the data, and package up an XML to send back.
- The illusion is that the script is persistently running, sending and receiving data, when in fact, its life is measured in abrupt instances.
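The lock-or-fall-back scheme above can be sketched roughly like this in PHP. The file names (`serial.lock`, `index.txt`, `log-*.txt`) and the `talkToSerialServer()` helper are illustrative placeholders, not part of the original design:

```php
<?php
// Sketch of the lock-or-fall-back scheme. File names and the
// talkToSerialServer() helper are illustrative placeholders.

function talkToSerialServer(): string {
    // Stand-in for the real dialog with the remote serial server.
    return "sensor data " . date('c');
}

$lock = fopen('serial.lock', 'c');

if (flock($lock, LOCK_EX | LOCK_NB)) {
    // We won the lock: talk to the serial server, log the result,
    // and append the timestamp to the list index file.
    $data  = talkToSerialServer();
    $stamp = time();
    file_put_contents("log-$stamp.txt", $data);
    file_put_contents('index.txt', $stamp . "\n", FILE_APPEND);
    flock($lock, LOCK_UN);
} else {
    // A session is already in progress: read the last timestamp
    // from the index and serve that log instead.
    $stamps = file('index.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $last   = end($stamps);
    $data   = file_get_contents("log-$last.txt");
}
fclose($lock);

// Package the result as XML for the XMLHttpRequest caller.
echo "<?xml version=\"1.0\"?><reading><data>" .
     htmlspecialchars($data) . "</data></reading>";
```

The non-blocking flag (`LOCK_NB`) is what makes the concurrent instance fail fast instead of queueing up behind the serial dialog.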
EDIT: The careful reader will notice that any concurrent client will be getting "slightly stale data." That's perfectly fine. I'm monitoring conditions that won't change suddenly, and I'm not worried about "slightly stale."
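Since staleness is tolerable but worth knowing about, the fallback path can also report how old the served data is and let the client decide. A minimal sketch, assuming the same `index.txt` layout as above (the 60-second tolerance is an invented example, not from the post):

```php
<?php
// Illustrative only: seed a sample index so the sketch is self-contained.
file_put_contents('index.txt', (time() - 5) . "\n");

// Read the most recent timestamp from the list index file.
$stamps = file('index.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$last   = (int) end($stamps);
$age    = time() - $last;

// The 60-second threshold is an assumed tolerance for "slightly stale."
$stale = ($age > 60) ? 'true' : 'false';
echo "<reading stale=\"$stale\" age=\"$age\"/>\n";
```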