database query

Posted: Fri Oct 05, 2007 12:54 pm
by stender
Hi all, could anyone help with some code?
I have a list in a one-column MySQL table (around 1 million rows), e.g.
red
blue
green
orange
purple etc....

I need to submit the list to a remote server (e.g. telnet to abc.co.uk 2034), working through the whole table, but I can only do 100,000 per day or I will get banned.

I then need to store the data returned in another table.

e.g. I have a table with a column

red
blue
green
orange
purple etc.... I submit this to abc.co.uk 2345

It returns one line with about 5 fields, which I want to write to a table with each field in its own column.

Would anyone be so kind as to tell a complete newb how to go about this?

thanks in advance

Posted: Fri Oct 05, 2007 1:06 pm
by feyd
I, at least, will need some more explanation.

Posted: Fri Oct 05, 2007 1:37 pm
by stender
Sorry!

I have a MySQL database with a table called 'cards' containing one column (1 million rows).
example of rows
red
yellow
blue
green

I wish to telnet to a company server e.g. abc.co.uk 2567 (host and port)

Then each row of the table is submitted to the server and the server returns a string with fields.

e.g. for the first row I submit red and the server replies: red,Y,Y,N,10-dec-2007
The reply fields represent things like in stock, date of expiry, etc...

Those reply fields I want to store in another table which will have the 5 fields
red,Y,Y,N,10-dec-2007

The problem also is that I can only submit 100,000 rows a day or I will get barred,
so I also need to be able to set how many of the list to process before stopping.

Hope this makes sense.
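For what it's worth, the submit-and-store loop could be sketched roughly like this in Python. This assumes the server speaks a simple line-based protocol (send one value, get one CSV line back), and it uses sqlite3 as a self-contained stand-in for your MySQL connection; the host, port, and the results column names are placeholders, not anything confirmed by the thread:

```python
import socket
import sqlite3

def lookup(sock, value):
    """Send one value and read back the single CSV reply line,
    e.g. 'red,Y,Y,N,10-dec-2007' -> ['red', 'Y', 'Y', 'N', '10-dec-2007']."""
    sock.sendall(value.encode() + b"\r\n")
    return sock.makefile().readline().strip().split(",")

def submit_batch(db, values, host, port):
    """Look up each value on the remote server and store the five
    reply fields as one row in a results table."""
    db.execute("CREATE TABLE IF NOT EXISTS results "
               "(name TEXT, flag1 TEXT, flag2 TEXT, flag3 TEXT, expiry TEXT)")
    with socket.create_connection((host, port)) as sock:
        for value in values:
            fields = lookup(sock, value)
            db.execute("INSERT INTO results VALUES (?, ?, ?, ?, ?)", fields)
    db.commit()
```

Something like submit_batch(db, todays_rows, "abc.co.uk", 2567) would then handle one day's slice; capping len(todays_rows) at 100,000 keeps you under the daily limit.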

Posted: Fri Oct 05, 2007 1:55 pm
by s.dot
Make use of the LIMIT clause in your query. I'd select the first 100,000 rows, then submit them. Then the second set, etc..
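The slicing might look something like this (sqlite3 used here as a stand-in for MySQL, since the LIMIT/OFFSET syntax is the same; the cards table and its name column are assumptions from the earlier posts):

```python
import sqlite3

BATCH = 100_000  # daily cap before the remote server bars you

def day_slice(db, day):
    """Return day N's slice (0-based): LIMIT sets the batch size,
    OFFSET skips the rows already submitted on earlier days."""
    return [row[0] for row in db.execute(
        "SELECT name FROM cards ORDER BY name LIMIT ? OFFSET ?",
        (BATCH, day * BATCH))]
```

The ORDER BY matters: without a stable ordering, consecutive days' slices are not guaranteed to be disjoint.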

Posted: Sat Oct 06, 2007 7:08 pm
by califdon
scottayy wrote:Make use of the LIMIT aspect of your query. I'd select the first 100,000 rows, then submit them. Then the second set, etc..
...but if the database is being updated, by the next day the data may have changed, with rows possibly deleted, etc.

There's something rather odd about doing business this way. Even with a million-row table, the resources used to answer a simple query are virtually insignificant. It would make sense to question the limitation and see whether you can get what you need in a single straightforward query. Assuming that you are authorized to access this data...