Knowledgebase
Best way to make multiple web requests?
Posted by makkesk8, 01-12-2013, 10:10 PM | Hi!
I'm really unsure what method I should use to download a string of text from a website, filter it, and insert it into a MySQL database. Later on, another server will access the DB and download everything.
I need to do this every minute, in a loop that fetches the same site multiple times. We're talking about 100-500 requests every minute.
I've done this with PHP for now, but I've split the work across multiple servers; in total it's running about ~100 requests every minute to that website across 3 servers (~30 on each server).
Is this really the best way to do it? I'll be getting a few low-end VMs to host this on very shortly, and I really need to know if this is the best approach in terms of performance.
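For illustration, a minimal sketch of that fetch → filter → insert loop (in Python rather than PHP; fetch(), filter_text(), the URLs, and the table layout are made-up placeholders, and sqlite3 stands in for MySQL so the sketch runs on its own):

```python
import sqlite3

# Hypothetical stand-in for the HTTP GET of the target page.
def fetch(url):
    return f"raw page text from {url}"

# Placeholder for whatever filtering the real script applies to the text.
def filter_text(text):
    return text.upper()

# In-memory SQLite as a stand-in for the MySQL database the other
# server would later read from.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (url TEXT, body TEXT)")

urls = [f"http://example.com/item/{i}" for i in range(3)]
for url in urls:
    body = filter_text(fetch(url))
    db.execute("INSERT INTO results (url, body) VALUES (?, ?)", (url, body))
db.commit()

row_count = db.execute("SELECT COUNT(*) FROM results").fetchone()[0]
```

The real version would run this once a minute (cron or a sleep loop) with a MySQL driver in place of sqlite3.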
Thanks
|
Posted by zsuatt, 01-13-2013, 06:28 AM | Ok, let me get this straight. You would like to load the content of the same site at 100-500 requests/min, and right now you're doing serial requests, right?
One thing I would look into is whether you could parallelize the requests (multi-threading maybe?), since the majority of the time spent on a new request goes into establishing the connection.
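Something like this (a Python sketch of the idea; fetch() is a made-up placeholder for the actual HTTP GET, so the example runs without hitting a real site):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical fetch function standing in for the real HTTP GET;
# in the real script this would download the page text.
def fetch(url):
    return f"body of {url}"

urls = [f"http://example.com/page?id={i}" for i in range(100)]

# Issue the requests from a pool of worker threads instead of one
# after another, so the per-request connection latency overlaps
# rather than adds up.
with ThreadPoolExecutor(max_workers=10) as pool:
    bodies = list(pool.map(fetch, urls))
```

With ~10 workers, 100 requests cost roughly the wall-clock time of 10 serial ones, assuming the remote site can handle the concurrency.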
Another thing you can try is Keep-Alive. This would allow you to do multiple requests over the same connection. Some servers don't like this (depending on their settings), because you are reserving a slot on the server for a long-ish time.
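In Python terms the pattern looks like this (a sketch only: it spins up a tiny local HTTP/1.1 server as a stand-in for the remote site, so it runs self-contained; HTTP/1.1 keeps the connection alive by default):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Tiny local server standing in for the remote site.
class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # HTTP/1.1 defaults to keep-alive

    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One persistent connection; both GETs travel over the same TCP
# socket, skipping connection setup on every request after the first.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
responses = []
for _ in range(2):
    conn.request("GET", "/")
    resp = conn.getresponse()
    responses.append(resp.read())  # must read fully before reusing
conn.close()
server.shutdown()
```

The win is skipping the TCP (and TLS, if any) handshake per request; if the bottleneck is elsewhere, as the follow-up posts suggest, it won't show up in reqs/min.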
|
Posted by makkesk8, 01-13-2013, 01:28 PM | Luckily, I have the option to use keep-alive.
|
Posted by zsuatt, 01-14-2013, 04:28 AM | Did that improve the reqs/min a bit?
|
Posted by makkesk8, 01-14-2013, 08:10 AM | No difference at all actually :E
|