
Stress testing brand new servers. Any tips?

Posted by papi, 11-01-2010, 09:58 AM
So we're replacing our 5-year-old Supermicro boxes with new ones. The old ones are 2x 2.8GHz Xeons (yes, the original single-core Xeons) with 74GB Raptors in RAID 1 (3ware-2LP) and 1GB of memory. Each server has 100-150 shared hosting accounts and manages just fine most of the time (unless some script or mod_rewrite rule goes berserk and spawns hundreds of PHP procs).

Anyway, the new boxes are way more powerful, but I'd still like to make sure we're getting the most out of them in the way of optimizing their performance, and also to stress test them for about a week to make sure nothing fails shortly after they're put live on the net. The new boxes are single-processor Supermicro 1RU servers with a quad-core Xeon, 12GB RAM and 4x 300GB 15K.7 SAS Cheetahs in RAID 10 (Adaptec 5405 + battery modules), so significantly more oomph despite being just a single CPU (it's usually never the CPU that bogs the server down).

Any recommended ways of stress testing them once we've loaded CentOS 5 + cPanel onto them, just to make sure nothing fails hardware-wise? What about optimizing MySQL and PHP? The latter especially is able to bring any server down with a simple dodgy mod_rewrite rule, in my experience. We'll be using suPHP, restrict its functions, and only install the PHP modules that customers requested, i.e. NOT every option available via EasyApache. Any tips on how to further optimize PHP and MySQL so as to get the most out of them, and to be ready in case some idiot runs unlimited loops spawning hundreds or thousands of PHP processes, which right now always brings the server down as it runs out of memory real quick.

Posted by FLDataTeK, 11-01-2010, 11:45 AM
You can try http://loadimpact.com/index.php. The free test will put a decent load on it, but their paid plan will do a lot more. They use a lot of servers to test with; when I ran the free test, about 40 different servers hit my site.

Posted by eth00, 11-01-2010, 01:23 PM
You may want to check out CloudLinux; it can do wonders on shared hosting servers by making sure that a single user does not take the box out. It is a CentOS-based distro that can be very useful for what you are describing. You can also use some resource monitoring scripts to do the same.
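
Something along these lines, run from cron, is the kind of monitoring script I mean; the file name and the 50-process threshold are just placeholders for illustration, not a specific tool:

    #!/bin/bash
    # Hypothetical per-user PHP process watchdog; adjust the threshold to taste.
    MAX_PROCS=50
    # List the owner of every process plus its command, count the php ones per user.
    ps -eo user=,comm= | awk '$2 ~ /php/ { count[$1]++ }
        END { for (u in count) print count[u], u }' |
    while read procs user; do
        if [ "$procs" -gt "$MAX_PROCS" ]; then
            echo "$user is running $procs PHP processes" \
                | mail -s "PHP process alert on $(hostname)" root
        fi
    done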

Posted by papi, 11-01-2010, 09:20 PM
CloudLinux looks interesting, but it's a bit too soon to be swapping OSes; I'll be sticking with CentOS. Which scripts for monitoring would you recommend? Especially something to monitor the total memory used, number of processes and so on. I've tried using the built-in /etc/security/limits.conf (PAM) on one server that was having problems with one particular web site (too many PHP processes draining all memory), but it doesn't appear to do anything. E.g. "username nproc hard 50" is meant to limit the number of processes that this particular user can have running to 50, but it didn't work. Yes, I've made sure the processes in question (/usr/bin/php) were in fact compiled against PAM, so it seems as if limits.conf has no effect whatsoever. And thanks jeremly, I'll give that Load Impact test a try.
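
For reference, a limits.conf entry in the stock pam_limits format (fields are domain, type, item, value) would look roughly like this; the user name and the numbers are just placeholders:

    # /etc/security/limits.conf -- fields are <domain> <type> <item> <value>
    # cap the account at 50 processes and ~256MB of address space per process (value in KB)
    someuser    hard    nproc    50
    someuser    hard    as       262144

Keep in mind that pam_limits only applies these limits when something opens a PAM session for the user (login, sshd, cron and so on), so whether Apache-spawned suPHP processes ever pick them up depends on whether anything in that chain actually goes through PAM.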

Posted by drspliff, 11-02-2010, 12:03 AM
RLIMIT_NPROC works well, it's just slightly different from what I imagine you want: it controls the number of child processes of a specific process rather than acting at a global level. Say you have suPHP with PAM set up: Apache will be able to spawn as many suPHP processes as it needs, and child processes of suPHP will then be restricted by your limits.conf settings. Personally I'd go with a php-fpm pool per user, with a dynamic pool and a reasonable maximum number of responders, preferably in their own cgroup containers with memory/swap and CPU limits... but that's not exactly your average cPanel plug-and-play setup.
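
As a rough sketch of what such a per-user pool looks like (the account name, socket path and limits below are placeholder assumptions, not anything specific to your boxes):

    ; hypothetical /etc/php-fpm.d/someuser.conf -- one pool per account
    [someuser]
    user = someuser
    group = someuser
    listen = /var/run/php-fpm/someuser.sock

    ; dynamic pool with a hard cap on concurrent PHP workers for this account
    pm = dynamic
    pm.max_children = 10
    pm.start_servers = 2
    pm.min_spare_servers = 1
    pm.max_spare_servers = 3

    ; belt-and-braces per-request memory cap
    php_admin_value[memory_limit] = 128M

Putting each pool's workers into their own cgroup with the memory and cpu controllers then caps the whole account, but as I said, that isn't something cPanel will set up for you.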

Posted by quad3datwork, 11-02-2010, 02:07 AM
Well, from what I read in articles... you can probably find a way to rent some botnet by the hour to stress test your new server. Just a thought!

Posted by papi, 11-02-2010, 10:22 AM
From what I've seen, cPanel's suPHP calls the binary for every PHP request, i.e. they're not child processes but independent processes for every single request. And I don't understand how limits.conf isn't able to restrict them; can anyone shed any light on this? If anyone can recommend any other tips for optimizing PHP and MySQL for performance, I'm all ears (but I won't do anything at the expense of security, e.g. getting rid of suPHP).

Posted by Thanh Tran, 11-02-2010, 11:30 AM
http://weather.ou.edu/~apw/projects/stress/ is a good utility to put your hardware under various tests. Very simple, but very efficient as well.
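
For example, a run that keeps all four cores, some memory and the disks busy could look like this (the worker counts, sizes and duration are just illustrative, tune them to the box):

    # four CPU workers, four 1GB memory workers, two sync workers and two
    # disk-write workers, for one hour
    stress --cpu 4 --vm 4 --vm-bytes 1024M --io 2 --hdd 2 --timeout 3600s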

Posted by Activeroot, 11-03-2010, 04:59 PM
For hard drive testing, bonnie++ is great: http://www.coker.com.au/bonnie++/
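
A typical invocation is something like the following (the test directory and user here are placeholders; by default bonnie++ sizes its test data to defeat the page cache):

    # run against a directory on the array you want to burn in; -d is the test
    # directory and -u the unprivileged user to run as
    bonnie++ -d /home/bonnie-test -u nobody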

Posted by guru4hosting, 11-03-2010, 09:42 PM
Great tools!


