How did I reduce the CPU overhead caused by MySQL?

Since yesterday, we had been having problems with a project that was shut down in the middle of heavy traffic. As the technical manager, I was the person who had to take charge of the project and bring it back on track. After running a few benchmark tests, I found that the MySQL access from a PHP file was creating the CPU overhead on the server.

The problem of CPU overhead with MySQL

Let me explain the problem clearly first.

We were working on an auction-style website in which the most recent data had to be fetched from the MySQL database every second. The PHP page, which was being called from Ajax every second, ran around 5 SQL queries to retrieve data from the server.

First of all, I optimized that page by reducing the 5 queries to a single query using left outer joins across 3 tables. Then I ran a benchmark with the Apache benchmark tool (ab): 50 requests with a concurrency of 50, using the following command.

ab -n 50 -c 50 <url-of-the-ajax-page>

Then, in another SSH shell, I ran the top command to watch the CPU usage in parallel.

I was horrified to see that, despite the optimized query, the CPU usage by mysqld went out of control (100%) under 50 concurrent users. The single query still used several joins, and those tables held a lot of data, so the database overhead remained high even though it was only one query.
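For illustration, collapsing several per-table lookups into one query looks something like this. This is only a sketch: the table and column names (`items`, `bids`, `users`, and so on) are invented, not the ones from the project.

```sql
-- One round trip instead of three separate SELECTs (hypothetical schema).
SELECT i.item_id, i.title, b.amount AS max_bid, u.username
FROM items i
LEFT OUTER JOIN bids  b ON b.item_id = i.item_id
LEFT OUTER JOIN users u ON u.user_id = b.user_id
WHERE i.auction_open = 1;
```

Fewer round trips help, but as noted above, the join itself can still be expensive when the tables are large.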

How did I drastically reduce the CPU overload caused by MySQL?

Now, the first challenge was to reduce the database access. It was clear that the overhead was caused by concurrent database hits from the PHP page being called every second. Here are the simple steps I took to reduce the database overhead.

  1. I created another PHP file, moved the MySQL query causing the CPU overhead into it, and called it from a cron job.
  2. I created a temporary table to store the output produced by the PHP page called from the cron job.
  3. Then I scheduled the cron job to run every second. Don’t tell me that a cron job can’t run more often than once a minute; take a look at this post before saying so. On each call, the output data was stored in the temporary table.
  4. Finally, the Ajax call requested a new PHP page that only read the data from the single row in the temporary table.
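The steps above can be sketched roughly as follows. This is a minimal sketch, not the original code: the table name (`latest_data`), the column names, and the connection details are all assumptions.

```php
<?php
// worker.php -- invoked by cron. Cron itself fires at most once a minute,
// so the sub-minute trick is to loop inside the script with sleep(1).
$db = new mysqli('localhost', 'user', 'pass', 'auction');

for ($i = 0; $i < 60; $i++) {
    // The single heavy query with the joins goes here (placeholder below).
    $row = $db->query('SELECT /* heavy 3-table LEFT JOIN */ NOW() AS payload')
              ->fetch_assoc();

    // Overwrite the one cached row in the temporary table.
    $stmt = $db->prepare(
        'REPLACE INTO latest_data (id, payload, updated_at) VALUES (1, ?, NOW())');
    $stmt->bind_param('s', $row['payload']);
    $stmt->execute();

    sleep(1); // roughly once per second for the rest of the minute
}
```

The Ajax-facing page then only has to run `SELECT payload FROM latest_data WHERE id = 1` and echo the result: a single-row read instead of the multi-join query.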

I ran the same benchmark again against the newly made page (which accesses data only from the temporary table) and saw that the CPU usage after this change was reduced drastically.

I know I could further reduce the CPU overhead by storing the temporary data in a text file. You guys know that I’m lazy, so that part is left for another day.

40 thoughts on “How did I reduce the CPU overhead caused by MySQL?”

  1. Thanks for the tip.
    Included for my future reference

  2. Roshan! That sounds like a good idea, but keep in mind: if your cron job takes more than one second to execute, what will happen?

  3. @shafiq – Thanks, man, you’re right about this, but I had to use this dirty little technique to reduce the overhead. Since I was taking the remaining time from the temporary table, there is a margin of error of + or – 1 second. Furthermore, I’m thinking of using another intermediate PHP file in which I sleep for 1 second, rather than doing it from a bash script; that way I only have to run the cron job once a minute…

  4. Thanks for the tip! It’s very useful; my system runs better now…
    But there must be more ways to reduce the CPU overhead problem…

  5. Or you could use a view.

  6. Kris Brixon

    An option:
    Use the temporary tables, but populate them from triggers instead of a cron job. Set up a trigger to run the SQL when a relevant field is updated. There is no reason to keep rerunning a SQL statement that returns the same data. If the data on average only changes every 2 seconds, then you will reduce the load by half.
    Note, I come from an Oracle world and not MySQL, but from the link above it looks like you have this ability.

    I did not stay on the site long, but it looks like you are just updating prices, if so, make sure the SQL that runs the most frequently is just returning the prices and not everything about the items.
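For reference, the trigger idea described above could be sketched in MySQL (5.0+) like this; the table and column names (`bids`, `latest_data`, `amount`) are hypothetical:

```sql
-- Refresh the cached row only when a bid actually changes,
-- instead of recomputing it every second from cron.
CREATE TRIGGER refresh_latest_bid
AFTER INSERT ON bids
FOR EACH ROW
  REPLACE INTO latest_data (id, payload, updated_at)
  VALUES (1, NEW.amount, NOW());
```

Note that a trigger may not modify the table it is defined on, but writing to a separate cache table, as here, is allowed.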

  7. @Kris – Thanks for the idea of using a trigger; I may have to try that one as well. But the issue is with the most frequently changing data: users update the highest bid every time they bid, so you never know when it’s going to change, and I have to get data from the server every second; in the last few seconds of an auction it’s most necessary……

  8. Anonymous

    @Loïc Hoguin: No, you can’t. A view is not a table. The performance of a view isn’t different from that of the underlying query.

    @Roshan: Another suggestion to further reduce load: don’t poll the DB. It’s much more efficient to push changes as they occur, without any database round trips.

  9. john
  10. @Anon: I guess you gave me yet another reason to love MySQL. (Not that I use it, unless I’m required to.) Thanks for the tip.

  11. Roshan very very good thx…

  12. Clint Rowden

    What about using prepared statements? I’ve read this reduces overhead, though I’m not experienced in using them.

  13. Why not use memcached? If the data changes so fast, why bother saving it to the DB every second? Use memcached to generate the page and save the data in the DB as a “backup”.

  14. Lee

    Oh wow, that was really helpful. I’m working on a project with a similar concept. Basically, we have a few “broadcasted” events for the page that also need a more than temporary storage system. The problem came after having a few people come on to test the feature. Unbelievably, our CPU usage went off the chart. I’ve been searching for a solution to this for days now. This post has been a real help.

  15. IMF

    I agree with Bboing: memcached is recommended to solve this overload problem.

  16. thanks for the post roshan

  17. Thanks for sharing, this is really useful and something you would not expect.

  18. hi good thank you master

  19. Thank you for these great tips.

  20. SD

    I haven’t implemented it the way you have said, but I am sure that the cron job will also overload your CPU. This is not the exact answer I was expecting, which has driven me to read more on this topic of reducing CPU overhead.
    Sorry, but I thought that you would be describing database features or MySQL fundamentals,
    like query execution time, normalization, and so on……
    One last question:
    you haven’t mentioned the SQL query execution time. Why?

  21. Dinesh

    The techniques you have described are not the optimum way to reduce the database overhead problem, I think. Running a cron job and accessing a file via the Ajax process seems to be the same way of retrieving data. The server CPU overhead problem cannot be reduced until the database itself is managed well: server clustering, normalizing tables (4NF, 5NF), and tuning complex join queries.
    On the other hand, the important parts are between server and client, bridging the gap between them in an optimum way.

  22. wish

    Maybe I am wrong (because you are the genius), BUT:
    this is a very disappointing answer to Kris Brixon’s question.
    You definitely know the difference between a trigger and a cron job.
    I agree 100% with Kris Brixon’s answer, because a trigger executes on an event, whenever there is some change in the table. So running the cron job without any change in the table data (say, with no new bids, in your case) is wasted work.

  23. @SD – I agree that this technique definitely reduced the CPU overhead compared to before. But later I found a much better solution for this….

    Make a temporary table for accepting the bids….
    Set up the indexes properly…
    Use memcached for caching data (caching for one second is enough)…
    Flush the data from the temporary table and move it to the real table at a particular interval….
    Use the InnoDB engine if you have frequent insert queries and have to use transactions…
    And of course, use MySQL Cluster if you really need it…
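The one-second memcached cache mentioned above might be sketched like this; the server address, key name, and query are assumptions, not the site's actual code:

```php
<?php
// Read-through cache: serve from memcached, fall back to MySQL at most
// once per second no matter how many clients are polling.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$payload = $mc->get('latest_bid');
if ($payload === false) {                 // cache miss or expired
    $db = new mysqli('localhost', 'user', 'pass', 'auction');
    $payload = $db->query('SELECT /* heavy query */ NOW()')->fetch_row()[0];
    $mc->set('latest_bid', $payload, 1);  // expire after 1 second
}
echo $payload;
```

With a one-second expiry, at most one request per second pays the cost of the heavy query; every other concurrent request is served from RAM.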

  24. Mitul

    Instead of using a cron job, you can create a view for it, and the data will be populated based on the request. There are several other things you can do to optimize the result:
    – Create indexes (single or combined) based on your requirements.
    – Access only those columns which are required.

    Also, another option is to create a memory table, which delivers results faster than any other.

    You can drop me an email if you need more help.

    Mitul Patel
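The memory table suggested above corresponds to MySQL's MEMORY storage engine; a hypothetical example (the table and column names are invented):

```sql
-- Heap-backed table: reads and writes stay in RAM.
-- Contents are lost on server restart, and the MEMORY engine does not
-- support TEXT/BLOB columns, hence the VARCHAR payload.
CREATE TABLE latest_data_mem (
  id INT NOT NULL PRIMARY KEY,
  payload VARCHAR(255) NOT NULL
) ENGINE=MEMORY;
```

This fits transient, frequently rewritten data like a one-row cache, but not anything that must survive a restart.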

  25. What if the database server gets busier? You’ll end up in a much worse situation…

    There are *so many* better ways to do this!

    First, let’s question the assumption that in your scenario the content really needs to be updated up to the second. What about every 2, 5, 10, or 60 seconds? Just a thought; it is possible that you do need it up to the second.

    What about the MySQL cache? Is it turned on?

    What about persistent connections? Are they turned on?

    You can write the HTML output cache to a file, either part of the page or the whole thing. It will definitely give you a huge benefit over database queries as well as PHP processing.

    In terms of architecture, you can probably display cached output (again, you choose the update period), for example, do guests really need up-to-the-second information?

    From output cache to db data cache, you can store the data in a file or possibly SQLite.

    You should also consider Memcache if you have a really high number of requests. It would let your RAM do the work instead of your hard drive (no reason to explain why RAM is faster, right?).

    In my experience, cached output, Memcache, and MySQL’s internal cache have always worked, but in your case, I would combine the info above with some JavaScript code to calculate the time they have left.

    Just my thoughts…
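The file-based output cache suggested above can be sketched in a few lines of PHP. The cache path, one-second lifetime, and helper name are assumptions for illustration:

```php
<?php
// Serve the last rendered output if it is fresh enough; otherwise rebuild
// it and rewrite the cache file atomically via a temp file + rename.
function cached_output(string $cacheFile, int $maxAgeSeconds, callable $rebuild): string
{
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAgeSeconds) {
        return file_get_contents($cacheFile);   // fresh: no DB work at all
    }
    $html = $rebuild();                         // the expensive part (DB queries etc.)
    $tmp = $cacheFile . '.tmp';
    file_put_contents($tmp, $html);
    rename($tmp, $cacheFile);                   // atomic swap on POSIX filesystems
    return $html;
}

// Usage: the callback would normally run the heavy MySQL query and render HTML.
echo cached_output('/tmp/latest.html', 1, function () {
    return 'latest data rendered at ' . date('H:i:s');
});
```

With a one-second lifetime, concurrent pollers read a flat file and only one request per second touches PHP rendering and the database.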

  26. Somebody

    Are you the creator behind

  27. thanks for sharing :)

  28. That seems a bit complicated to me. I mean, in half a year you will not want this app to depend on some cron job.

    Would caching not make sense here? Maybe APC or memcached?


  29. Wow, great tips, but I can’t do as you are saying. I have the same problem: CPU over the limit due to many users uploading and playing videos at the same time. Our server halted, and some of our previous hosts refused to host our site. Is there any solution that I can take from you? Please reply.

  30. I think that after uploading, the video should take some time to become active, or use a cron job to convert it to FLV. Use one server for the web and one for the database. Hardware scaling is the only thing you can do at the moment…

  31. Hello, very good article. I will continue to follow this article, as it is written in a very interesting way and contains very good information. Best regards :-)

  32. The article seems to be very interesting for students. We would be delighted if our students could pursue this article.

  33. Thanks for sharing. Very useful.

  34. Really nice and useful comments. Informative article. Thanks for sharing.

  35. It’s an old thread but always new!

  36. Hi, I find the website very informative and helpful. I will recommend this site to people. The site is written in a very interesting way.

  37. Thanks for these tips. I currently have a similar problem with my website, so I will check your input and see if it works for my issue.

  38. ashish singh

    Hi, I would like to know the best PHP architecture for managing an auction site that has a large amount of data.

  39. Hi Roshan, very good job, keep it up!

  40. The article seems to be very interesting. We would be pleased if students could pursue this article. Thanks
