For our e-commerce store, we have a drop shipper that provides near real-time stock data for every item. We have created a script that downloads their 13MB XML feed and updates the stock for all 10,000 products every 4 hours. The problem is that the script issues 10,000 UPDATE commands as it loops through the products, which brings the MySQL server to a halt.
Although there are fewer queries, it appears that the multi-table update still uses up a lot of MySQL resources and brings the server to a halt. The only real solution seems to be updating in small increments throughout the day instead of all 10k at once.
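For the incremental route, here is a minimal sketch. It assumes the parsed feed has already been loaded into a staging table, called stock_feed(sku, qty) here, and that products has an integer primary key id; those names are placeholders, not your real schema:

```sql
-- Assumes a staging table stock_feed(sku, qty) already holds the parsed feed,
-- and that products(id, sku, stock) has an integer primary key `id`.
-- The script runs this once per range and pauses between batches, so the
-- server never has to absorb all 10k row changes in one go.
UPDATE products p
JOIN stock_feed f ON f.sku = p.sku
SET p.stock = f.qty
WHERE p.id BETWEEN 1 AND 1000;   -- next batch: 1001-2000, and so on
```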
A 13MB XML file should boil down to something like 1.3MB of data that actually needs updating.
One update writing 1.3MB will not slow your server down much, unless your indexes are not being used or are in bad shape.
Just run EXPLAIN on your update queries and see whether your indexes are being used or not.
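For example, something along these lines shows whether the lookup column is indexed (table and column names below are placeholders, not your schema):

```sql
-- MySQL 5.6+ can EXPLAIN an UPDATE directly; on older versions,
-- EXPLAIN the equivalent SELECT with the same WHERE clause.
EXPLAIN UPDATE products SET stock = 42 WHERE sku = 'ABC-123';
EXPLAIN SELECT * FROM products WHERE sku = 'ABC-123';

-- If the plan shows type: ALL (a full table scan), every one of the 10k
-- updates is scanning the whole table; an index on the lookup column fixes that.
ALTER TABLE products ADD INDEX idx_sku (sku);
```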
Then, if that still slows down your site more than you want, you can always do one update per second, so your 10k records will take roughly 2.8 hours (10,000 seconds) to finish updating.
But you usually only do that when networking is the bottleneck, and from your description, your bottleneck is the CPU. In which case, it will be a lot faster to do one update instead of 10k, since your indexes will only have to be updated once (depending on what you're updating).
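A sketch of that single-statement approach, assuming the feed can be loaded into a staging table (the table, column names, and the <item> row element are guesses about your feed, not its real layout):

```sql
-- Staging table for the parsed feed (schema is an assumption).
CREATE TABLE stock_feed (
    sku VARCHAR(32) PRIMARY KEY,
    qty INT NOT NULL
);

-- LOAD XML (MySQL 5.5+) maps child elements/attributes to columns by name;
-- adjust ROWS IDENTIFIED BY to whatever element wraps one product.
-- On older versions, bulk-insert from the script instead.
LOAD XML LOCAL INFILE '/tmp/feed.xml'
INTO TABLE stock_feed
ROWS IDENTIFIED BY '<item>';

-- One multi-table UPDATE instead of 10,000 single-row statements.
UPDATE products p
JOIN stock_feed f ON f.sku = p.sku
SET p.stock = f.qty;
```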