I am beginning the testing process on Moodle 2.0 using the school's actual data. We have 675,000 assignment submissions, and one hour into the update process it has migrated fewer than 50,000 of them, so the projected time is more than 15 hours for that task alone.
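The projection is simple arithmetic; a minimal sketch of it, assuming the first hour's throughput holds for the rest of the run (the ~45,000 rows/hour figure is an assumption consistent with "fewer than 50,000 in the first hour" and a 15-hour estimate):

```python
# Rough ETA for the 1.9.x -> 2.0 assignment-submission migration.
# total_rows comes from the post; rows_per_hour is an assumed figure,
# since the post only says "fewer than 50,000" were done in hour one.
total_rows = 675_000
rows_per_hour = 45_000  # assumed sustained throughput

eta_hours = total_rows / rows_per_hour
print(f"Projected migration time: about {eta_hours:.0f} hours")
```

Of course this assumes the rate stays constant; if later batches slow down (index growth, larger attachments), the real figure would be worse.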
If the update process from 1.9.x to 2.0 takes many hours, it will get far less testing than it would if test sites could refresh their test servers daily with a complete set of actual data, allowing side-by-side comparisons during the beta process.
Perhaps the most time-consuming operations in the update scripts could be documented; a small set of SQL scripts could then be written to do that heavy lifting directly.
The development machine I am doing this on runs the latest version of Ubuntu Server and has plenty of memory and processor speed. No tuning of the database server has been done, however.