There are 29 sites on my multisite project that, at this point in development, have exactly the same data set. Each site uses a different theme. I have all of the sites in a Chrome bookmark folder, and as an experiment I opened them all at the same time to see how the server responded. It took about three minutes for all of them to load! Below are screenshots from the Chrome developer console that clearly document the hit the server took.
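For anyone who wants to reproduce the test more precisely than clicking a bookmark folder, here is a minimal sketch of the same experiment in Python. The URLs are placeholders standing in for my actual subsites, and the concurrency level of 29 matches the bookmark folder:

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Placeholder URLs standing in for the 29 multisite subsites.
URLS = [f"https://example.com/site{n}/" for n in range(1, 30)]

def fetch(url):
    """Fetch one page and return (url, seconds, bytes)."""
    start = time.monotonic()
    with urlopen(url, timeout=120) as resp:
        body = resp.read()
    return url, time.monotonic() - start, len(body)

# Open all 29 sites at once, as the bookmark-folder test does.
start = time.monotonic()
with ThreadPoolExecutor(max_workers=29) as pool:
    for url, secs, size in pool.map(fetch, URLS):
        print(f"{url}: {secs:.1f}s, {size} bytes")
print(f"Total wall time: {time.monotonic() - start:.1f}s")
```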
The project is on a Google Cloud Platform g1-small instance (1 vCPU, 1.7 GB memory). Storage is a 222 GB standard persistent disk; the entire multisite server directory is currently only 675 MB.
Options exist to expand the server's capacity, including more memory and additional vCPUs. My question: what is the best way to increase server capacity so that dozens of pages can be opened at once without crushing the server?
Once the multisite goes live I hope to have hundreds of sites generating lots of traffic (and funding to expand capacity). I want to handle the server load efficiently, but I don't want to pay more than necessary while I'm getting started. In the meantime the server must be as fast as possible so I can start selling sites. (No one wants to buy into a slow site.)
Is it more effective to increase vCPUs or memory (or both)?
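In case it helps answerers: here is a small sketch of how I could check which resource actually saturates during the 29-tab test. It assumes the third-party psutil package is installed on the server and would run alongside the concurrent-fetch script above:

```python
import psutil  # third-party: pip install psutil

# Sample CPU and memory once per second while the test runs.
# If CPU pins near 100% while memory stays comfortable, more vCPUs
# should help; if memory is exhausted first, more RAM should help.
for _ in range(60):
    cpu = psutil.cpu_percent(interval=1)  # blocks 1s, then samples
    mem = psutil.virtual_memory()
    print(f"cpu={cpu:5.1f}%  mem={mem.percent:5.1f}%  "
          f"available={mem.available / 1e6:.0f} MB")
```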