Then we add real-time diagram sharing to mxgraph.com. Now we need hanging requests on a server for everyone listening to a shared diagram, so they can receive updates. Even if only deltas are transmitted, the original diagram can't go straight from storage to the sharing users, because they are not the account holder, so the web application servers are handling the data again.
And so on, you get the idea.
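To make the hanging-request idea concrete, here is a minimal sketch in Python of one way such a hub could work server-side: each listener gets a queue of pending deltas, and a client's hanging request simply blocks on its queue until a delta arrives or the request times out. The names (`SharedDiagram`, `subscribe`, `poll`) and the in-memory, single-process design are illustrative assumptions, not how mxgraph.com actually implements it.

```python
import queue
import threading


class SharedDiagram:
    """Illustrative broadcast hub for one shared diagram.

    Each subscribed client owns a queue of pending deltas; a client's
    hanging (long-poll) request blocks on that queue until a delta
    arrives or the request times out and is re-issued.
    """

    def __init__(self):
        self._lock = threading.Lock()
        self._listeners = {}  # client_id -> queue.Queue of deltas

    def subscribe(self, client_id):
        with self._lock:
            self._listeners[client_id] = queue.Queue()

    def publish(self, delta):
        # Fan the delta out to every listener's queue. Note the server
        # sees every delta in the clear -- exactly the trust issue the
        # surrounding text is worried about.
        with self._lock:
            for q in self._listeners.values():
                q.put(delta)

    def poll(self, client_id, timeout=30.0):
        # The "hanging request": block until a delta arrives, or return
        # None on timeout so the client can immediately re-poll.
        try:
            return self._listeners[client_id].get(timeout=timeout)
        except queue.Empty:
            return None
```

A real deployment would shard diagrams across servers and use comet/WebSocket infrastructure rather than raw blocking queues, but the data-handling point is the same: the deltas pass through, and are visible to, the provider's servers.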
Can you be expected to trust the data handling of every web application provider? No. Is it easy to develop client-side functionality so that the server never sees the data, even if it's not persisted there? No. The solution is virtualization and/or private clouds, private clouds really being a super-set of the former. Eucalyptus seems to be leading the way, enabling you to pool your own hardware safely behind your firewall and run Amazon machine images and VMware images on that collective hardware resource.

Will this become the norm? Hard to say, but it's catching on in the bigger IT departments. It does present web application providers with a new headache, though: their software now sits hidden behind a customer's firewall, possibly unreachable. Let's not forget this was the norm for a long time with license keys and so on for desktop and server installs; the control providers enjoy in cloud solutions is spoiling them somewhat. This setup makes the user's life easier and easier, but web application providers clearly prefer the current public way of doing things, and you'll most probably find them arguing that they are secure so that they can remain in control of the hardware.