There has been a sharp increase in the number of users trying out the mxGraph demos on iPads. We took stock of a number of the devices last month to see just how well mxGraph works. There were two main observations: it was too slow, and our fingers were too big. Displaying cells works fine, and so it should, since we fully support Safari on non-touch devices. We've added very experimental support for some of the iPxxx-specific gestures (like the two-fingered zoom in/out), so you can get a feel for how we intend to implement the various gestures. Handles are clearly unusable at touch sizes, so we need an alternative for grabbing the ends of edges to disconnect them. Touch-and-hold on a vertex to enter a connection mode makes sense, I believe. Many of the other problems are more about the sizes of shapes on the palette and the menu sizes in the examples; we'll create an iPad-specific example to address this.
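To make the gesture ideas concrete, here is a minimal sketch of the two interactions mentioned above: the pinch zoom (the zoom factor is just the ratio of the current distance between the two touch points to the distance when the gesture began) and touch-and-hold (a touch that stays down past a time threshold without moving far). The function names and structures are illustrative only, not the mxGraph API.

```javascript
// Distance between two touch points, each given as {x, y}.
function touchDistance(t1, t2) {
  var dx = t2.x - t1.x;
  var dy = t2.y - t1.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// Pinch zoom: scale factor relative to where the two fingers started.
// A result > 1 means zoom in, < 1 means zoom out.
function pinchScale(startTouches, currentTouches) {
  var d0 = touchDistance(startTouches[0], startTouches[1]);
  var d1 = touchDistance(currentTouches[0], currentTouches[1]);
  return d0 === 0 ? 1 : d1 / d0;
}

// Touch-and-hold: treat the touch as a "hold" (e.g. entering connection
// mode on a vertex) when it lasts at least holdMillis and the finger has
// moved no further than tolerance pixels.
function isHold(downTime, upTime, moveDistance, holdMillis, tolerance) {
  return (upTime - downTime) >= holdMillis && moveDistance <= tolerance;
}
```

In a real handler these would be fed from the `touchstart`/`touchmove`/`touchend` events that touch Safari dispatches; the tolerance matters because a finger rarely stays perfectly still.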
As an aside, the intention is to support the iPhone too. We are somewhat unconvinced that the iPhone is a serious platform for diagramming; interaction is painful in pretty much all similar applications that we've seen. We might look into simplifying the gestures on the iPhone to make viewing faster and easier, but at the expense of interaction (configurable, of course). iPhone work will be secondary to iPad support, since the iPad's size makes it a far more likely device for this.
The API should be affected very little by the addition, since we have already abstracted SVG and VML as the vector drawing mechanisms. One part of the change that will be more significant is the loss of the display DOM's hit detection and dirty-region repaint. We have implemented these mechanisms fully in the Java client, so there are no research elements, but it is still several weeks of porting effort. Timescale-wise, iPad support will arrive by November 2010 and full canvas support by January 2011.
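For readers unfamiliar with what moving off the DOM implies: with SVG or VML the browser tells you which element was clicked and repaints changed nodes itself, whereas on a canvas the renderer must do both jobs. A minimal sketch of each, with illustrative structures rather than the actual mxGraph types:

```javascript
// Hit detection on a canvas: no DOM elements to click, so walk the shapes
// from front to back and test the point against each shape's bounds.
function containsPoint(rect, x, y) {
  return x >= rect.x && x <= rect.x + rect.width &&
         y >= rect.y && y <= rect.y + rect.height;
}

// Shapes are stored in paint order, so iterate in reverse: the topmost
// (last painted) shape under the point wins.
function hitShape(shapes, x, y) {
  for (var i = shapes.length - 1; i >= 0; i--) {
    if (containsPoint(shapes[i].bounds, x, y)) {
      return shapes[i];
    }
  }
  return null;
}

// Dirty-region repaint: merge the bounds of everything that changed into
// the smallest covering rectangle, so one clear-and-repaint pass over that
// region is enough instead of redrawing the whole canvas.
function unionRect(a, b) {
  if (a == null) return b;
  var x = Math.min(a.x, b.x);
  var y = Math.min(a.y, b.y);
  return {
    x: x,
    y: y,
    width: Math.max(a.x + a.width, b.x + b.width) - x,
    height: Math.max(a.y + a.height, b.y + b.height) - y
  };
}
```

Real shapes would of course test against their actual outline rather than a bounding box, but the structure of both mechanisms is the same; this is the part the Java client already implements and that needs porting.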