Performance of App-V and ThinApp

We were recently asked to provide evidence that virtualising an application would not affect its performance.

The request was quite reasonable. The application in question was a high-performance engineering application: Patran by MSC Software. Patran has configurable parameters to optimise its performance on powerful workstations, and there would be little point in tuning them if virtualisation itself caused a loss of performance.

My first thought was that virtualisation really shouldn’t affect performance. Application virtualisation redirects the file system and registry to alternate locations; you can see this quite clearly in the structure of the package. This might affect IO-intensive operations, but not operations in memory. But this is just theory, and I can quite understand that an engineering manager would want to see more than a theory.
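
As a rough sketch of that redirection idea (purely hypothetical Python, with an invented sandbox path; it is not how ThinApp or App-V are actually implemented), file and registry requests get an extra lookup into the package's private store, while pure in-memory work never touches the redirection layer at all:

    # Toy illustration of application virtualisation redirection.
    # Hypothetical example only; the real interception happens inside the
    # virtualisation runtime, not in the application's own code.
    import os

    SANDBOX = r"C:\SandboxedApps\Patran"  # hypothetical package sandbox root

    def redirect(path: str) -> str:
        """Map a requested path into the sandbox if a virtualised copy exists."""
        virtual_path = os.path.join(SANDBOX, path.replace(":", "").replace("\\", "_"))
        return virtual_path if os.path.exists(virtual_path) else path

    # A file open goes through one extra lookup...
    settings = redirect(r"C:\Program Files\MSC\Patran\settings.ini")
    # ...but in-memory work (arithmetic, geometry sent to the GPU) bypasses
    # the redirection layer entirely, which is why it should be unaffected.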

My second thought was to look for data on performance from the vendors (in this case VMware for ThinApp and Microsoft for App-V). But I didn’t find anything useful, which is odd.

So then we looked at the problem again, and began to realise that it could be really quite difficult. How would you demonstrate that the virtualised app was in no way slower than the physical app? How would you create controlled tests? You could for a few benchmarks, obviously, but not for every function.

The problem became harder when the testers showed some results that indicated the virtualised app was significantly slower. The test was to use Fraps to measure the Frames Per Second (FPS) while running a test model: Patran needs to render the graphical model on the screen as the user manipulates it. The test showed the virtualised app rendering the model 33% slower than the physical app.

I was surprised by this, as the rendering clearly happens in memory on the graphics card and has nothing to do with IO. But when I looked at the data a bit more, I found that the result was not really 33%. What actually happened is that the measured frame rate snaps to either 30 FPS or 60 FPS (presumably because rendering is locked to the display's vertical sync). In this one test, the virtualised app hit 30 more often than 60, and vice versa for the physical app. Still, it was not going to be practical to wait for each adverse test result and then work out whether or not it was significant.
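
To see how a frame rate that snaps to 30 or 60 can exaggerate a difference, here is a small sketch with invented numbers (illustrative only, not the actual Fraps samples): two runs doing nearly the same work per frame, where one just misses the 60 FPS deadline more often, end up with averages that look far apart.

    # Illustrative only: invented FPS samples, not the real Fraps data.
    # With vsync, each sample snaps to 60 or 30 FPS, so a run that just
    # misses the 60 FPS deadline more often looks much slower on average
    # even if the work done per frame is almost identical.
    physical    = [60, 60, 60, 30, 60, 60, 60, 30, 60, 60]  # mostly hits 60
    virtualised = [30, 60, 30, 30, 60, 30, 60, 30, 30, 60]  # mostly drops to 30

    avg_physical    = sum(physical) / len(physical)        # 54.0 FPS
    avg_virtualised = sum(virtualised) / len(virtualised)  # 42.0 FPS

    slowdown = 1 - avg_virtualised / avg_physical
    print(f"Apparent slowdown: {slowdown:.0%}")  # ~22%, from quantisation alone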

The route we took was to virtualise some benchmarking software. That meant all the benchmarks could be run virtualised, and the same benchmarks could also be run natively. The software I chose was PassMark PerformanceTest.

PerformanceTest has a wide range of benchmarks: for CPU, Disk, Memory and Graphics. The tests showed that on every benchmark the virtualised app performed about the same as the native app, with no significant difference.

Here is the overall summary:

Test      Rating   CPU      G2D     G3D     Mem      Disk
Native    1924.4   3443.5   480.0   583.9   1674.3   3117.5
ThinApp   1915.1   3462.7   462.3   581.0   1706.3   3206.6

And here’s the summary for 3D Graphics:

Test      3D Graphics Mark   DirectX 9 Simple   DirectX 9 Complex   DirectX 10   DirectX 11   DirectCompute
Native    584                41.1               22.6                4.4          9.7          315.1
ThinApp   581                41.0               22.6                4.4          9.6          313.5
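
As a quick check on the "no significant difference" claim, the relative differences can be computed straight from the overall summary figures; this short sketch just does that arithmetic:

    # Relative difference (ThinApp vs native) for each overall
    # PerformanceTest score, using the figures from the summary table above.
    native  = {"Rating": 1924.4, "CPU": 3443.5, "G2D": 480.0,
               "G3D": 583.9, "Mem": 1674.3, "Disk": 3117.5}
    thinapp = {"Rating": 1915.1, "CPU": 3462.7, "G2D": 462.3,
               "G3D": 581.0, "Mem": 1706.3, "Disk": 3206.6}

    for test in native:
        delta = (thinapp[test] - native[test]) / native[test]
        print(f"{test:6s} {delta:+.1%}")
    # Every difference is within a few percent, and they go in both directions.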

Based on this, it seems fairly unlikely that an application would perform significantly worse by being virtualised.
