I don’t know how GData incorporates the avast engine, or whether it includes all the shields — primarily the web, behavior, and script shields and the autosandbox elements. These would have an impact on detections.
I asked Bitdefender the same question, since many vendors license their engine, and this was the answer:
All our partners receive the engines (including the heuristic ones) plus our hourly updates. The rest of the technologies are unique, both on our side as well as on theirs.
That’s why, in some cases, there are differences between the detections.
GData is using BitDefender and avast!. Do the math. It will always detect at least as much as avast! alone, regardless of what you do — and more, unless everything BitDefender detects happens to be detected by avast! as well.
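The “math” here is just set union. A minimal sketch with made-up detection sets (the threat names are hypothetical, not from any real test):

```python
# Illustrative only: hypothetical detection sets for the two engines.
avast = {"threat_a", "threat_b", "threat_c"}
bitdefender = {"threat_b", "threat_c", "threat_d"}

# A dual-engine product can at best detect the union of the two sets.
combined = avast | bitdefender

# The union is never smaller than either engine alone...
assert len(combined) >= len(avast)
assert len(combined) >= len(bitdefender)

# ...and it is strictly larger than avast alone unless the second
# engine adds nothing new (i.e. bitdefender - avast is empty).
extra = bitdefender - avast
print(len(combined), sorted(extra))
```

Of course, this only bounds what the combined product *could* detect; whether GData actually exposes everything from both engines is exactly what’s in question.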
That would be the expected result: two engines in combination should logically detect more than one. But it depends on what the combined product actually includes from each engine and, as I said, on what is user dependent.
If you analyse the PDF results on page 6, GData has no “user dependent” (yellow) or “blocked on/after execution” (light green) sections in the chart, unlike avast or bitdefender. You would think the two engines in combination would produce a higher score, but in this test they don’t.
GData just shows “detected on scan” and “not blocked”, which to me implies it can’t possibly have all of the features/functions of the individual AVs in its combined engine.
I don’t think it’s fair to say you “defeated” Microsoft and others. Overall, Avast came 8th out of 17 and is grouped with MS SE in the top ADVANCED+ group. Small differences are likely due to sampling error, based on what happens to be in the wild on any particular day. This is why AV-Comparatives groups products: rankings within a group mean little.
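To see why small differences can be sampling noise, here is a rough back-of-the-envelope sketch. The sample size and detection rates below are assumed for illustration, not taken from the actual test:

```python
import math

# Illustrative only: hypothetical sample size and detection rates.
n = 600                 # assumed number of in-the-wild samples in the test
p1, p2 = 0.987, 0.979   # assumed detection rates of two products

# Standard error of the difference between two binomial proportions.
se = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
diff = p1 - p2

# If the observed gap is smaller than roughly 1.96 * SE, it is within
# what random sample selection alone could produce.
print(f"diff = {diff:.3f}, ~95% margin = {1.96 * se:.3f}")
```

With numbers like these the observed gap falls inside the margin, so ranking the two products against each other says very little — which is exactly why the test reports tiers rather than strict rankings.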
Look at the AV-Comparatives forum and test info. Symantec wants to control the test and not have all the results shown. In other, more polite words: they do not agree with the test’s methodology.