I think it's annoying that whenever something made by Americans about WWII comes out, it's instantly chastised by people from other countries for featuring Americans. First off, cry me a river. Second, it's probably easier to film about the US anyway. Sorry, but I don't think there are many in Hollywood who'd be willing to create a series or even a movie about the government that started WWII and murdered millions of innocents, regardless of whether it's the SS, the Wehrmacht, or the FJ. What's the fun in watching something about the losers of the war? In BOB you can feel the spirits of the soldiers rise as they win victories. If the Germans were shown, what late-war victories would there be? Market Garden is about all I can think of.
Then again, what about the Luftwaffe or Kriegsmarine? I've never seen anything about those. I have seen Stalingrad, which is German. Why does Hollywood have to be the one breaking new ground all the time?