I've posted this type of explanation before on a few threads but can't find them to quote, so I'll try to explain again.
The concern is this: in game, when you look or aim, a player can "zoom" in and bring the center of the screen closer.
Picture a globe of the Earth (or better yet, go find one and look at it), with all the land masses in mostly correct scale to each other.
Now picture the standard paper/book atlas map of the world, with Europe or North America at the center: the standard Mercator projection. You will notice that the 3D globe of the Earth has been cut and stretched to fit a 2D rectangular surface.
The result: the edges are stretched to fit, and places like Greenland and Antarctica look HUGELY larger than they really are.
NOW, shift back to the game (any video game), and think of your vision as a 3D field of view. You can see in roughly a half sphere in 3D, from the inside. How can a game represent that on a flat 2D computer monitor?
It stretches the edges of the vision field, just like the edges of the map in the atlas are stretched to include everything without cutting pieces out. So the CENTER of your screen is actually much LESS magnified than the edges, and the scale is off.
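You can put rough numbers on this. The standard game projection maps a view angle to a screen position via the tangent function, so a minimal sketch (my own illustration, not code from any particular engine; the screen width and FOV values are just example assumptions) shows how many pixels one degree of view covers at the center versus near the edge:

```python
import math

def pixels_per_degree(theta_deg, fov_deg=90.0, screen_width_px=1920):
    """Pixels covered by one degree of view at angle theta from screen
    center, under a standard rectilinear (perspective) projection where
    screen position x = f * tan(theta)."""
    half_fov = math.radians(fov_deg / 2)
    # focal length in pixels: half the screen spans half the FOV
    f = (screen_width_px / 2) / math.tan(half_fov)
    theta = math.radians(theta_deg)
    # derivative of f*tan(theta) is f/cos(theta)^2; convert to per-degree
    return f / math.cos(theta) ** 2 * math.pi / 180

center = pixels_per_degree(0)    # at the middle of the screen
edge = pixels_per_degree(44)     # just inside the edge of a 90-degree FOV
print(round(center, 1))          # ~16.8 px per degree
print(round(edge / center, 2))   # ~1.93: edge nearly twice as stretched
```

So at a 90-degree FOV, the same object covers almost twice as many pixels near the edge as it does dead center, which is exactly the "center is less magnified" effect. Zooming is just narrowing the FOV: `pixels_per_degree(0, fov_deg=30)` comes out several times larger than the 90-degree center value, restoring magnification in the middle.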
In a game where realistic vision is supposed to be represented, the center of that vision sphere can be magnified to "stretch" it back to the correct scale, like a fisheye lens putting tiny Florida back at the correct scale, side by side with MASSIVE Greenland on the atlas map.
The zoom has to be there. The only alternative would be to shrink the edges of the view to keep a realistic scale, which would leave us looking at the screen through an actual fisheye lens: big in the middle, distorted and shrinking towards the outside.
This is why the zoom feature is there in games and should be in this game.