My argument is that the MGs are fine, but the bolts are too accurate for battlefield situations.
I wouldn't like to see a random cone like DoD, but I really think that the current pixel aiming gives an unrealistic advantage to the rifleman. The reason is that a real-life rifle requires hand-eye coordination that isn't as precise as mouse-and-monitor coordination.
Even if you had good hidden cover and something to prop your rifle on without a machine gunner noticing you, you still have to visually guesstimate the distance and adjust your aim accordingly.
Since humans are intrinsically analog, you don't have a precise method to line up your gun on a pixel and pull the trigger knowing the shot will go exactly there. Even with a great deal of practice, it's not 100% accurate, simply because (unless you went out ahead of time, set up range posts, and sighted in your gun) you don't know the exact distance most of the time.
Even the best modern-day sniper with a high-power scope must calculate the distance correctly before taking a shot.
In the real world, the machine gunner, on the other hand, simply has to point the gun in the general direction, watch the tracers, and adjust accordingly. By the time the rifleman has fired a few shots trying to figure out his range, the MG has zeroed in on his location.
However, I don't know how in the world you would model this in an FPS game.
The only thing I can think of is a slight first-shot inaccuracy for the rifle. The idea is that the player isn't holding it exactly the way they expect, so the first shot at a given range is a bit inaccurate. The second shot is a bit more accurate, the third a bit more, and so on.
Let's say at 25 yards you get 100% accuracy on the first shot.
50 yards gets 75% accuracy on the first shot and the second is 100%.
100 yards gets 50% and the second is 75% and so on...
So if you iron sight and shoot at an MG at 100 yards, you have a 50% chance of hitting him, but right after you cycle the bolt for the next round you have 75%, regardless of whether you aim at the same pixel or the guy next to him. And if you spotted an infantryman running toward you at 50 yards, you would now have your 100% pixel accuracy for him, since you already fired once in iron sights (even if you have to turn around to get him).
If you lower your iron sights, change your prone position, lean, or have to reload the gun completely, then the accuracy resets due to losing your bearings.
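To make the ramp concrete, here's a minimal sketch of how the mechanic above could be prototyped. All names and the exact numbers are hypothetical; they just mirror the 25/50/100-yard example and the reset rule:

```python
import random

# Hypothetical first-shot accuracy by range in yards, per the example above.
BASE_ACCURACY = {25: 1.00, 50: 0.75, 100: 0.50}
ACCURACY_STEP = 0.25  # each bolt-cycled follow-up shot adds this much

class RifleAccuracy:
    def __init__(self):
        # Shots fired since the shooter last lost their bearings.
        self.shots_fired = 0

    def hit_chance(self, range_yards):
        # Accuracy improves with every shot already fired, capped at 100%.
        base = BASE_ACCURACY.get(range_yards, 0.50)
        return min(1.0, base + self.shots_fired * ACCURACY_STEP)

    def fire(self, range_yards):
        chance = self.hit_chance(range_yards)
        hit = random.random() < chance
        self.shots_fired += 1  # the shooter "settles in" for the next shot
        return hit

    def reset(self):
        # Lowering sights, changing prone position, leaning,
        # or a full reload makes the shooter lose their bearings.
        self.shots_fired = 0
```

Note that the ramp is per-shooter, not per-target, which matches the idea that switching to the infantryman at 50 yards keeps the bonus you earned shooting at the MG.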
I know many people wouldn't like this, but keep in mind that firing a gun in a combat situation (even when prone with good gun support) involves a great deal of distractions. If someone is shooting at you or explosions are going off nearby, your mind is consciously or subconsciously paying attention to that, which takes away from your brain-powered guesstimate of how far away the target is.
However, I suspect that this change (like bullet penetration) would put more CPU load on the servers.