I remember it as if it were yesterday: Steve Jobs introducing the first iPhone to the world and explaining why it needed a proximity sensor near the earpiece. He said something like "You don't want the screen to receive commands from your face." It's such a mundane feature that, when it fails, it annoys enough people to generate a 16-page discussion on Apple's support forums.
What is happening to many users is that, while holding the device perfectly normally during a call, the screen lights up, registers touches from the skin of their face, and causes all kinds of confusion: muting the call, starting a FaceTime session (requested by the face, hehe), even trying to send text messages. It is, in the words of one affected user, "more than frustrating."
The weirdest thing is that the hardware itself works: cover the sensor with a finger and the screen turns off. But use the iPhone as a telephone, holding it up to your face, and the sensor goes crazy. Some people managed to exchange their devices at Apple Stores (I don't know with what units, since it is sold out), while others tried the "5 Rs" (Reset, Retry, Restart, Reinstall, Restore), but everyone continued to have the same problems, which suggests it is a software bug. It may be that the sensor's activation timing is miscalibrated, so that any slight movement causes it to reawaken the screen.
And there we go again. The expected iOS 4.0.1 update did not arrive today as originally scheduled, but hopefully it will fix this and other problems when it finally ships.