According to recent research by MIT computer vision expert Antonio Torralba and his team, a lot can be gleaned not just from the objects directly visible in an image but also from shadows, reflections, and changes in light patterns. In some cases this makes it possible to literally see around the next corner: a shadow on a wall may be cast by a person running down the street on the other side of a corner building. While the shadow does not let you see the person, it still reveals their presence.
This ability is obviously useful in many situations. In the scenario described above, it may enable a driver to slow down or stop in time to avoid hitting a pedestrian carelessly running down a cross street. Surveillance cameras can supply more information than is immediately visible in the frame: reflections of car roofs on the road below can reveal traffic patterns in the area, and changes in shadows around doorways may indicate when the corresponding doors open and close, which can be used to enhance security and even estimate the number of people coming and going, even when those areas sit at the edge of the camera's coverage. Vibrations of plant leaves and sheets of paper can even be used to reconstruct sound, creating what is termed a "visual microphone". In short, the applications are almost endless and can help us see, and even hear, far more than before. Once again, this is a situation in which new technology takes data we have had access to for a long time and drastically improves our ability to interpret it.
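To make the door-shadow idea above concrete, here is a minimal sketch, not taken from the research itself, of how one might flag door activity by watching for sudden brightness shifts in a small region of interest at the edge of a camera's frame. The function name, the threshold, and the synthetic frames are all illustrative assumptions.

```python
# Toy sketch of shadow-change detection (illustrative, not the authors' method).
import numpy as np

def shadow_change_events(frames, roi, threshold=10.0):
    """Return indices of frames where the mean brightness inside `roi`
    shifts by more than `threshold` relative to the previous frame.

    frames: iterable of 2-D grayscale arrays of the same shape
    roi:    (row_start, row_stop, col_start, col_stop) slice bounds
    """
    r0, r1, c0, c1 = roi
    events = []
    prev_mean = None
    for i, frame in enumerate(frames):
        mean = float(np.mean(frame[r0:r1, c0:c1]))
        if prev_mean is not None and abs(mean - prev_mean) > threshold:
            events.append(i)  # sudden shadow change: a door likely moved
        prev_mean = mean
    return events

# Synthetic demo: a uniformly bright scene that darkens inside the ROI
# starting at frame 3, as if a shadow fell across a doorway.
frames = [np.full((40, 40), 200.0) for _ in range(6)]
for f in frames[3:]:
    f[10:30, 10:30] = 120.0  # the "shadow"
print(shadow_change_events(frames, (10, 30, 10, 30)))  # → [3]
```

A real system would need to separate genuine shadow changes from lighting drift and camera noise, for example with a running average rather than a single previous frame, but the core signal is the same: a step change in brightness in a fixed patch of the image.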
This seems to be an area where big data and AI could take perception to a whole new level. I wonder whether acoustic data could be added to the mix and take us even further: not only could we see a shadow moving around a corner, but we could also listen for changes in the sound traveling through the air and other media, such as the ground, and extrapolate the shape and size of the object moving around that corner. Returning to the careless pedestrian, we could perhaps tell not only that a person is running, but also what that person is carrying. And if that person is a robbery suspect, a cop sitting in a cruiser around the corner would know that the runner is lugging a heavy bag and might get a chance to act accordingly. Science fiction? Perhaps not for long.
Sources
The New Science of Seeing Around Corners
Natalie Wolchover, Quanta Magazine, 30 August 2018
Accidental pinhole and pinspeck cameras: revealing the scene outside the picture
Antonio Torralba, William T. Freeman, Computer Science and Artificial Intelligence Laboratory (CSAIL), MIT, 2012