I haven’t enrolled in Apple’s paid developer programme yet, so I haven’t been able to deploy my apps onto my iPhone. All in good time, I hope.
Anyway, comparing the satellite map imagery between the iOS Simulator and my real iPhone and iPad, I see that the simulator seems to be displaying different satellite images than my devices are. How is this happening?
I thought for a moment that the simulator was showing Google’s satellite data, but quickly came to my senses, even though the data looks very similar. I now think the iOS Simulator is showing older Apple satellite imagery, while my real iOS devices are showing the newer, higher-quality data. I live in Glasgow, and on the iPhone and iPad the imagery is high quality and can be viewed in 3D, with buildings rendered as rotatable 3D objects.
Of course, Whereami is a nice and simple app that doesn’t try (or need) to manipulate the third dimension, but why the difference in the data sets, even when both the simulator and the iPhone are viewed straight down (3D off, in plan view, as it were)?
Here in Glasgow it makes the difference between a major motorway (freeway) being there or not. I’m just wondering if the APIs decide what data to send depending on the capabilities of the end device. If that’s the case, then shouldn’t the iPhone 6.1 Simulator roll in this functionality?
Not a big deal, just curious.