99.9% of the time when you use Google or Apple Maps, the directions it gives apply perfectly. I'm driving a car with a sub-seven-foot clearance and an E-ZPass.
But what happens when none of that is true and you have to drive a truck?
You have an eleven-foot clearance, no E-ZPass, and certain roads are off limits, like the lower level of the George Washington Bridge. How does the app know?
As far as I've found, it doesn't and it can't. This is where I believe there's an interesting conversation to be had about technology making explicit the things its learning models are making implicit.
Google Maps has inferred information about my vehicle, probably from learning about past behavior. But (1) I don't know what it learned, and (2) I can't change it.
I see a world in which you can run Google Maps as a truck, but also browse Twitter from another worldview, or see the Instagram feed of another perspective, by changing your own preferences.
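To make the idea concrete: the explicit vehicle profile I'm imagining could be as simple as a declared data structure that a routing engine checks against each road segment, instead of quietly inferring it from behavior. This is a minimal sketch of the concept, not Google's actual API; every name and number here is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleProfile:
    """What the user declares explicitly, instead of the app guessing."""
    height_ft: float
    has_ezpass: bool
    vehicle_class: str  # e.g. "car" or "truck"

@dataclass
class RoadSegment:
    name: str
    clearance_ft: Optional[float]  # None means no posted height restriction
    trucks_allowed: bool

def segment_ok(profile: VehicleProfile, segment: RoadSegment) -> bool:
    """Return True if this vehicle may legally use this road segment."""
    if segment.clearance_ft is not None and profile.height_ft >= segment.clearance_ft:
        return False  # vehicle too tall for the posted clearance
    if profile.vehicle_class == "truck" and not segment.trucks_allowed:
        return False  # segment bans trucks outright
    return True

# The GWB lower level bans trucks regardless of clearance.
gwb_lower = RoadSegment("GWB lower level", clearance_ft=13.5, trucks_allowed=False)

car = VehicleProfile(height_ft=6.5, has_ezpass=True, vehicle_class="car")
truck = VehicleProfile(height_ft=11.0, has_ezpass=False, vehicle_class="truck")

print(segment_ok(car, gwb_lower))    # the car can take the lower level
print(segment_ok(truck, gwb_lower))  # the truck is routed elsewhere
```

The point isn't the ten lines of filtering logic; it's that the profile is visible and editable by the user, so "run Maps as a truck" is just swapping one declared object for another.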
It's a world where tech companies give more transparency into their models, and where people start to take more control of their data.