Interesting. I like that they seemed to prove people were changing their behavior in response to learning about unneeded permissions. Unfortunately, the choice users are left with is a coarse-grained one: don't install/use the app. They note another possibility with their mention of "A Privacy Preserving API Layer" in their Section 8 (Recommended Best Practices), which sort of describes a UMA approach, only in the guise of one-off proprietary solutions.

We should perhaps reach out to these folks, since their Section 9 (Related Work) says "To our knowledge, this is the first work that studies the problem of user privacy in the context of 3rd party apps on top of cloud storage providers", and from a quick search they appear not to mention UMA at all -- or even OAuth, whose mechanisms they're basically describing throughout the paper. Maybe they'd be interested in taking on a research project around adding UMA to the mix.
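To make the scope-design point concrete: the over-privileging the paper measures shows up directly in the OAuth authorization request an app constructs. Here's a minimal sketch comparing a broad Google Drive scope against the least-privilege per-file scope (the endpoint and scope URIs are Google's real ones; the client ID and redirect URI are placeholders for illustration):

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 authorization endpoint.
AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def authorization_url(scopes, client_id="example-client-id",
                      redirect_uri="https://app.example.com/callback"):
    """Build an OAuth 2.0 authorization request URL for the given scopes."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": " ".join(scopes),  # scopes are space-delimited per RFC 6749
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)

# Over-privileged: full read/write access to every file in the user's Drive.
broad = authorization_url(["https://www.googleapis.com/auth/drive"])

# Least-privilege: access only to files the user opens or creates with this
# app -- enough for, say, a PDF converter.
narrow = authorization_url(["https://www.googleapis.com/auth/drive.file"])

print(narrow)
```

The user's only lever today is whether to approve the scopes the developer chose; a UMA layer would let the resource owner set policy over those resources independently of any one app's request.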


Eve Maler
Cell +1 425.345.6756 | Skype: xmlgrrl | Twitter: @xmlgrrl


On Sun, Aug 28, 2016 at 10:14 AM, Adrian Gropper <agropper@healthurl.com> wrote:
The paper is a must-read for anyone trying to understand scope design for APIs, as well as for those who would design registries or trust frameworks to guide users considering signing up for a new app or web service.

https://arxiv.org/abs/1608.05661

arXiv:1608.05661 (*cross-listing*)
Date: Thu, 18 Aug 2016 07:36:11 GMT   (2469kb,D)

Title: The Curious Case of the PDF Converter that Likes Mozart: Dissecting and
  Mitigating the Privacy Risk of Personal Cloud Apps
Authors: Hamza Harkous, Rameez Rahman, Bojan Karlas, Karl Aberer
Categories: cs.CY cs.HC
Journal-ref: Proceedings on Privacy Enhancing Technologies. Volume 2016, Issue
  4, Pages 123-143, ISSN (Online) 2299-0984
DOI: 10.1515/popets-2016-0032
  Third-party apps that work on top of personal cloud services such as Google
Drive and Dropbox require access to the user's data in order to provide some
functionality. Through detailed analysis of a hundred popular Google Drive apps
from Google's Chrome store, we discover that the existing permission model is
quite often misused: around two thirds of analyzed apps are over-privileged,
i.e., they access more data than is needed for them to function. In this work,
we analyze three different permission models that aim to discourage users from
installing over-privileged apps. In experiments with 210 real users, we
discover that the most successful permission model is our novel ensemble method
that we call Far-reaching Insights. Far-reaching Insights inform the users
about the data-driven insights that apps can make about them (e.g., their
topics of interest, collaboration and activity patterns, etc.). Thus, they seek
to bridge the gap between what third parties can actually know about users and
users' perception of their privacy leakage. The efficacy of Far-reaching
Insights in bridging this gap is demonstrated by our results, as Far-reaching
Insights prove to be, on average, twice as effective as the current model in
discouraging users from installing over-privileged apps. To promote general
privacy awareness, we deploy a publicly available privacy-oriented app store
that uses Far-reaching Insights. Based on the knowledge extracted from data of
the store's users (over 115 gigabytes of Google Drive data from 1440 users with
662 installed apps), we also delineate the ecosystem for third-party cloud apps
from the standpoint of developers and cloud providers. Finally, we present
several general recommendations that can guide other future works in the area
of privacy for the cloud.




--

Adrian Gropper MD

PROTECT YOUR FUTURE - RESTORE Health Privacy!
HELP us fight for the right to control personal health data.

DONATE: http://patientprivacyrights.org/donate-2/

_______________________________________________
WG-UMA mailing list
WG-UMA@kantarainitiative.org
http://kantarainitiative.org/mailman/listinfo/wg-uma