In September 2020, the International Digital Accountability Council (IDAC) released the results of an investigation into privacy concerns in ed tech apps and the need for privacy and security improvements in such apps (“Privacy Considerations as Schools and Parents Expand Utilization of Ed Tech Apps During the COVID-19 Pandemic”).

The investigation addresses the concerns of education organizations, but those concerns overlap with those of the Learning Solutions audience: mainly eLearning practitioners working in enterprise and government organizations.

On September 3, 2020, I spoke with Quentin Palfrey, president of the IDAC, about the investigation and its possible relationship with similar concerns that Learning Solutions readers may have.

Bill Brandon: How do you see the results of your investigation of ed tech app privacy practices relating to Learning Solutions readers, who are primarily instructional designers and managers of online education and training for enterprise and government organizations?

Quentin Palfrey: Some of the lessons we’ve learned looking at the ed tech apps in the universe we studied are applicable more broadly. And there’s probably some overlap between some of the apps that we looked at and some of the subject matter that’s of interest to your readers.

It would be helpful to start by describing the process that we went through to determine which apps were within the scope of this study. That may help focus this question of subject matter overlap.

BB: It would be very helpful. Thank you.

QP: One of the things we were looking to do was to be helpful to the conversation surrounding the dramatic increase in utilization of distance learning tools that has been precipitated by the pandemic in the K-12 context. This is resulting in many parents having to grapple with something that looks a lot like homeschooling. For teachers, it’s been a crash course in learning how to provide their students with tools that will supplement the traditional curricular elements in a world where they’re not physically in the same place nearly as often, or at all. And for school districts, it has required fairly rapid efforts to redesign their curricula.

We looked at learning apps, pretty broadly defined, in two environments: the Google Play Store and the Apple App Store. By talking to some educators about what tools they found useful, and also by doing some of our own manual searches, we came up with a universe of apps that are relevant to distance learning but not confined to the set of tools that, say, a large school district with a formalized procurement process might use. It’s a little broader than that.

Then we took a look at those apps using two techniques. The first was a manual process, which is a little more intensive: we looked at 98 unique apps. It’s a little more complicated than that, because there’s some overlap between the apps in the Android environment and in the Apple environment. We went through that process by downloading the apps onto devices and interacting with them in many of the ways that users typically interact. We also looked at our larger universe of apps through an automated process, which simulates some of the ways that a user would typically interact with an app. It’s a little less comprehensive, but it gives us some trends.

Combining these two approaches, we were able to identify some places where particular apps were falling short of best practices related to privacy and security, and also to draw out some trends and recommendations as to risks that exist within this ecosystem.

BB: These 98 apps: could I characterize them as apps for delivery of curriculum materials and for interaction between an instructor and learners? That is, they are not used for development of content, for example.

QP: They’re a little broader than just pedagogical aids. In some instances these are apps that a user could use for their own educational development. We thought about it as a combination of apps that describe themselves as educational apps: virtual classroom apps, various types of educational content, various kinds of language learning apps, learning games, some library or reader apps, and team communication tools that were deployed in the context of remote learning. So we tried to have a pretty wide aperture in terms of the scope.

BB: Were you thinking at all about GDPR requirements? That’s a European Union concern, and a valid one, but I don’t know how broadly you were looking at things.

QP: Since this was an international study, we looked at apps across countries, and we kept in mind the requirements of GDPR and the laws of a number of other countries. We’re also interested in compliance with the terms of service of the platforms themselves. In a lot of instances, the guidance that Apple provides to developers is a lot more specific than the legal jurisprudence of the jurisdiction where the apps are offered. Knowing what does or does not constitute unfair or deceptive trade practices within the context of certain US states or the Federal Trade Commission’s jurisdiction is one question, but there are a number of things that may be considered inappropriate in the context of the terms of service of, say, the Google Play Store or the Apple App Store. They may not rise to the level of an actual violation of, say, GDPR or another legal norm, but they would still be the kind of thing that we would consider concerning and would call out in this study.

BB: In the initial press release that I received about this study, five specific areas were named as concerns. The ones that I made note of had to do with over-collection of user information and persistent identifiers, third-party data sharing, lapses bypassing privacy enhancements, and inclusion of advertising, analytics, and social media components. What I’d like to focus on in your findings are the ones that you found most concerning. They’re all concerning to one extent or another, but what’s at the top of your list?

QP: Yes. Looking at the ecosystem as a whole, I think the biggest concern is the large amount of data collection and third-party data sharing that goes on that users don’t see, can’t control, and can’t opt out of. We found some instances of collection of geolocation information and of the use of persistent identifiers, which together can reveal where a specific person was at a specific time. That’s pretty sensitive.

When that’s shared with third parties, there is a subsequent concern about losing control of that information. The issue with respect to software development kits (SDKs) is that they are very commonly used in coding, and they are not on their face an inappropriate practice. When developers are coding apps, they rarely write everything from scratch; they often work pieces of code from other sources into their apps, and that code often provides functionality that is useful for users.

The constant concern in the ed tech context is where there’s not a fit between the functionality that’s provided by that code and the user experience. If the code provides tools that the user might reasonably expect, the code is useful. But code included for advertising or analytics purposes, or for sharing with social media companies, sometimes comes with corresponding privacy concerns: the SDK may be collecting information while not providing relevant user benefits. Context really matters; what the user thinks they are getting from the app really matters.
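To make that mismatch concrete, here is a minimal sketch in Python. The SDK class, its parameters, and its defaults are invented for illustration and do not refer to any real library:

```python
# Hypothetical analytics SDK, invented for illustration only.
class AnalyticsSDK:
    def init(self, app_key: str, collect_location: bool = True,
             collect_ad_id: bool = True) -> None:
        # Default-on collection flags are how an SDK embedded for a narrow
        # purpose (say, crash reporting) can end up sending geolocation and
        # persistent identifiers to a third party the user never sees.
        self.app_key = app_key
        self.collect_location = collect_location
        self.collect_ad_id = collect_ad_id

# A data-minimizing integration disables collection the feature doesn't need.
sdk = AnalyticsSDK()
sdk.init(app_key="demo-key", collect_location=False, collect_ad_id=False)
```

The point of the sketch is that the privacy outcome often turns on integration choices the end user never sees.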

But at bottom, if a lot of information is collected, a lot of information is shared with third parties, and the user doesn’t know about it and can’t do anything about it, that starts to create some privacy risks. The security risks are real as well. The security risks that we saw related mostly to the inclusion of sensitive information in URL queries. If a URL includes the name, email address, or location of the user, or an ID that can be tied to the user, that creates the risk that the information finds its way into hands it was never intended for. It’s bad coding practice and something that we recommend people avoid in order to keep user information safe.
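As an illustration of the coding practice Palfrey describes, here is a short Python sketch; the endpoint and field names are hypothetical:

```python
import requests

# Risky: user PII embedded in the URL query string. Query strings end up in
# server logs, proxy logs, browser history, and Referer headers, so this
# data can leak to parties that were never intended to see it.
requests.get(
    "https://lms.example.com/progress",
    params={"email": "student@example.com", "lat": "42.36", "lon": "-71.06"},
)

# Safer: send only the fields the feature needs, in the body of an
# HTTPS POST rather than in the URL.
requests.post(
    "https://lms.example.com/progress",
    json={"lesson_id": "algebra-07", "completed": True},
)
```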

So there was a combination of privacy concerns and security concerns. I do want to say one overall thing, which is that on the whole we were generally pleased that a lot of the learning apps were incorporating privacy-enhancing features into their design. On the whole, we thought what we saw among these apps was encouraging, and we are certainly coming at this with the goal of helping to facilitate this really core set of tools within the learning landscape. We don’t want folks to extrapolate from our findings that we are deeply concerned about this environment. There are some concerns that we identify that should be addressed by developers, by platforms, by regulators, by academics, and by industry groups in order to allow all of us to get the benefit of these tools. Our end goal is to make this a trustworthy environment that everybody can benefit from.

BB: How should the practices of learning practitioners in the enterprise or in government change? Are there trends that you see that readers should be aware of?

QP: Yes — let me call out a few suggestions, and also one trend.

Developers should focus on privacy by design and data minimization when they create these apps. A lot of the concerning behavior we see is the result of coding practices that could be changed fairly easily while still delivering valuable educational content. I think developer education is key to ensuring that less information is collected and shared with third parties.
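One way to express data minimization in code is an explicit allowlist of fields, so nothing extra is stored or shared by default. This is a minimal sketch with invented field names, not a description of IDAC’s methodology:

```python
# Fields this feature actually needs; everything else is dropped.
ALLOWED_FIELDS = {"username", "grade_level"}

def minimize(raw_record: dict) -> dict:
    """Keep only allowlisted fields before the record is stored or shared."""
    return {k: v for k, v in raw_record.items() if k in ALLOWED_FIELDS}

print(minimize({
    "username": "amy",
    "grade_level": 7,
    "location": "42.36,-71.06",   # not needed by the feature: dropped
    "device_id": "a1b2c3d4e5f6",  # not needed by the feature: dropped
}))
# {'username': 'amy', 'grade_level': 7}
```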

The same is true of the security practices. Some of the security risks can be pretty easily remedied by following good coding practices and security protocols. We should try to give developers education, tools, and practices. Some of this, other actors should insist on: consumers should insist on good practices, but platforms, regulators, and vendors should also be looking at developers’ practices and insisting that they adhere to good ones. Coders can use software development kits in particular with more care, and that will have the effect of minimizing some of the data collection and privacy risks. I think that’s an area for attention.

What trends are concerning to us? One that I would mention, and that deserves more attention, is ID bridging. ID bridging can best be understood by knowing that a few years ago the major platforms introduced a privacy feature that allows a user to reset the ID that is associated with them for advertising purposes. Advertisers collect profiles of users over time, and this privacy feature allows you to reset the advertising ID without replacing your device. ID bridging is the practice of sharing both the resettable ID that is associated with the device for advertising purposes and a non-resettable ID at the same time. If the consumer resets the advertising ID, the past information and the forward-looking information can be bridged, thereby bypassing the consumer’s pro-privacy action. ID bridging is prohibited by Google’s developer policies; that prohibition is a way of helping to make this privacy feature work.
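To make the mechanism concrete, here is a hypothetical Python sketch of what a bridged telemetry payload might look like; the field names are illustrative and not drawn from any real SDK:

```python
import uuid

# The resettable advertising ID: the user can replace it in device settings.
ad_id = str(uuid.uuid4())

# A non-resettable, device-bound identifier that survives an ad ID reset.
hardware_id = "a1b2c3d4e5f6"

# ID bridging: sending both identifiers together lets a tracker re-link the
# user's pre-reset and post-reset advertising profiles, defeating the reset.
bridged_payload = {
    "advertising_id": ad_id,
    "device_id": hardware_id,
}
print(bridged_payload)
```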

But we do see the practice of ID bridging occurring fairly frequently, not just in the learning environment but also more broadly. It circumvents some of the steps that privacy professionals and policymakers have been trying to put into practice to make sure that consumers have tools to protect themselves. I would like to see greater curtailment of that practice, and stronger steps put in place to prevent circumvention of these protections.

Want more information on this topic?

Privacy and security are growing concerns! In the comments section, please share your interest in and needs for more information about regulation (international and regional) of these matters. The Learning Guild will use this information in planning future content and research.