My company has a web analytics package which we use for our own and customer marketing campaign tracking. It uses a combination of server logs, JS & image web bugs, cookies, unique cached files, and ETag headers to collect and collate user activity.
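For context, the ETag piece is probably the least familiar part of that mix, so here is a rough sketch of how that kind of visitor identification can work. This is illustrative only, not our product code; the endpoint, ID format, and logging shown here are made up.

```typescript
// Hypothetical illustration of ETag-based visitor identification
// (not the actual product code; names and formats are invented).
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

// 1x1 transparent GIF served as the "web bug" image.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

createServer((req, res) => {
  // A returning browser revalidates its cached pixel and echoes back the
  // ETag it was handed last time in the If-None-Match header.
  const returningId = req.headers["if-none-match"];
  const visitorId = returningId ?? randomUUID();

  // Log the hit; in a real system this would be collated with cookies,
  // campaign codes, and server logs.
  console.log(`${new Date().toISOString()} visitor=${visitorId} url=${req.url}`);

  res.setHeader("ETag", visitorId);
  res.setHeader("Cache-Control", "private, must-revalidate");

  if (returningId) {
    // 304 tells the browser its cached pixel is still valid,
    // so it keeps (and keeps echoing) the same ETag.
    res.writeHead(304);
    res.end();
  } else {
    res.writeHead(200, { "Content-Type": "image/gif" });
    res.end(PIXEL);
  }
}).listen(8080);
```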
Recently we found that a certain (unnamed) privacy-guard application that plugs into the user's browser is munging certain tracking codes, with the apparent intent of preventing the user's activity from being tracked. We purchased a copy of the app and tested it locally; it does the same to many other web bug and analytics applications, including Google Analytics.
For most of these, the way the data is altered would prevent the tracking software from operating properly. However, the app uses a consistent pattern for its alterations, and due to the way our collation works, the changes have no effect on the operation of our tracking and analytics package. (Well, there is one side effect, which reduces the accuracy of some timing calculations from milliseconds to seconds.)
In a nutshell, the situation is:
Our analytics results are unaffected by the application's attempt to subvert the data
The user clearly intends to prevent analysis of their online activity
It is possible for us to alter our application to detect the attempted blocking (see the sketch after this list)
We would have to spend time and money patching and testing our application in order to make the attempted privacy blocking actually successful
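To make the detection point concrete, the change would look something like the sketch below. The campaign-code format and the idea of treating a mangled code as an implied opt-out are purely illustrative; the tool's actual rewriting pattern is not shown here.

```typescript
// Hypothetical sketch of the "detect and honor" option: if the campaign
// code arrives mangled in a way a browser would never produce, treat the
// hit as an opted-out visitor instead of collating it.
const EXPECTED_CODE = /^[A-Z]{2}-\d{6}$/; // assumed campaign-code format

interface Hit {
  visitorId: string;
  campaignCode: string;
  timestampMs: number;
}

function shouldTrack(hit: Hit): boolean {
  // A well-formed code comes from an unmodified browser; anything else is
  // assumed to be the privacy tool rewriting the value, i.e. an implied
  // opt-out, so the hit is dropped before it reaches the collation step.
  return EXPECTED_CODE.test(hit.campaignCode);
}
```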
So there is an ethical quandary as to how much effort we should put into detecting and honoring the user's wishes. Some of the issues involved are:
Isn't it the responsibility of the privacy app to perform as expected? There are ways they could alter the data that would actually prevent our analytics from tracking their users.
Is it our responsibility to enhance our application to detect the user's intent? This would both incur development cost and eliminate valuable data (roughly 2% of our traffic uses this app).
What do you think our ethical responsibility should be?
We should ignore it and have our application work as-is
We should take the expense, lose the data, and honor the users' implied desire
We should contact the developers of the app and tell them a better way to stop our system from working
We should publicize that their software does not perform as expected
Other...?
To clarify, the privacy tool simply doesn't work. Our application, without alteration, still tracks users who use it. We would have to change our app in order to not track these users.
We do have a cookie-based opt-out which the user can select from the tracker's home page.
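(For the curious, the opt-out check itself is trivial. Here is a sketch with an assumed cookie name, not our actual implementation:)

```typescript
// Hypothetical sketch of the cookie-based opt-out check.
import type { IncomingMessage } from "node:http";

const OPT_OUT_COOKIE = "trk_optout"; // assumed name, set from the opt-out page

// Returns true if the request carries the opt-out cookie, in which case
// the tracking endpoint should skip logging entirely.
function hasOptedOut(req: IncomingMessage): boolean {
  const cookies = req.headers.cookie ?? "";
  // The Cookie header is a "; "-separated list of name=value pairs.
  return cookies
    .split(";")
    .some((pair) => pair.trim().split("=")[0] === OPT_OUT_COOKIE);
}
```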
We sent a note to the company that developed the privacy application, and they said they would look into it.