December 9, 2016 by Jacki Liddle, Adrian Carter, Christina Atay, David Ireland and Mark Burdon, The Conversation
Developers need to be aware of legal and ethical issues when creating healthcare apps for smartphones. Credit: Shutterstock/thodonal
From large companies to tiny startups, many people are
working on creating apps to monitor and improve our health. The technical skill
needed is widely recognised and developers are becoming more aware of the need
to involve consumers and health professionals in the design.
But app developers may need to consider even broader issues. Our experience developing an app to monitor the symptoms of people living with Parkinson's disease showed us the ethical and legal issues that can arise in app development.
Parkinson's disease is most commonly known for its movement symptoms (tremor, stiffness, changes to walking patterns). The condition can also affect a person's voice and communication. These symptoms can change from hour to hour and day to day, and may shift in response to treatments.
Such complex changes can make it hard for health teams
to understand people's symptom patterns and how to enhance treatment.
Technology is a promising option to collect information about the symptoms,
hopefully leading to better treatments.
Understanding a person's voice quality and the
thinking skills used in communication is critical in Parkinson's research. But
spontaneous, natural speech of people with Parkinson's disease can show
symptoms that are not found under
laboratory conditions.
So rather than taking an occasional voice sample at a
clinic, an ongoing record of actual communication during daily life is the best
option.
An app for that
As part of a larger smartphone system, we developed an app to detect speech, record a few minutes of it, and then ask for permission from the person (and anyone else in the conversation) to upload the recording to a secure server.
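To make the flow concrete, here is a minimal sketch of such a consent-gated pipeline in Python. Every name in it – the functions, the clip length, the server address – is a hypothetical placeholder for illustration, not our actual implementation.

```python
# Minimal sketch of a consent-gated record-and-upload loop.
# All names here (speech_detected, record_clip, request_consent,
# upload_clip, SERVER_URL) are hypothetical placeholders.
import time

CLIP_SECONDS = 120  # "a few minutes of speech"
SERVER_URL = "https://example.org/upload"  # placeholder secure endpoint


def speech_detected() -> bool:
    # A real app would run a voice-activity detector on the mic stream.
    return False  # stub


def record_clip(seconds: int) -> bytes:
    # A real app would capture raw audio from the microphone here.
    return b""  # stub


def request_consent(clip: bytes) -> bool:
    # Ask the user (and, through them, other parties to the conversation)
    # whether this clip may be uploaded; stubbed as a console prompt.
    answer = input("Upload this recording to the research server? [y/N] ")
    return answer.strip().lower() == "y"


def upload_clip(clip: bytes) -> None:
    # A real app would POST the audio over TLS to the secure server.
    print(f"Would upload {len(clip)} bytes to {SERVER_URL}")


def monitoring_loop() -> None:
    while True:
        if speech_detected():
            clip = record_clip(CLIP_SECONDS)
            if request_consent(clip):
                upload_clip(clip)
            else:
                del clip  # discard immediately if consent is refused
        time.sleep(1.0)
```

The key design point is that nothing leaves the phone until consent is given; a refused clip is discarded on the device.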
We can then analyse the acoustic (sound) and linguistic (language and thinking skill) properties of these conversations. This would give an idea of any speech problems or changes, and when they occurred.
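As an illustration of the acoustic side, the sketch below extracts a pitch track from a recording and derives a simple jitter-like measure of voice instability, the kind of feature often examined in Parkinson's voice research. It uses the open-source librosa library purely as an example; the article does not describe our actual toolchain.

```python
# Illustrative acoustic analysis using librosa (an assumption, not the
# authors' actual toolchain): pitch statistics and a jitter-like measure.
import numpy as np
import librosa


def voice_features(path: str) -> dict:
    y, sr = librosa.load(path, sr=None)  # load the recording as-is
    # Fundamental frequency (pitch) track over a typical speaking range.
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
    f0 = f0[np.isfinite(f0)]
    periods = 1.0 / f0  # pitch periods in seconds
    return {
        "mean_f0_hz": float(np.mean(f0)),
        "f0_std_hz": float(np.std(f0)),  # low variability can signal monotone speech
        # Cycle-to-cycle period instability, a rough jitter analogue.
        "jitter": float(np.mean(np.abs(np.diff(periods))) / np.mean(periods)),
    }
```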
Consumers and health professionals were involved in the development of this app. But it was not until review by an ethics board that the potential for legal issues was raised.
The committee questioned the legality of accidentally recording unrelated people in the background behind the person of interest. We had not considered this possibility.
There are no specific guidelines for groups conducting this kind of research. It is also very common for people to record video and audio in public areas and share this online. So we explored the law around recording voices; the details are published in the latest edition of the Journal of Law and Medicine.
We found that potential liabilities could arise from Australia's various listening and surveillance device laws, because the app could record private conversations without the consent of third parties.
For example, if the app is activated in a public
setting, say a café or a restaurant, then the recording could potentially pick
up background conversations.
How do the laws apply then to the use of a smartphone
app for health purposes that accidentally records conversations? The answer is
currently unclear.
The age of these laws and their fragmented application in Australia complicates legal analysis. Each state and territory has similar laws that apply in subtly different ways. These laws were also designed to combat a suite of different privacy-related harms that may not be directly relevant to medical self-tracking.
What's a private conversation?
Some of the foundational legal issues are therefore challenging. For instance, whether an accidental recording legally counts as a private conversation is surprisingly complex.
In our system, the recorded speech is first converted to text by software. The converted text is then subjected to further language processing. But it is unclear whether those words are "heard" or even "listened to", as understood within the context of existing legal definitions of a private conversation.
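To illustrate what that "further language processing" can look like, the sketch below computes a type-token ratio – a rough proxy for lexical diversity – over converted text. The measure is an example only, not necessarily what our system computes.

```python
# Illustrative language processing on converted text: type-token ratio,
# a rough proxy for lexical diversity (an example measure, not necessarily
# what the authors' system computes).
import re


def type_token_ratio(transcript: str) -> float:
    """Distinct words divided by total words; lower values can indicate
    reduced lexical diversity."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return len(set(words)) / len(words) if words else 0.0


print(type_token_ratio("the quick brown fox jumps over the lazy dog"))  # ~0.89
```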
The difference between human and machine agency is an important consideration here, and it is uncertain how listening and surveillance device laws are meant to apply when it is a non-human undertaking the "listening".
Protecting people's privacy
While the law is unclear regarding this use of a
smartphone, it is possible to identify ways of recording conversations that
maximise people's privacy.
For example, once trained on the speaker of interest,
the app could identify this speaker and discard audio from all other speakers
prior to sending the recording.
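A sketch of this enrol-then-filter idea is below. The embed function stands in for a pretrained speaker-verification model, and the similarity threshold is illustrative; both are assumptions rather than our actual implementation.

```python
# Enrol-then-filter sketch: keep only audio segments whose voice-print
# matches the enrolled speaker, discarding bystanders before upload.
import numpy as np


def embed(segment: np.ndarray) -> np.ndarray:
    # Placeholder for a pretrained speaker-embedding model that maps
    # audio to a fixed-size voice-print vector.
    return np.zeros(128)


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def filter_to_target(segments, target_print, threshold=0.75):
    """Return only the segments likely spoken by the enrolled person."""
    return [s for s in segments if cosine(embed(s), target_print) >= threshold]
```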
Limiting audio recording by the app to user-set "listen" locations – say, at home – and "not listen" locations – a café or supermarket and so on – also reduces accidental recording of others.
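One simple way to implement this is a geofence check before any recording starts. The sketch below is illustrative; the coordinates and radius are placeholders that a user would set.

```python
# Geofence sketch: only allow recording inside user-defined "listen" zones.
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


LISTEN_ZONES = [
    # (latitude, longitude, radius in metres), set by the user
    (-27.4698, 153.0251, 50.0),  # e.g. "home" (placeholder coordinates)
]


def recording_allowed(lat: float, lon: float) -> bool:
    return any(haversine_m(lat, lon, zlat, zlon) <= radius
               for zlat, zlon, radius in LISTEN_ZONES)
```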
Having the user set these locations also makes them aware of the app's potential to record background conversations, so they can use it accordingly.
We have not yet used the app with people with
Parkinson's disease and are working towards a solution that is clinically
meaningful and considers ethical and legal concerns.
Laws in need of an overhaul
Recording private conversations and sending them to third parties will become increasingly common. But our research indicates that non-consensual recording of private conversations could be unlawful in various jurisdictions.
It is likely that many other app developers, and indeed users, are not aware of this.
Law reform needs to be considered in order to support
the growth of this important area of health research and to fully appreciate
the balance between social benefits and the privacy-related harms that could
arise. Where that balance lies is currently uncertain.
Ultimately, we believe successful health app design
requires a multidisciplinary team that is able to consider the technical,
ethical and legal issues together.
http://phys.org/news/2016-12-ethical-legal-issues-health-app.html