It has emerged that passengers have been scanned as they passed through station ticket gates. A complaint has been made about alleged breaches of data protection law.
The use of AI was revealed when civil liberties group Big Brother Watch made a Freedom of Information request.
It appeared that Network Rail was assessing the moods of passengers, but the company said that was not so.

A spokesman told the Press Association that emotions were not being analysed. Instead, the spokesman said, the purpose of the scans was to ‘protect passengers’, and the data could be used to ‘measure satisfaction’ and ‘maximise advertising and retail revenue’.
The system sent images for analysis by Amazon Rekognition software to record demographic details, such as a passenger’s sex and apparent age, but that part of the programme has ended.
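For readers unfamiliar with the software named above: Amazon Rekognition exposes a face-analysis API that returns estimated attributes, including gender and an age range, for each face it detects in an image. The minimal sketch below, assuming Python with the boto3 AWS SDK, a hypothetical gate-camera frame ‘gate_frame.jpg’ and a placeholder AWS region, illustrates the kind of call and output involved; it is an illustration of the API, not Network Rail’s actual pipeline.

    # Minimal sketch of an Amazon Rekognition face-analysis call.
    # Assumptions: Python with the boto3 SDK, AWS credentials configured,
    # a local image file, and the London region as a placeholder.
    import boto3

    rekognition = boto3.client("rekognition", region_name="eu-west-2")

    # Hypothetical frame captured at a ticket gate, read from disk.
    with open("gate_frame.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests the full set of face attributes,
    # including estimated gender and an age range.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]            # e.g. {"Low": 29, "High": 37}
        gender = face["Gender"]["Value"]  # "Male" or "Female"
        print(f"apparent age {age['Low']}-{age['High']}, sex recorded as {gender}")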
The trials were said to be part of a wider programme to use AI to tackle such problems as trespassing, overcrowding, bicycle theft and even slippery floors.
Trials are believed to have started in 2022 at Glasgow Central, Leeds, London Euston, London Waterloo, Manchester Piccadilly and Reading, which are all stations managed by Network Rail.

Similar trials have apparently been carried out at Dawlish, Dawlish Warren and Marsden stations, which are managed by Great Western Railway and Northern.
Big Brother Watch head of research and investigations Jake Hurfurt said: ‘Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain’s biggest stations, and I have submitted a complaint to the Information Commissioner about this trial.

‘It is alarming that as a public body it decided to roll out a large-scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers.

‘Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used.

‘AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail’s disregard of those concerns shows a contempt for our rights.’
Network Rail responded: ‘We take the security of the rail network extremely seriously and use a range of advanced technologies across our stations to protect passengers, our colleagues and the railway infrastructure from crime and other threats.

‘When we deploy technology, we work with the police and security services to ensure that we’re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.’