What does digital eye contact look like?

A conversation with Liya Safina about trust, AI, and designing for human connection

What happens to trust when the machines take over the glances?

On a Friday afternoon in Melbourne, designers, technologists, and innovation leaders gathered at Portable to consider that very human question.

We were joined in person by Liya Safina from Google, a world-leading creative strategist and foresight designer whose work spans three continents and some of the most influential institutions in design and tech. With a portfolio that includes Apple, IKEA, the United Nations, and the Qatar Foundation, Liya is known for shaping futures thinking into tangible, emotionally resonant experiences. She blends cultural critique with systems design, helping organisations explore “what’s next” through the lens of care, equity, and possibility.

Across the hour, the conversation circled around one quiet but urgent idea:

If technology mediates more and more of our interactions, how do we teach it to recognise, to acknowledge, to care?

A glance across the street and across the interface

At one point during the event, Andrew Apostola (Portable's CEO and the night's emcee) traded thoughts with Liya on a simple metaphor that reframed the entire room's thinking:

“In Amsterdam, bike riders usually make eye contact with car drivers before crossing; it’s a silent agreement of safety. So what happens when the self-driving cars take over? What’s the digital version of that glance, that human acknowledgment?”

It’s a deceptively small question ("what is digital eye contact?"), but one that opens up vast terrain for those of us working at the edges of technology, ethics, and public infrastructure.

Designing trust into systems requires more than empathy. It requires a deep understanding of behavioural cues, socio-technical systems, and the design patterns that either preserve or erode mutual recognition.

Trust, design, and the invisible handshake

At Portable, we often say that AI success isn’t just about technology; it’s about governance, design, and trust.

This event was a chance to sit with that idea. It asked us to consider not just how interfaces function, but how they feel, and how we might design systems that communicate “I see you” in moments when no one is watching.

Because whether we’re talking about smart intersections, online service portals, or digital health records, the same design challenge applies:

  • How do we make systems that acknowledge us?
  • How do we make systems that know how to look back?

Liya Safina’s provocations gave us space to reflect, to wonder aloud, and to momentarily zoom out.

It was a rare kind of gathering, small enough to be real, big enough to be ambitious. The kind of conversation that doesn’t just inform the work ahead, but realigns it.

These conversations are crucial and need to continue

This event was actually our third with Liya, a series that kicked off with an SXSW panel featuring provocations such as:

  • Avoiding the Black Mirror Future | Films like Never Let Me Go and series like Black Mirror show us worlds where technology has been taken too far, causing irreparable damage to our behaviours, relationships, and society. What needs to happen to ensure we don't end up there? What can you personally do in your work to help prevent that outcome?
  • Responsibility and Protection | Whose job is it to protect society from the harmful effects of technology? Road safety depends on government regulation (licensing, traffic laws), manufacturer safety standards, and individual responsibility (seatbelts, attention, car seats). By analogy, where do you see the balance between government regulation, corporate responsibility, and individual education and action?

Make sure you're on our newsletter list (sign up below) to get invites to events like these, or reach out to us to chat now.
