Hello Future Self


Hello Future Self explores our increasingly augmented world by allowing participants to create an avatar that unites human and robotic senses. The experience seamlessly blends physical and digital interactions, resulting in a collaborative digital animation. Participants can prepare for human technology integration by deciding how to alter their avatar’s body. Which sense would you prioritise?

Inviting children to see their creations come to life, Hello Future Self is a participatory drawing project that transforms hand-drawn artworks into digital animations. Using a checkbox system embedded in each template – a nod to punch-card programming systems of the past – the work allows participants to customise their avatar’s features after its digital birth. With simple, accessible scanning technology, young participants are encouraged to engage without assistance.

Playfully interactive and designed especially for children, Hello Future Self is nonetheless powered by complex 3D worlds and computer vision techniques.

Hello Future Self is an Experimenta and Australian Network for Art and Technology (ANAT) Commission.

Photos by Josef Ruckli for State Library of Queensland

Concept: Steve Berrick & Experimenta
Design, Code & Electronics: Steve Berrick

    • The Lock-Up, Newcastle, 2018
    • Plimsoll Gallery, School of Creative Arts, UTAS, Hobart, 2018
    • Tweed Regional Gallery and Margaret Olley Art Centre, Murwillumbah, NSW, 2018
    • Rockhampton Art Gallery, Rockhampton, 2018
    • Latrobe Regional Gallery, Morwell, 2019
    • USC Art Gallery, Sunshine Coast, 2019
    • New England Regional Art Museum, Armidale, 2020
    • Benalla Art Gallery, Benalla, 2021
    • Albury LibraryMuseum, Albury, 2021

The fifth instalment in an interactive series, Hello Future Self builds on the mechanic of Somewhere Our City and The Automobile Lab, giving the audience a choice via a checkbox system on the design template.

The template scanning and alignment are handled by an openFrameworks application, using openCV and a custom shader. The scanned template is saved as an image file, which is served to the real-time 3D world, built in Unity3D.
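The checkbox-reading step can be illustrated with a simple thresholding idea: once the scanned template is aligned, each checkbox sits at a known pixel region, and a box counts as "checked" when enough of its pixels are dark. The actual work uses openFrameworks and openCV; the plain-Python sketch below (with invented names and threshold values) only demonstrates the principle.

```python
# Hypothetical sketch of checkbox detection on an aligned scan.
# Thresholds and function names are assumptions, not the project's code.

DARK_THRESHOLD = 128   # grayscale values below this count as ink
CHECKED_RATIO = 0.2    # fraction of dark pixels that marks a box as checked

def is_checked(image, x, y, w, h):
    """Return True if the w*h region at (x, y) contains enough dark pixels."""
    dark = sum(
        1
        for row in image[y:y + h]
        for px in row[x:x + w]
        if px < DARK_THRESHOLD
    )
    return dark / (w * h) >= CHECKED_RATIO

# A tiny synthetic 8x8 "scan": 255 = white paper, 0 = ink.
scan = [[255] * 8 for _ in range(8)]
for r in range(1, 4):          # pencil mark inside the top-left box
    for c in range(1, 4):
        scan[r][c] = 0

print(is_checked(scan, 0, 0, 4, 4))   # top-left box: marked
print(is_checked(scan, 4, 4, 4, 4))   # bottom-right box: untouched
```

In practice openCV would also handle the alignment (finding the template's corners before sampling regions), which is what makes the fixed checkbox coordinates reliable.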

The scanning station is fitted with a custom MIDI controller: a button that lights up when a template is correctly inserted and, when pressed, inserts the wrapped 3D model into the animation in real time.
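The button's behaviour amounts to a small state machine: the light mirrors whether a valid template is present, and a press only takes effect while the light is on. The class and method names below are invented for illustration; the real station communicates over MIDI with the openFrameworks application.

```python
# Hypothetical sketch of the scanning-station button logic.
# All names here are assumptions made for this illustration.

class ScanStation:
    def __init__(self):
        self.led_on = False    # button light mirrors template state
        self.inserted = []     # avatars handed off to the 3D world

    def on_template_detected(self, present):
        # Light the button only while a template is correctly inserted.
        self.led_on = present

    def on_button_press(self, avatar):
        # Ignore presses unless the light confirms a valid template.
        if self.led_on:
            self.inserted.append(avatar)
            self.led_on = False    # reset, ready for the next template
            return True
        return False

station = ScanStation()
print(station.on_button_press("avatar-1"))   # no template yet: ignored
station.on_template_detected(True)           # template inserted, LED on
print(station.on_button_press("avatar-1"))   # press sends the avatar
```

Gating the press on the lit state keeps stray button mashing from injecting half-scanned templates into the animation.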