
Astrofin+ – Tri-Arm Visual Tracking System for a Golden Fish Motion

Date

Nov 2024 - Jan 2025


Location

London


Project type

Electronic music art and robotic engineering


Role

Designer & Assembler of the Robotic Arm System

Developer of the Visual Tracking System Software

This installation visualizes a fish navigating the cosmos, symbolizing exploration through motion and sound. On the surface, it presents a poetic journey; beneath, it challenges the role of human performers in electronic and audiovisual art. Can a non-human ensemble—fish, AI, and code—match or surpass human expressivity?


The work also explores the evolving relationship between carbon-based life and digital existence. Through particle visuals and interactive systems, it reflects on how digital technologies reshape identity, consciousness, and performance. It invites reflection on self-expression, human-machine interaction, and our place in a symbiotic digital future.

I built this system from the ground up, integrating a robust visual tracking module with a six-axis robotic arm. Using an Arduino-based control board paired with the OpenMV4 H7 Plus vision module, I developed real-time image processing capabilities to enable dynamic target tracking and object manipulation.


Key Technical Highlights:


  • Vision Module Integration:
    I leveraged the OpenMV4 H7 Plus, equipped with an OV5640 sensor, to capture and process images at QVGA resolution (320×240) at 25–50 FPS. Using MicroPython, I implemented color segmentation and thresholding algorithms—calibrated via the OpenMV threshold editor—to reliably track colored objects and faces.
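Conceptually, the per-pixel test behind this segmentation is a LAB range check against a 6-tuple threshold in the same (L_lo, L_hi, A_lo, A_hi, B_lo, B_hi) layout the OpenMV threshold editor exports. A minimal host-side sketch (the threshold values here are illustrative, not the calibrated ones used in the installation):

```python
# Illustrative LAB threshold for a warm gold hue, in OpenMV's
# (L_lo, L_hi, A_lo, A_hi, B_lo, B_hi) tuple layout. Assumed values.
GOLD_THRESHOLD = (30, 80, 10, 60, 20, 70)

def in_threshold(lab_pixel, threshold):
    """Return True if an (L, A, B) pixel falls inside a 6-tuple threshold."""
    l, a, b = lab_pixel
    l_lo, l_hi, a_lo, a_hi, b_lo, b_hi = threshold
    return l_lo <= l <= l_hi and a_lo <= a <= a_hi and b_lo <= b <= b_hi

def segment(pixels, threshold):
    """Binary mask over a flat list of LAB pixels: which match the color."""
    return [in_threshold(p, threshold) for p in pixels]
```

On the module itself, this per-pixel test is handled internally by the blob-finding routine; the sketch only shows what the calibrated tuple means.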


  • Communication & Data Processing:
    The OpenMV module communicates with the Arduino control board over UART (using designated pins P4/P5), transmitting real-time data on target positions and features. I designed a custom protocol to ensure that these data packets are parsed correctly and used as inputs for the arm's control logic.
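The protocol's exact wire format isn't specified above; one plausible minimal framing scheme, a start byte, x and y as 16-bit little-endian fields, and an XOR checksum, can be sketched as follows. The header value, field sizes, and checksum choice here are all assumptions for illustration:

```python
import struct

HEADER = 0xAA  # assumed start-of-frame marker

def pack_target(x, y):
    """Frame a target position as: header, x (u16 LE), y (u16 LE), checksum."""
    body = struct.pack("<HH", x, y)
    checksum = 0
    for byte in body:
        checksum ^= byte  # simple XOR over the payload bytes
    return bytes([HEADER]) + body + bytes([checksum])

def parse_target(frame):
    """Validate and decode a frame; return (x, y) or None on a bad packet."""
    if len(frame) != 6 or frame[0] != HEADER:
        return None
    body, checksum = frame[1:5], frame[5]
    calc = 0
    for byte in body:
        calc ^= byte
    if calc != checksum:
        return None  # corrupted in transit
    return struct.unpack("<HH", body)
```

A header-plus-checksum design like this lets the Arduino side resynchronize after dropped bytes, which matters on a shared-clock-free UART link.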


  • Inverse Kinematics & Motion Control:
    Once the vision module detects a target, I apply an inverse kinematics algorithm to map the target's 2D image coordinates to joint angles for the arm. These angles drive six PWM-controlled servos, achieving smooth, coordinated movement for tasks like sorting or grabbing.
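The full six-joint solver depends on the arm's specific link geometry, which isn't detailed above. As an illustration of the core step (turning a target position into joint angles), here is the standard two-link planar case; the link lengths and the elbow-down convention are assumptions for the sketch:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Solve elbow-down IK for a planar 2-link arm reaching point (x, y).

    Returns (shoulder, elbow) joint angles in radians, or None if the
    target lies outside the arm's reachable annulus.
    """
    d2 = x * x + y * y
    # Law of cosines gives the interior elbow angle
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        return None  # unreachable target
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the offset the elbow induces
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A six-axis arm extends the same idea with more joints (typically solved numerically or with a closed-form chain decomposition), and the resulting angles are then converted to PWM pulse widths for the servos.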


  • System Optimization:
    To handle ambient lighting variations, I implemented dynamic threshold adjustments in the OpenMV code. Additionally, I fine-tuned the system’s error compensation for misalignment between the camera’s coordinate frame and the arm’s physical workspace, ensuring stable tracking even under challenging conditions.
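One way to express the dynamic adjustment is to shift the calibrated L (lightness) bounds by a smoothed estimate of scene-brightness drift, so slow ambient changes don't break the calibration while frame-to-frame flicker is ignored. The smoothing factor and the offset logic below are assumptions, not the exact code running on the module:

```python
def adapt_l_bounds(base_lo, base_hi, frame_brightness, state, alpha=0.1):
    """Shift LAB L-channel bounds to follow slow ambient-lighting drift.

    state: dict carrying the running brightness baseline across frames.
    Returns (lo, hi) bounds offset by the smoothed drift from baseline.
    """
    if "baseline" not in state:
        state["baseline"] = frame_brightness  # first frame sets the reference
    # Exponential moving average: tracks slow lighting changes, not flicker
    prev = state.get("ema", frame_brightness)
    state["ema"] = alpha * frame_brightness + (1 - alpha) * prev
    drift = state["ema"] - state["baseline"]
    return base_lo + drift, base_hi + drift
```

In use, the mean brightness of each captured frame would feed this function, and the shifted bounds would replace the static threshold in the tracking loop.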


This integrated approach has allowed me to create a versatile, intelligent robotic arm capable of autonomously tracking and manipulating objects based on real-time visual feedback.


The electronic music and visual component was developed by Bowen, a collaborating doctoral student at the University of Glasgow.


The engineering and artistic work was submitted to NIME 2025 and Prix Ars Electronica.

© 2025 by Vercent. 
