Yahboom DOGZILLA-Lite: The First AI LLM Embodied Intelligence Robot Dog for Raspberry Pi Education and Autonomous Development

DOGZILLA-Lite is presented as the world's first educational robot dog to integrate multimodal large language models (LLMs) with embodied intelligence. Built around a Raspberry Pi module, the platform supports multiple AI vision functions, including face detection and object recognition. DOGZILLA-Lite is more than a walking robot: it is pitched as a true AI partner, able to understand images, voices, and environmental cues and to make complex autonomous decisions.
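As a taste of what these vision functions look like in code, here is a minimal face-detection sketch using OpenCV's bundled Haar cascade (OpenCV is part of Yahboom's published materials; the camera index below is an assumption, since the robot's pre-installed GUI programs manage the camera themselves):

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
# The camera index (0) is an assumption; DOGZILLA's pre-installed GUI
# programs may access the camera through their own wrapper.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```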

The platform supports robot-arm expansion: a 3-DOF robotic arm can be mounted for autonomous object grasping and handling. The robot comes pre-programmed with a graphical user interface (GUI) system with built-in AI vision and voice programs. These programs enable functions such as 3D object recognition, color identification, face and emotion recognition, and motion detection, opening up a wide range of creative and educational projects. Note that the robotic arm is designed to grasp standard EVA cubes and balls.
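The exact Python API for driving the arm ships with Yahboom's courseware. The sketch below is purely illustrative: it assumes an xgolib-style interface in which arm(x, z) positions the gripper in the vertical plane and claw(pos) opens or closes it. The library name, method names, units, and serial port are all assumptions to be checked against the official documentation.

```python
# Hypothetical grasping sketch assuming an xgolib-style interface.
# Library name, method names, units, and serial port are assumptions;
# consult Yahboom's DOGZILLA documentation for the actual API.
import time

from xgolib import XGO  # assumed library and class name

dog = XGO(port="/dev/ttyAMA0")  # assumed serial port on the Raspberry Pi

dog.claw(0)        # open the gripper fully (assumed 0..255 range)
dog.arm(80, -40)   # reach forward and down toward an EVA cube (assumed mm)
time.sleep(1.0)

dog.claw(200)      # close the gripper around the cube
time.sleep(0.5)

dog.arm(80, 60)    # lift the cube clear of the ground
```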

Users benefit from multiple control methods and real-time visual feedback. DOGZILLA-Lite can be controlled via the XGO APP, available for both Android and iOS, as well as via PC software. The robot dog transmits real-time video directly to the application, giving the user an immersive first-person-view control experience.
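The app's video transport is Yahboom's own, but the underlying technique, streaming camera frames over the network, can be sketched on the Raspberry Pi with OpenCV and Flask. The route and port below are arbitrary choices for illustration, not the XGO APP's actual endpoint:

```python
# Minimal MJPEG streaming sketch with Flask and OpenCV. This shows the
# first-person-view technique in principle only; the XGO APP uses its
# own transport, and the route/port here are arbitrary.
import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)  # assumed camera index

def frames():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
               + jpg.tobytes() + b"\r\n")

@app.route("/video")
def video():
    return Response(frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)  # view at http://<pi-ip>:8000/video
```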

The robot features advanced gait planning with free parameter adjustment. DOGZILLA-Lite integrates inverse-kinematics algorithms to accurately control the ground contact time, lift time, and lift height of each leg, and users can adjust these parameters to achieve different complex gaits. A detailed inverse-kinematics analysis and the source code for these functions are provided for deeper learning.
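To give a flavor of that material, here is a standard planar two-link (hip pitch plus knee) inverse-kinematics solution via the law of cosines, paired with a simple foot trajectory parameterized by exactly those quantities: contact time, lift time, and lift height. The link lengths, standing height, and stride are placeholder values, not DOGZILLA-Lite's actual dimensions:

```python
# Two-link planar leg IK plus a parameterized foot trajectory.
# Link lengths, standing height, and stride are placeholders, not
# DOGZILLA-Lite's actual dimensions.
import math

def leg_ik(x, z, l1=0.06, l2=0.06):
    """Return (hip, knee) angles in radians for a foot target (x, z)
    in the hip frame: x forward, z downward, lengths in meters."""
    c2 = (x * x + z * z - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("foot target out of reach")
    knee = math.acos(c2)  # knee-forward solution; negate for knee-back
    hip = math.atan2(z, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

def foot_target(t, contact_time=0.3, lift_time=0.2, lift_height=0.03,
                stride=0.04, stand_height=0.10):
    """Foot target over one gait cycle: a stance phase (foot on the
    ground, sweeping backward) followed by a half-sine swing phase."""
    t %= contact_time + lift_time
    if t < contact_time:                        # stance: foot on ground
        x = stride / 2 - stride * (t / contact_time)
        z = stand_height
    else:                                       # swing: lift and advance
        s = (t - contact_time) / lift_time
        x = -stride / 2 + stride * s
        z = stand_height - lift_height * math.sin(math.pi * s)
    return x, z

# Example: joint angles 0.25 s into the cycle (late stance phase).
print(leg_ik(*foot_target(0.25)))
```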

DOGZILLA-Lite is positioned not just as a toy, but as a ticket to the future of technology: students can use it to understand core AI principles, developers and geeks can create and test autonomous driving algorithms on it, and families can enjoy it as an interactive technology companion. Yahboom provides extensive technical support, including open-source code for AI visual interaction, OpenCV, and AI LLM development.
