Object transfer by throwing is a classic dynamic manipulation task that requires precise control and perception.
However, analytical methods struggle to produce accurate dynamic models for unstructured environments.
In this study, we present DartBot, a robot that integrates tactile exploration and reinforcement learning to achieve robust throwing
skills for nonrigid objects whose moment of inertia causes them to spin in the air. Unlike traditional
sim-to-real transfer methods, our approach trains the agent directly
on real robot hardware equipped with a high-resolution tactile sensor, enabling reinforcement learning in a realistic and dynamic environment.
By leveraging tactile perception, we incorporate pseudo-embeddings of the objects' physical properties into the learning process
through tilting actions at two distinct angles. This tactile information enables the agent to infer object properties and adapt its throwing strategy,
improving accuracy across diverse objects and distant target locations. Furthermore, we demonstrate that
grasp quality significantly impacts the success rate of the throwing task. We evaluate the effectiveness of our method through
extensive experiments, demonstrating superior performance and generalization in real-world throwing scenarios,
achieving a 95% success rate on unseen objects with a mean error of 3.15 cm from the goal.
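As a minimal illustration of the tactile pseudo-embedding idea described above, the Python sketch below is our own simplification, not DartBot's implementation: it pools statistics from tactile frames captured at two tilt angles and concatenates them with the target location to form a policy observation. The tilt angles, feature choices, and function names are all assumptions made for illustration.

    import numpy as np

    TILT_ANGLES_DEG = (15.0, 30.0)  # two distinct tilt angles (illustrative values)

    def tactile_pseudo_embedding(tactile_frames):
        """Compress high-resolution tactile frames (one per tilting action)
        into a fixed-size pseudo-embedding of the object's physical properties.
        A learned encoder would be used in practice; pooled statistics stand in here."""
        feats = []
        for frame in tactile_frames:
            feats.extend([frame.mean(), frame.std(), frame.max()])
        return np.asarray(feats, dtype=np.float32)  # shape: (3 * num_tilts,)

    def build_observation(embedding, target_xy):
        # The throwing policy conditions on tactile features plus the goal,
        # so release parameters can adapt to the object in hand.
        return np.concatenate([embedding, np.asarray(target_xy, dtype=np.float32)])

    # Toy usage: two synthetic 64x64 tactile frames, one per tilt angle.
    frames = [np.random.rand(64, 64) for _ in TILT_ANGLES_DEG]
    obs = build_observation(tactile_pseudo_embedding(frames), [1.2, 0.4])
    print(obs.shape)  # (8,): 6 tactile features + 2-D target position

In a real system a learned encoder would replace the pooled statistics; the point of the sketch is that the observation couples object-specific tactile features with the goal, letting the policy condition its throwing strategy on the object in hand.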