🌴[CVPR 2024] OakInk2: A Dataset of Bimanual Hands-Object Manipulation in Complex Task Completion
Code for the paper: "Active Vision Might Be All You Need: Exploring Active Vision in Bimanual Robotic Manipulation"
VoxAct-B: Voxel-Based Acting and Stabilizing Policy for Bimanual Manipulation (CoRL 2024)
[CVPR 2025] Tra-MoE: Learning Trajectory Prediction Model from Multiple Domains for Adaptive Policy Conditioning
Official implementation for the LABOR (LAnguage-model-based Bimanual ORchestration) Agent.
[CVPR 2024] OakInk2 baseline model: Task-aware Motion Fulfillment (TaMF) via Diffusion
Official implementation of LLM+MAP: Bimanual Robot Task Planning using Large Language Models (LLMs) and the Planning Domain Definition Language (PDDL). Code and files are coming soon.
[AAAI 2026 Oral] Official implementation of "Learning Diverse Bimanual Dexterous Manipulation Skills from Human Demonstrations" (https://github.com/zhoubohan0/BiDex2). Arctic support added in v2.0.
A replication of Stanford's ALOHA paper in simulation, integrating OpenVLA-OFT, an improved vision-language-action (VLA) model.
This repository hosts the implementation of a Dual-Arm Manipulation System developed in collaboration between Addverb Technologies and IIT Gandhinagar Robotics Lab.