ROSE is open source software that gives robots the intelligence to navigate, communicate, and operate in real public spaces. Built for the conditions that only exist outside a lab.
Industrial robots are built for precision in controlled environments. Social robots are built for interaction. Neither is built for what happens when you deploy one in a real public space with real people.
The problem is not the hardware. It is the software layer between the robot and the world. That is what ROSE is built to solve.
When a visitor asks "where are the dinosaurs?", a standard robot can transcribe the words but has no idea what room that is, why the visitor is asking, or how to respond in context. It can hear. It cannot interpret.
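To make the gap concrete, here is a minimal sketch of what "interpreting" rather than merely "hearing" looks like: grounding a transcript in the robot's knowledge of the building. Everything in it (`KNOWN_EXHIBITS`, `interpret`) is an illustrative assumption, not the actual ROSE NLP interface.

```python
# Hypothetical sketch only -- not the ROSE NLP API.
# The exhibit map and function names are invented for illustration.

KNOWN_EXHIBITS = {
    "dinosaurs": {"room": "Hall B", "topic": "palaeontology"},
    "planets": {"room": "Hall A", "topic": "astronomy"},
}

def interpret(transcript: str) -> dict:
    """Ground a transcribed question in the building map: which exhibit,
    which room, and what a context-aware reply would be."""
    text = transcript.lower()
    for keyword, exhibit in KNOWN_EXHIBITS.items():
        if keyword in text:
            return {
                "intent": "find_exhibit",
                "room": exhibit["room"],
                "reply": f"The {keyword} are in {exhibit['room']} -- follow me!",
            }
    # No grounding found: ask for a rephrase instead of guessing.
    return {"intent": "unknown", "room": None, "reply": "Could you say that another way?"}

print(interpret("Where are the dinosaurs?")["room"])  # Hall B
```

A transcription-only system stops at the text; the interpretation step is what turns the same words into a destination and a reply.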
Once a crowd reaches 0.6 people per square metre, standard path planning algorithms freeze completely. In a school corridor at lunch break, or a museum on a weekend, that threshold is crossed in seconds.
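A toy illustration of why that threshold matters: a naive planner that treats every candidate path as blocked above a density cutoff simply stops. The 0.6 figure is from the text; the code itself is an assumption for illustration, not how ROSE NAV works.

```python
# Toy model of the "freezing robot" behaviour described above.
# FREEZE_DENSITY comes from the text; the planner logic is invented.

FREEZE_DENSITY = 0.6  # people per square metre

def crowd_density(num_people: int, area_m2: float) -> float:
    """People per square metre in the planner's local window."""
    return num_people / area_m2

def plan_step(num_people: int, area_m2: float) -> str:
    """A naive planner: above the cutoff, every path looks blocked."""
    if crowd_density(num_people, area_m2) >= FREEZE_DENSITY:
        return "freeze"
    return "advance"

# A 5 m x 2 m school corridor with 7 students in it -> 0.7 people/m^2:
print(plan_step(7, 10.0))  # freeze
```

Anything a robot does in a real corridor has to behave better than this baseline, which is exactly the layer ROSE targets.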
Robots trained in simulation perform at only 20 to 30% of their simulated success rate when deployed on physical hardware. Every robotics team hits this wall. Nobody has cleanly solved it.
Think of a robot like a smartphone. The hardware (sensors, motors, cameras) is the device itself. The basic robotics operating system handles low-level coordination. But that's not enough.
ROSE is what sits on top. It handles everything that matters when a robot meets a person: understanding what they're saying, moving through a crowd without freezing, recognising when someone needs help, and staying operational for 8 hours a day in a public space.
It's built as seven independent modules. Each one targets a specific layer of the human-robot interaction problem. Each one is open source. Each one is being tested in a live science centre with real visitors.
ROSE CORE · ROSE NLP · ROSE NAV · ROSE VISION · ROSE ORCH · ROSE EDGE · ROSE SIM
Each module solves one specific layer of the problem. Together they form a complete operating environment for robots in public spaces.
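One way modules like these can compose is a shared blackboard that each module reads and writes once per control tick, with an orchestrator driving the loop. The sketch below is an assumption about the wiring for illustration; the module names come from the list above, but the interfaces (`Blackboard`, `Module.tick`, `Orchestrator`) are invented and are not the ROSE APIs.

```python
# Illustrative composition sketch -- the interfaces are assumptions,
# not the actual ROSE module contracts.

from dataclasses import dataclass, field

@dataclass
class Blackboard:
    """Shared state the modules read from and write to each tick."""
    transcript: str = ""
    intent: dict = field(default_factory=dict)
    command: str = "idle"

class Module:
    """Minimal interface each module implements."""
    name = "base"
    def tick(self, bb: Blackboard) -> None: ...

class NLPModule(Module):
    name = "ROSE NLP"
    def tick(self, bb):
        # Stand-in for speech understanding: map words to a goal.
        if "dinosaur" in bb.transcript.lower():
            bb.intent = {"goal": "Hall B"}

class NavModule(Module):
    name = "ROSE NAV"
    def tick(self, bb):
        # Stand-in for navigation: turn the goal into a motion command.
        if bb.intent.get("goal"):
            bb.command = f"navigate:{bb.intent['goal']}"

class Orchestrator:
    """Stand-in for ROSE ORCH: runs each module once per control tick."""
    def __init__(self, modules):
        self.modules = modules
    def tick(self, bb: Blackboard) -> str:
        for m in self.modules:
            m.tick(bb)
        return bb.command

orch = Orchestrator([NLPModule(), NavModule()])
print(orch.tick(Blackboard(transcript="Where are the dinosaurs?")))  # navigate:Hall B
```

The point of the sketch is the shape, not the content: because each module only touches the shared state, any one of them can be swapped, tested, or forked independently.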
Every version of ROSE runs at Parsec Science Centre in Vadodara, a real science centre open to visitors every day. Multilingual questions. Weekend crowds. Unpredictable children. That's our test environment.
This is intentional. Software that only works in controlled conditions doesn't solve the problem.
ROSE is an open platform. The work grows with every researcher who runs an experiment, every institution that deploys the stack, every engineer who contributes a module.
Access our multimodal human-robot interaction dataset from Parsec. Co-design hypotheses. Run experiments on live deployments. Publish through the ROSE research network.
Request Dataset Access →

ROSE is Apache 2.0. Fork the stack, integrate your own hardware, or contribute a new module. Every meaningful contribution gets credited in the research output.

View on GitHub →

Museums, science centres, universities: become a ROSE deployment site. Your space gets the technology. We get more real-world data. Everyone gets better research outcomes.

Explore Site Partnership →

ROSE is building India's first open dataset for human-robot interaction in public spaces. Micro-grants support individual experiments. Larger support scales the network.

Talk to the Team →

ROSE gets better with more deployments, more data, more experiments. There's a place for you here whether you're writing code, writing papers, or opening your space.