Improving the robotic housekeeping experience

Project CleanSweep was an internal initiative at Connected to build an understanding of the IoT housekeeping space and uncover opportunity areas. A cross-disciplinary team of product, design, and engineering practitioners collaborated for 6 weeks, approaching the project just as they would a typical client engagement. They broke their work into 2-week sprints, which allowed them to run an immersion phase, synthesize their research, ideate opportunity areas, and prototype and test a potential product.


Project Plan


Immersion Research

We began this project with an immersion phase that combined extensive user research with technical and business research. This helped us build our understanding of where users' unmet needs might be and spot opportunity areas we should begin ideating within.

Competitive Analysis

Globally, over 1/5th of vacuums today are robotic, and this share is expected to grow. As competition has increased, companies have turned to new features and interfaces in hopes of differentiating their products in a crowded marketplace.

Our initial industry scan revealed a wide array of established and new OEMs offering similar products while competing on technology, features, and price. This competition has driven advancements in situational awareness through spatial mapping and artificial intelligence, as well as smart-home integrations and voice interfaces. Additionally, we are seeing automated floor cleaning scale up to industrial solutions and a diversification in consumer products.

Technical Research

By starting our technical research on day 1, we were able to quickly understand the building blocks at our disposal and, when evaluating concepts, were better equipped to scope effort and feasibility. Some interesting technologies we looked at included the following:

  1. Spatial Mapping: Adding barriers to a map may be difficult for users to visualize accurately, resulting in misplaced zones. An AR live-camera view makes it easier for users to mark the correct area.
  2. Mobile AR (ARKit / ARCore): Mobile AR is rapidly advancing, allowing people to interact with information in new ways. ARCore (Android and iOS) and ARKit (iOS) are platforms we can build spatial computing experiences with.
  3. Voice Interfaces: As users become more accustomed to talking with their connected devices, there's an opportunity to explore new ways to interact with robotic vacuums.

User Research

We ran several different user research activities to better understand robot vacuum owners and help us develop a customer profile.

  • 1:1 Interviews
  • Surveys
  • Digital Archeology
  • Card Sorting
  • User Testing

User Interviews

Our 1:1 interviews revealed a lot of insights into our users' specific jobs, pains, and gains.

“It makes me sad sometimes! I have feelings for my robot, I find it sad. I have more feelings for robots than humans. When I get home I’m like, where’s the Roomba? And I find it in the spare bathroom - I mean, it tried so hard and it couldn’t get back to its base!” - Jimmy

“I wonder if it can see my cables and not eat them. That would be nice - and if it was smarter, and if it could navigate rooms.” - Dan

“They have some cute things you can do in the app. When the robot starts or finishes it says something, and you can change the voice. There are voices in English, Mandarin and different characters... There are several things they say - if there are errors, if they stop, starting, finishing. You can’t talk to it though.” - Katia

Customer Profile Development

We used card sort activities to rank our users' prioritization of the jobs, pains, and gains we had identified in our research.



Opportunity Framing

Based on the jobs, pains, and gains of each user type and the different themes that emerged, we created 30 different how-might-we statements to give us questions to ideate upon.

One of our key insights was that customers wanted more control over their vacuums and were frustrated by the lack of communication or ability to give direction.

This helped us frame potential opportunities to create real value. We started our ideation with the following prompts:

  • How might we give robot vacuum owners the ability to communicate with their vacuums with more detailed instruction?
  • How might we give people peace of mind that their smart vacuum has kept their family’s environment sanitary?
  • How might we encourage users to only control their smart vacuums from afar?
  • How might we make using a smart vacuum similar to the experience of using a remote-controlled car? Or a toy train track?
  • How might we educate people as to the best way to set up their homes for their smart vacuums?

Concept generation

On the back of our how-might-we-statements, we began ideating opportunities for new products or features in the robotic housekeeping space.


8 main themes emerged from our ideas:

01 We need to talk.

02 I just do what my calendar tells me.

03 That’s it! You can do it!

04 One more thing.

05 Be kind & rewind.

06 Getting stuck is fun.

07 Well...what do you think?

08 My robot, my rules.

Conceptual Prototyping

We developed our themes into conceptual prototypes, which are evolved 'napkin sketches' that help us evaluate our concepts for technical feasibility and potential product impact.



After t-shirt sizing our concepts for both value and effort, we converged on our best idea: an alternative to the bird's-eye floor plan provided by top-of-the-line robot vacuums that lets users annotate their space with directions, giving the robot more guidance.


Selection-based Augmented Reality annotation tools

Some initial tools we explored as having high value based on our user research:

01 Spot Clean

02 Focus Areas

03 Exclusion Zones

04 Virtual Walls
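As an illustration only (no project code is shown here), the four selection-based tools above could share one simple data model: each annotation is a typed 2D region on the floor plan, and the robot's planner checks whether a point falls inside it. A minimal Python sketch, with all names hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class ZoneType(Enum):
    SPOT_CLEAN = auto()      # clean this region immediately
    FOCUS_AREA = auto()      # give this region extra passes
    EXCLUSION_ZONE = auto()  # never enter this region
    VIRTUAL_WALL = auto()    # thin barrier, modeled as a narrow polygon

@dataclass
class Annotation:
    zone_type: ZoneType
    polygon: list  # [(x, y), ...] floor-plan coordinates in metres

def contains(polygon, point):
    """Ray-casting point-in-polygon test on the 2D floor plane."""
    x, y = point
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def should_avoid(annotations, point):
    """True if the point lies inside any no-go region."""
    no_go = {ZoneType.EXCLUSION_ZONE, ZoneType.VIRTUAL_WALL}
    return any(a.zone_type in no_go and contains(a.polygon, point)
               for a in annotations)
```

For example, a 2 m × 2 m exclusion zone drawn around a pet bowl would make `should_avoid` return True for any point inside that square.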

toolset nested within floating action button

rough sketching of ux flows

Prototype Design & Development

To help us validate this concept for desirability, we knew we needed a 'minimal functional prototype': a spatial computing experience is difficult to communicate with any degree of experiential fidelity without actually developing a code-based prototype. So rather than a Figma prototype, we developed a small application that leveraged ARCore (Android). The app allowed users to see a floor plan of their space and experiment with the different mark-up tools.
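One detail worth noting about bridging the two views: ARCore hit tests return 3D world-space points on the detected floor plane, while the bird's-eye map only needs 2D coordinates. Because ARCore's world frame is gravity-aligned (y is up), the conversion can be as simple as dropping the vertical axis. A hypothetical sketch (function and parameter names are ours, not the project's):

```python
def world_to_floor_plan(world_point, plane_origin):
    """Map a 3D world-space point (x, y, z) onto 2D floor-plan
    coordinates relative to the floor plane's origin, assuming a
    gravity-aligned world frame (y is up, as in ARCore)."""
    px, py, pz = world_point
    ox, oy, oz = plane_origin
    return (px - ox, pz - oz)  # drop the vertical (y) component
```

A real app would also apply the plane's rotation so the map axes match the room, but for a flat, axis-aligned floor plan this simple translation is enough to place annotations.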

annotation to birds-eye view map
AR view can be used for viewing existing annotation in situ or for making annotations themselves


Demo of functional prototype of Map Annotator