You picked up your delivery, but you couldn’t open it without that pair of scissors. You also forgot where you put it, or what you did with it yesterday. You looked around and said: “Hey Siri, where are my scissors?” A spotlight lit up, casting a bright beam on a small table across the room, and there lay the pair of scissors you needed.
People misplace stuff all the time. MyStuff helps people find stuff.
When a user can’t find something, they can simply ask a nearby device where it is. The system shows them where to look by pointing a spotlight at that area.
The system can track important items within the user’s home. As the user moves around throughout the day, the system keeps track of where they have been and what items they picked up and dropped off, creating a map behind the scenes. The result is an invisible system that simply tells the user where things are and recedes when not in use.
Until a robot can find and bring our stuff to us, people will continue to live in disorganized homes and fail to find things. We set out to approach the problem from two directions:
- Help users build a habit of organization
- Track everything regardless of a user’s lifestyle
After preliminary research, we decided to focus mainly on the second direction.
Testing: Finding Direction
It’s tempting to go with the first direction, since it promises a positive change to our users’ lifestyle and seems easier. However, we later found that it’s more complicated than we thought.
We first looked into some potential users’ homes, asked them to find certain things, and interviewed them to understand the problem.
- Why do people lose stuff?
- How does a user find an item? What’s the thought process behind it?
- What’s our user’s behavioral pattern when they misplace stuff? (i.e., when and where does it happen and what is the user doing at the moment)
- What items do they lose? What’s the pattern for each of them?
- How does a user organize their home?
- How can we improve aspects of it?
A user described her place as unsightly, so we just had to capture some photos. Here are some of our findings.
Frequently used tools like scissors move across rooms for different purposes. They are often left on tables in prominent positions instead of in a drawer. During one session, the user walked right past the scissors and failed to notice them because of the chaotic environment of the room, but recalling recent activities reminded the user where the scissors were when they were last used. This poses an opportunity for design: help users remember to place items at designated locations, e.g., a certain pair of scissors on a certain table.
For items tied to specific contexts, users are often unsure exactly where they are, but they usually have a deductive approach. For example, a flash drive is related to work, so users would look for it in their backpack or on the desk rather than on the dining table or in the living room. If these methods could be established through external guidance and practice, users would sharpen their sense of their place and find things more easily.
Users explained their ways of storing items. A typical pattern is to base it on item size and frequency of use. For example, pill bottles and files often go in the desk drawer, while cables and the toolbox go in a bin in the closet. Limited by the containers’ physical forms, users rarely follow a simple categorization of their stuff; size and frequency of use drive most of their decisions.
We asked users why a particular item was in a particular place and the reason for arranging things as such. Most said it was because these items are frequently used, but some also explained that because they often lose them, they put them in a visually prominent location so they will notice them. The latter often fails, as in the first set of tests — when everything is emphasized, nothing stands out. This later inspired us to use visual cues to assist users in finding items.
Tracking personal belongings is challenging. Sometimes the user has to deal with awkward hardware that gets in the way of using the item itself.
Compared to existing solutions like Tile, the project intends to improve the experience by making use of near-future technologies. Hence it has to:
- Track a large number of items simultaneously
- Minimize a user’s effort
- Minimize hardware distraction and be invisible
- Reduce hardware cost and be scalable for a household
- Provide intuitive interactions without exposing limitations
To test the physical limitations, we needed to understand how many items our users want to track, what they would like to track, and why. This would inform us of the interaction cost for a user to set up the system and to register each item. In addition, we needed that knowledge to figure out what technology to use and what the form factor should be, both for prototyping and for the final concept.
Testing: Studying Possible Items
We invited users to categorize a collection of daily items in their own way — by their mental spaces, by physical spaces in their room, or even by how frequently they lose them. We also asked them to decide which items they would like tracked or to be reminded about, and to share their personal experiences around these items.
Everyday-carry items were picked out first, followed by frequently used household tools, then crucial but rarely used documents and devices.
Testing: Location Indication Methods
To guide a user to a requested item, we assumed a minimum solution of a compass/map view in an app, but we also tested some alternatives: a spotlight and a radar tone. We hid an item, our user asked the prototype system (operated manually by me) where it was, and we then interviewed them about their thoughts on it. The spotlight solution was highly intuitive, and we built more advanced prototypes of it later. The radar tone solution was ineffective.
UI & Touchpoints
Given the design criteria, the user would have minimal interaction with the system, and the system only presents itself when needed. Most of the time, the system should be forgotten by the user and become invisible.
However, the user will inevitably come in contact with the system. This usually happens when the user is asking for an item or manually registering one for tracking.
For example, we experimented with Siri Shortcuts, a feature that fits our goal of simplification: at a certain time of the day, the phone can automatically suggest a shortcut action of finding a certain item, reminding our user at an appropriate time and offering a quick entry point.
The user needs to place tracking stickers (with an obtrusive appearance) on items. We invited users to test an ad-hoc prototype of manual registration, to see whether such a design changes their item choices and where they would place the stickers.
- Users tried to conceal the sticker (e.g., place it on the inside of a pouch).
- Small items like earplugs are hard to handle with stickers.
We ended up with multiple form designs for the stickers. Since hiding the tag is impossible, we decided to make it intentionally prominent and aesthetically acceptable. In addition, users can order tags printed with design patterns or customized imagery.
App on Screen
A user does not need the app to find stuff, but they might need an intuitive understanding of the state of the system, or to get help and clarification. We designed a simple app that indicates where items are.
In the early explorations, we observed how items move around a user’s place and become forgotten. This suggested how we could achieve our design criteria logistically: take advantage of the user’s body movement and object movement. The system follows the footprints of our user and their items and figures out where things ended up.
Items are equipped with passive RFID tags, the user’s watch is equipped with a simultaneous RFID reader to sense nearby items, and the room is equipped with radio sensors that track the watch’s location.
When the user picks up a tracked item, the user’s watch can sense that the item is in close range. As the user moves around the room while holding the item, we can assume that wherever the user is, the item is also there. Once the user drops off the item, the watch can no longer sense a strong radio signal from the item, and thereby concludes that the item stayed wherever the user last had it.
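The drop-off inference above can be sketched as a small state update. This is a hypothetical minimal model, not the actual prototype code: it assumes the watch periodically reports the user’s position and the set of tag IDs currently in range, and all names here are illustrative.

```python
class ItemTracker:
    """Infer item locations from the user's position and nearby RFID tags."""

    def __init__(self):
        self.locations = {}  # tag_id -> last known (x, y) position

    def update(self, user_pos, visible_tags):
        # While an item is in range of the watch, it travels with the
        # user, so its location is simply the user's current position.
        for tag in visible_tags:
            self.locations[tag] = user_pos
        # Items that drop out of range are not updated again: their
        # stored location is where the user last held them.

    def locate(self, tag):
        """Return the last known position of a tag, or None if unseen."""
        return self.locations.get(tag)
```

For example, if the user carries the scissors from the desk to the coffee table and sets them down, the last update in which the scissors were visible pins them to the coffee table.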
We assumed that logistics automation techniques like RFID on packaging can be repurposed for home automation in the future. Therefore, we can expect future consumer products to come equipped with RFID tags out of the box, or at least some digitally identifiable characteristic. This way, a user never needs to manually set up each item they want to track. New items compatible with the standard are tracked automatically the moment they leave the box or enter the room.
To locate the user in the room, we used Bluetooth beacons to triangulate the user’s watch. Using an iPhone in place of the watch and three Bluetooth beacons (Estimote, through the iBeacon protocol and off-the-shelf APIs), the prototype can function with an accuracy of about 1 m in radius. Higher accuracy would likely need line-of-sight, which is impractical in this case.
In this setup, the tracking happens on-device, inside out; the beacons merely send out signals, like GPS satellites. Given the predetermined locations of the beacons within the room, the phone can triangulate itself based on received signal strength. In the future, we can hope to use time of flight instead of signal strength.
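The self-triangulation step can be sketched roughly as follows. This is a simplified 2D illustration, not the prototype’s actual code: it assumes a log-distance path-loss model for converting RSSI to distance, and the calibration values (`tx_power`, the path-loss exponent `n`) and beacon coordinates are made-up examples.

```python
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Estimate distance (m) from RSSI (dBm) via a log-distance model.

    tx_power is the expected RSSI at 1 m; n is the path-loss exponent.
    Both are assumed calibration values for illustration.
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Solve for (x, y) given three beacon positions and distances.

    Subtracting the circle equations pairwise cancels the quadratic
    terms, leaving a 2x2 linear system in x and y.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1  # zero only if beacons are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

In practice RSSI is noisy, which is why the prototype’s accuracy bottomed out around 1 m; a real implementation would smooth readings over time before solving.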
The design criteria do not require a particular mechanism like this, but it has an inherent privacy advantage. The device controls all the information and does all the calculations; a beacon only broadcasts data outwards.
To detect the items around the user, we used an RFID reader to scan nearby tags. Prototyped with a UHF simultaneous RFID reader and a wireless Arduino, the “watch” can scan stickers in the direction of the hand within about 30 cm. It sends the data back to the phone (standing in for the watch) for further processing.
The spotlight is mechanically controlled with two servos. Like the beacons, the spotlight has its 3-axis coordinates stored on the phone. When the user asks for directions, the phone calculates rotation angles from the requested item’s position and the average table height relative to the spotlight. The rotation is then sent to the spotlight’s wireless Arduino, and the servos turn accordingly.
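The angle calculation above amounts to converting the vector from the light to the item into a pan (horizontal) and tilt (vertical) angle. Here is a minimal sketch of that geometry; the coordinates, the default table height, and the servo angle conventions are all assumptions for illustration, not the prototype’s actual parameters.

```python
import math

def aim_spotlight(light_pos, item_xy, table_height=0.75):
    """Return (pan, tilt) in degrees to aim a ceiling light at an item.

    light_pos is the light's (x, y, z); item_xy is the item's floor-plan
    position. The item is assumed to sit at the average table height.
    """
    lx, ly, lz = light_pos
    ix, iy = item_xy
    dx, dy = ix - lx, iy - ly
    dz = table_height - lz  # negative: the item is below the light
    pan = math.degrees(math.atan2(dy, dx))        # rotation about vertical axis
    horizontal = math.hypot(dx, dy)               # distance in the floor plane
    tilt = math.degrees(math.atan2(dz, horizontal))  # downward angle
    return pan, tilt
```

The phone would map these angles onto each servo’s pulse range before sending them to the Arduino; that mapping depends on the specific servos and mount.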
While the final product was not responsive enough to achieve an optimal effect for real-time testing, the simulated testing received positive feedback. Participants had an intuitive understanding of how to use it without being overly concerned about the technical details.
The idea of indirectly exchanging information across entities can be applied to other ecosystem challenges. The design problem lies in bringing two separate systems together through existing mechanics. In this case, the user’s body movement is the connecting agent — the very mechanism by which people lose stuff becomes the entry point for the design.
In this demo, all devices are custom-fit for a lab environment. In reality, it is unrealistic to wear a power-hungry RFID reader and let it blast a borderline-illegal amount of radiation. Still, future consumer products might carry related traits that, when combined, achieve a similar effect.
For smart homes, various levels of indoor tracking might become omnipresent to improve energy saving, smart AC, assistive agents, etc. For wearables, sensing the environment has become a rising trend shaping the industry’s direction.
For smart objects, we see basic, customized automation for consumables, such as adding trackers to prescription bottles. These ideas have been attempted many times in amateur projects like this one; I’ve tried several times myself. However, the slow adoption of home automation still limits their reach, and people question whether it is worth bothering at all. So in the near future, disposable packaging might not include more intelligence than it already has (especially in consideration of sustainability); reusable trackers might take off once they are cheap enough to be expendable.
Sep 2019 Update
Apple is reportedly developing item-tracking technology that fits nicely into its ecosystem. It’s unclear how it will work, but the relevant technologies show promise. Here’s a thorough explanation. Some speculate it will use existing phones and wearables to keep the system invisible. These ideas echo some design intentions of this project. In addition, UWB might be a valid substitute for some of the mechanisms used in this project. Apple has already introduced crowd-sourced device finding using its vast coverage of products, and it seems inevitable that this will also come to AirTags. We’ll know soon.