Thursday 7 December 2006

Scenarios

A couple of scenarios to help with explaining aspects of the idea. Also some things I need to think about...

In the simplest terms possible, I want to create a device-based application which automatically customises the device’s interface based on different criteria, either set by the user or by an outside factor.

Adaptive, perceptive, intuitive. All things a handheld device should be. There is no “natural” way to interact with one, but maybe there is something that it can do to behave more naturally.

Doable Now

Temperature/Weather

Yahoo has an interesting desktop widget which, along with telling you the weather vitals, visually represents the weather at that time. I can be sitting at home and know it’s raining in Portsmouth. I really like this additional little feature, and that’s all it really is.

Scenario 1 – Bob’s walking down the street – it’s a clear sunny afternoon. His mobile calculates his location and looks up the weather accordingly. Linking to flickr, the application searches tags for “sunny” and “weather”, selects a photo at random, and displays it as his mobile background. The device selects colours from the photo and applies a colour scheme or “mood” to the device’s interface accordingly.

Suddenly the weather changes and a storm breaks. Linking again to flickr, the application searches tags for “storm” and “rain”, selects a photo at random and displays it as his mobile background. The colour scheme shifts from blues and yellows to silvers and navies.

Not sure how to play this one – show the current weather, or what the weather’s going to be? Or the inverse of the weather? If it’s adapting, it should match the weather. How much of a change in the weather warrants an update? If the weather doesn’t change, does it just update every hour?
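As a rough sketch of how Scenario 1 might hang together – the condition names, tag pairs and palettes below are all my own placeholder guesses, and the search is faked with a callable rather than actually hitting flickr:

```python
import random

# Placeholder mapping from a weather condition to flickr search tags and a
# rough interface palette -- conditions and colours are illustrative only.
WEATHER_MOODS = {
    "sunny": {"tags": ["sunny", "weather"], "palette": ["blue", "yellow"]},
    "storm": {"tags": ["storm", "rain"], "palette": ["silver", "navy"]},
}

def pick_background(condition, search):
    """Choose a random photo and a palette for the current weather.

    `search` stands in for the flickr tag search: it takes a list of
    tags and returns a list of photo URLs."""
    mood = WEATHER_MOODS[condition]
    photos = search(mood["tags"])
    return random.choice(photos), mood["palette"]

# Fake search results instead of a real flickr call
fake_search = lambda tags: ["photo_of_%s.jpg" % t for t in tags]
photo, palette = pick_background("storm", fake_search)
```

The weather-to-mood table is the bit that would need real design work; everything else is just glue.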

Colour

I’m pretty sure camera phones are capable of detecting colour – they seem to have their own magic ways of messing with the exposure, so much so that there’s no colour left in the photo you’re trying to take…

Scenario 2 – Bob’s going for a walk in some woods. It’s spring and everything around is green. The device detects that green is especially prominent in the surroundings and searches flickr for “green” and “color”. The mood also changes to fit the surroundings. As he leaves the woods, the surrounding colours start to change, and once the device detects a noticeable difference, it picks the next most prominent colour and adapts itself accordingly.

Problems: people don’t leave their phones out. How often does it change? The flickr tags for colours aren’t always accurate. Search the colour group instead?
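A minimal sketch of the “most prominent colour” step, assuming the camera can hand over raw RGB pixels (real exposure handling would be far messier than this):

```python
from collections import Counter

def dominant_colour(pixels):
    """Crude prominence test: bucket each (r, g, b) pixel by its
    strongest channel and return the channel that wins most often."""
    names = ("red", "green", "blue")
    strongest = (names[max(range(3), key=p.__getitem__)] for p in pixels)
    return Counter(strongest).most_common(1)[0][0]

def noticeably_different(old_pixels, new_pixels):
    """Only change the mood once the dominant colour has shifted."""
    return dominant_colour(old_pixels) != dominant_colour(new_pixels)

# Invented samples: green-ish woodland vs grey-blue street
woods = [(30, 120, 40), (20, 140, 60), (200, 50, 40), (10, 90, 30)]
street = [(120, 120, 130), (90, 90, 110), (60, 60, 80)]
```

The “noticeable difference” threshold here is just “the winner changed”, which conveniently also answers the how-often-does-it-change worry: not until the scene really does look different.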

Location/Advertising

Scenario 3 – Bob decides to go to Paris for the day. Upon arriving, his phone detects his new location, searches for images tagged with “Paris” and updates the phone accordingly. As he walks past the Eiffel Tower, his phone recognises a Bluetooth node and bleeps; the background image changes, but this time to something decided by the promoters of the Eiffel Tower – either an inviting image or entry prices. Once the device is out of range, it once again finds photos tagged with that location.
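The precedence in Scenario 3 could be as simple as the sketch below – the function and its arguments are invented for illustration, with no real Bluetooth or flickr plumbing behind them:

```python
def choose_background(location, promo_image=None):
    """A nearby promoter's Bluetooth push wins while it's in range;
    otherwise fall back to photos tagged with the current location."""
    if promo_image is not None:
        return ("promoter", promo_image)
    return ("flickr-tags", [location])

# Walking past the tower, then out of range again
in_range = choose_background("Paris", promo_image="eiffel_promo.jpg")
out_of_range = choose_background("Paris")
```

The interesting design question is whether the user should be able to refuse promoter images outright – the logic above gives advertisers unconditional priority, which may be exactly the wrong default.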

------------

I think I'm going to have it so that the device searches tagged photos within its gallery first and then goes to flickr if it can't find anything.
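Sketched out, that gallery-first lookup might look like this – the gallery is modelled as a dict of filename to tags, and the flickr search is just a callable placeholder:

```python
def find_photo(tags, gallery, flickr_search):
    """Return a photo from the device's own gallery if any of its tags
    match; only fall back to flickr (a stand-in callable) otherwise."""
    wanted = set(tags)
    for photo, photo_tags in gallery.items():
        if wanted & set(photo_tags):
            return photo
    return flickr_search(tags)

gallery = {"holiday.jpg": ["sunny", "beach"], "cat.jpg": ["cat"]}
local_hit = find_photo(["sunny", "weather"], gallery, lambda t: "from_flickr")
remote_hit = find_photo(["storm"], gallery, lambda t: "from_flickr")
```

Checking locally first also sidesteps the data charges and patchy coverage a 2006-era phone would hit going online every time.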

As an additional thought, I think there should be a way for the user to save images they like. Maybe there could be an online control panel as well, which would list the images used that day.
