Merging the Physical World with the Digital World for a Retail Interactive
I've always been impressed with the Marc Jacobs brand and wanted to build a retail interactive that combined Kinect with the shopping experience. The launch of Daisy by Coty Inc. for Marc Jacobs was an opportunity rich with possibilities: particle emitters, nature, poetry, and good feelings.
I imagined "Daisy" - combining real-time interactivity between a physical set and a digital experience.
As I conceptualized these storyboards for an in-store shopping experience, the interaction began to come together:
INTERACTION DESIGN - FLOW
Fellow team members and I first began experimenting with object-oriented programming in openFrameworks, building interactive emitters driven by the keyboard:
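A minimal standalone sketch of the kind of emitter we were testing - written here as plain C++ rather than a full openFrameworks app, with hypothetical class and member names. In the real app, a key press (e.g. in `ofApp::keyPressed`) would trigger a burst:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdlib>
#include <vector>

// One particle: position, velocity, and remaining lifetime in frames.
struct Particle {
    float x, y;
    float vx, vy;
    int life;
};

class Emitter {
public:
    // Spawn n particles at (x, y) with random outward velocities.
    // In the openFrameworks test, a keyboard event called this.
    void burst(float x, float y, int n) {
        for (int i = 0; i < n; ++i) {
            float vx = (std::rand() % 200 - 100) / 100.0f;
            float vy = (std::rand() % 200 - 100) / 100.0f;
            particles.push_back({x, y, vx, vy, 60});
        }
    }

    // Advance one frame: move particles, age them, drop the dead ones.
    void update() {
        for (auto& p : particles) {
            p.x += p.vx;
            p.y += p.vy;
            --p.life;
        }
        particles.erase(
            std::remove_if(particles.begin(), particles.end(),
                           [](const Particle& p) { return p.life <= 0; }),
            particles.end());
    }

    std::size_t count() const { return particles.size(); }

private:
    std::vector<Particle> particles;
};
```

A draw loop would then render each particle per frame; the 60-frame lifetime is an arbitrary value for illustration.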
Using Kinect, we moved the interaction from the keyboard to the depth sensor to see how infrared would function behind glass in a low-light setting such as dusk. Ambient light typically does not mix well with infrared data capture.
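The core of swapping keyboard for depth sensor is a depth-band test: keep only Kinect depth readings inside a near/far window, ignoring both the glass just in front of the sensor and the background behind the shopper. A hedged sketch of that idea (function and parameter names are hypothetical; real Kinect frames arrive via a library such as ofxKinect):

```cpp
#include <cstdint>
#include <vector>

// Count depth pixels (in millimeters) falling inside a near/far window.
// A count above some tuned threshold means a hand or body is present
// in the interaction zone, which can then drive the emitter.
int pixelsInRange(const std::vector<uint16_t>& depthMM,
                  uint16_t nearMM, uint16_t farMM) {
    int count = 0;
    for (uint16_t d : depthMM) {
        if (d >= nearMM && d <= farMM) ++count;
    }
    return count;
}
```

In practice the window has to be re-tuned per site, since glass and lighting shift what the sensor reports.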
LOOK AND FEEL PROTOTYPES
The look of the Daisy "explosion"/emitter display began to take shape as we drew inspiration from the product design. Using openFrameworks, we also experimented with algorithmic approaches to the look.
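For illustration only (this is not necessarily the team's actual code): one classic algorithmic route to a daisy-like form is the polar rose curve, r = R·cos(kθ), which traces petals around a center point:

```cpp
#include <cmath>
#include <vector>

struct Pt { float x, y; };

// Sample the polar rose r = R * cos(k * theta) into 2D points.
// For even k this traces 2k petals - a simple daisy-like silhouette
// that an emitter can spawn particles along.
std::vector<Pt> rosePetals(float R, int k, int samples) {
    const float TWO_PI = 6.2831853f;
    std::vector<Pt> pts;
    for (int i = 0; i < samples; ++i) {
        float theta = TWO_PI * i / samples;
        float r = R * std::cos(k * theta);
        pts.push_back({r * std::cos(theta), r * std::sin(theta)});
    }
    return pts;
}
```

Feeding these points to the emitter as spawn positions gives the explosion a floral structure rather than a uniform burst.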
At the same time I made a trip to Saks Fifth Avenue in NYC, the first client to whom we would present the project. Where would Daisy fit? Would it be a window installation? Would the indoor lighting be ample for the interaction?
As for the interaction itself - what would the initial interaction be? We felt that interacting with the product via a push-button was essential, since spraying a scent should launch the experience:
How do you make code responsive to physical objects? We didn't want to re-wire a fan or hack its insides for a binary on/off signal. A team member on a former project had invented a circuit design into which one could plug any AC device and control it with the Arduino Mega board.
The first test was to write code to turn the on-set fan on and off. It worked! (Video now lost.)
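On the actual Arduino, `loop()` would read a byte from `Serial` and drive the relay pin with `digitalWrite()`; here the byte-to-state decision is factored out as plain C++ so it stands alone. A hedged sketch, assuming a simple one-byte protocol ('1' = fan on, '0' = fan off) that we are inventing for illustration:

```cpp
// Map an incoming serial command byte to the relay state.
// '1' -> relay closed (fan on), '0' -> relay open (fan off),
// any other byte -> keep the current state (ignore noise).
bool relayState(char command, bool current) {
    if (command == '1') return true;   // energize relay, fan on
    if (command == '0') return false;  // de-energize relay, fan off
    return current;                    // unknown byte: no change
}
```

Keeping unknown bytes as no-ops matters over a noisy serial line - a stray byte should never toggle an AC appliance.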
We 3D-printed a perfume bottle and wired it with a push-button. With all interactions functioning, we combined our complete openFrameworks code with our Arduino code. After many failed attempts, all circuits tested successfully.
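A physical push-button like the one in the bottle bounces electrically, so a press can register many times. A minimal debounce sketch of the sort such a setup typically needs (class and parameter names are hypothetical, not the team's actual code):

```cpp
// Debounce a raw button reading: a state change only counts once the
// raw value has been stable for `stableFrames` consecutive samples.
class Debounce {
public:
    explicit Debounce(int stableFrames) : need(stableFrames) {}

    // Feed one raw sample per frame; returns the debounced state.
    bool sample(bool raw) {
        if (raw == candidate) {
            if (++run >= need) state = candidate;
        } else {
            candidate = raw;  // reading changed: restart the count
            run = 1;
        }
        return state;
    }

private:
    int need;
    int run = 0;
    bool candidate = false;
    bool state = false;
};
```

Once the debounced state flips to pressed, the app can fire the spray interaction exactly once per physical press.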
FINAL INTERACTION UX/UI
After iterative testing with many users, including some who had never experienced depth sensing, we settled on the following version as the final iteration. The bee represents the user's hand. As an alternative, a real-time flickering silhouette of the user was also tested as a successful UI.
PROTOTYPED BOX MADE
We constructed a box to simulate a kiosk/window display. We placed several types of grasses inside the box to test which blew easily and visibly for the prototype demonstration (artificial grass was to be used in the actual installation).
The full prototype was presented with a projection standing in for the blue-sky LED screen.
A take-away experience from "Daisy" is a scented postcard, sent to anyone in the world for whom the user wants to make a wish: "love me." Some prototypes from a team member:
FULL SCALE EXHIBITION DESIGN
"Daisy" can be made into a full-scale pop-up experience. When users enter the area of interaction, a tablet stands before them for the user to enter on a keyboard an individual to whom they'd like to make a wish. At the end of the experience, the card is printed with this individual's name - sent to anywhere in the world.