As computing becomes more ubiquitous in our objects, designers need to be more aware of how to design meaningful interactions into electronically enhanced objects. At the University of Washington, a class of junior Interaction Design majors is exploring this question. These pages chronicle their efforts.

Sunday, April 17, 2016

3–5: Situation, Sensing, & Logic Descriptions — Jun & Lucas

The scenario we presented started as an automatic baby monitoring device and evolved into capturing the movement and sound a baby creates in a crib and turning it into light: essentially, a baby monitor for the deaf. The inputs could be any of movement, sound, gas, or stasis (no input), and the outputs we explored include light pulses or strobing, vibration or other movement of a chair, a lullaby played for the baby, or an alarm sound (this last one obviously would not work for the deaf or hard of hearing).

The direction we decided to move in, based on Tad's input and our discussion, is to reflect the behavior of a baby with a light, probably using piezo (and perhaps additional) microphones as inputs. The goal is to bring a parent closer to their baby with a lamp that ideally rests firmly in the upper right quadrant of the usefulness/poetic coordinate plane.

Logically, it will work by monitoring the sound and movement of the baby and transforming that behavior into light. The logic will be along the lines of: if there is sound or movement, create light; then vary the light according to how much activity there is and how intense and loud it is.
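The logic above could be sketched roughly as follows. This is only an illustrative mock-up of the mapping, not the final design: the function name, the normalized 0.0–1.0 sensor range, the stasis threshold, and the 0–255 LED brightness scale are all assumptions made for the sake of the example.

```python
# Illustrative sketch: map the baby's activity (piezo-mic sound level
# plus movement level) to a light brightness. All names, ranges, and
# thresholds here are assumptions, not the actual device logic.

def activity_to_light(sound_level, movement_level):
    """Map normalized sensor readings (0.0-1.0) to LED brightness (0-255).

    sound_level: normalized amplitude from a piezo microphone
    movement_level: normalized reading from a motion sensor
    """
    # Clamp inputs so a noisy reading cannot push us out of range.
    sound = min(max(sound_level, 0.0), 1.0)
    movement = min(max(movement_level, 0.0), 1.0)

    # Overall activity is whichever channel is currently stronger.
    activity = max(sound, movement)

    # Stasis (no meaningful input) -> no light.
    if activity < 0.05:
        return 0

    # More intense behavior -> brighter light.
    return int(round(activity * 255))
```

For example, a quiet, still baby yields `activity_to_light(0.0, 0.0) == 0`, while a crying, thrashing baby yields `activity_to_light(1.0, 1.0) == 255`; intermediate activity scales the light between those extremes.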


