As computing becomes more ubiquitous in our objects, designers need to be more aware of how to design meaningful interactions into electronically enhanced objects. At the University of Washington, a class of junior Interaction Design majors is exploring this question. These pages chronicle their efforts.

Saturday, October 27, 2018

Actuator Experiment - Netty & Hannah




Currently, our project is still developing; we are working out which sensors will be appropriate and accurate enough, and we still need to brainstorm how to create the most error-free experience for the user. One actuator we know we will use involves audio output and input. In the video above, we recreated the audio output experiment and are looking at ways to manipulate the audio (and into the right speakers to perhaps purchase).


The second experiment explores the idea brought up in class of closing the lid for the next guest. Understanding motors is important for this, so we experimented with them again. We are not likely to pursue the motorized aspect of the toilet experience, but it was good for us to see how feasible and enjoyable it was.

Netty Lim, Hannah Mei

Friday, October 26, 2018

Actuator Experiment


We will be creating a chair that shoots out bubbles when someone in a different location hugs a pillow to show their affection for the other person even though they cannot be with them physically.

We tried two different actuator experiments that might be helpful for this project.

The first experiment used the servo. We are thinking we could use it to move a bubble wand up and down in order to submerge the wand in bubble solution. It seems like this could work for what we want as long as we can make our fan run at the same time and speed; it will be easier to tell once we have an actual bubble wand. In our exploration, we used our photoresistor (LDR) as a placeholder for the fan.
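Here is a minimal sketch of the dipping motion we tried; the signal pin, angles, and timing below are placeholder assumptions rather than our final values:

#include <Servo.h>

Servo wandServo;          // servo that raises and lowers the bubble wand

void setup() {
  wandServo.attach(9);    // assumed signal pin
}

void loop() {
  wandServo.write(0);     // lower the wand into the bubble solution
  delay(2000);            // let it soak
  wandServo.write(90);    // raise the wand in front of the fan
  delay(4000);            // hold while the fan blows bubbles
}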



We also experimented with our stepper motor to see if it could work well for turning our fan. We need a way to have the fan blow the bubbles. We don't have a fan that we could attach to it yet, but we played with the delay and speed of the stepper motor to see if it might work. We will also be getting a squirrel-cage fan to try, to see which we prefer.
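A rough sketch of how we drove the stepper while playing with its speed and delay, assuming a small geared stepper (2048 steps per revolution) wired to pins 8-11; all of these values are placeholders:

#include <Stepper.h>

const int stepsPerRevolution = 2048;                   // assumed small geared stepper
Stepper fanStepper(stepsPerRevolution, 8, 10, 9, 11);  // assumed driver pins

void setup() {
  fanStepper.setSpeed(10);                             // RPM; we varied this to test fan speeds
}

void loop() {
  fanStepper.step(stepsPerRevolution);                 // one full turn
  delay(500);                                          // pause we experimented with
}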




Actuation Experimentation - K. Hirata & E. Dastournejad


Because our project's outputs are fairly straightforward, we decided to focus on our LED display for our actuation experiments. 


Test One:

The objective of our first experiment was to give input via a button push and change which LED would be active in response. In addition, we added a delay for our LED to emulate the "reserved" function on our bar stool design. Our setup consisted of two LEDs: one that would remain on when the button was not activated, and another that would light up and hold for a delay once the button was pressed.
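A minimal sketch of what this first test did; the pin numbers and hold time below are assumptions, not the exact values from our board:

const int buttonPin = 2;    // assumed push-button pin
const int idleLed   = 12;   // LED that stays on while the stool is free
const int holdLed   = 13;   // LED that lights while the stool is "reserved"

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);     // button wired to ground
  pinMode(idleLed, OUTPUT);
  pinMode(holdLed, OUTPUT);
}

void loop() {
  if (digitalRead(buttonPin) == LOW) {  // button pressed
    digitalWrite(idleLed, LOW);
    digitalWrite(holdLed, HIGH);
    delay(5000);                        // hold the "reserved" state for a while
  } else {
    digitalWrite(idleLed, HIGH);
    digitalWrite(holdLed, LOW);
  }
}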



Test Two:

For the second test, we wanted to try illuminating LEDs at different times upon receiving an input. This will later allow us to play with how our display changes when alternating between modes. Similar to the first test, this setup required a push button and LEDs (three this time). When the button is pressed, the LEDs turn on in sequential order.
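And a sketch of the second test, again with assumed pin numbers and timing:

const int buttonPin = 2;                 // assumed push-button pin
const int ledPins[] = {11, 12, 13};      // assumed LED pins

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  for (int i = 0; i < 3; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  if (digitalRead(buttonPin) == LOW) {   // button pressed
    for (int i = 0; i < 3; i++) {        // light the LEDs one after another
      digitalWrite(ledPins[i], HIGH);
      delay(300);
    }
  } else {
    for (int i = 0; i < 3; i++) {
      digitalWrite(ledPins[i], LOW);
    }
  }
}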





What will change going forward:

Ideally, we would test this out with an NFC sensor, but because we do not have that sensor at the moment, we decided to practice this setup with a button instead.

Additionally, we will more than likely transition to using a programmable LED strip. For now we settled on just using multiple LEDs.

Actuator Experimentation

This week we started experimenting with actuators that we think might be helpful for our final project (a sound-sensitive lamp).

LED lights

We started by experimenting with LEDs, turning on a specific color depending on the position of the potentiometer. We had a set of three LEDs (red, yellow, and green) to experiment with. When the potentiometer points to the left, the green light turns on (up to an analog reading of 341); as you move the potentiometer clockwise past roughly the halfway point, the yellow light turns on (readings of 342-681); and as you reach the right side, the red light turns on (readings above 681).
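A minimal sketch of this behavior, assuming the potentiometer wiper on A0 and the three LEDs on pins 11-13 (the pins are assumptions; the thresholds are the ones described above):

const int potPin    = A0;   // potentiometer wiper (assumed)
const int greenLed  = 11;   // assumed LED pins
const int yellowLed = 12;
const int redLed    = 13;

void setup() {
  pinMode(greenLed, OUTPUT);
  pinMode(yellowLed, OUTPUT);
  pinMode(redLed, OUTPUT);
}

void loop() {
  int reading = analogRead(potPin);                     // 0-1023
  digitalWrite(greenLed,  reading <= 341);              // left third of the dial
  digitalWrite(yellowLed, reading > 341 && reading <= 681);  // middle third
  digitalWrite(redLed,    reading > 681);               // right third
}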



Piezo

We also made use of a piezo speaker that turned on when motion was sensed with the ultrasonic sensor. Using code that we found online, we explored different melodies and ended up using "Twinkle Twinkle Little Star." We experimented with adjusting the distance needed to set off the sound. The main purpose of the experiment was to figure out how to manipulate the code to play a specific tune.
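A simplified sketch of the idea, assuming an HC-SR04 on pins 9 and 10 and the piezo on pin 8; the notes below are just the opening phrase of the melody, not the full code we found:

const int trigPin  = 9;    // assumed ultrasonic pins
const int echoPin  = 10;
const int piezoPin = 8;    // assumed piezo pin

// Opening notes of Twinkle Twinkle Little Star (frequencies in Hz)
int melody[]    = {262, 262, 392, 392, 440, 440, 392};
int durations[] = {300, 300, 300, 300, 300, 300, 600};

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Trigger a ping and measure the echo time
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  float distanceCm = duration * 0.034 / 2;

  if (distanceCm > 0 && distanceCm < 30) {   // assumed trigger distance
    for (int i = 0; i < 7; i++) {
      tone(piezoPin, melody[i], durations[i]);
      delay(durations[i] * 1.3);             // small gap between notes
    }
  }
}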




Sunday, October 21, 2018

Sensor Experiment – Netty & Hannah

For our toilet seat, we knew there were two main inputs we would have to sense: the sound (of the flush) and the position of the seat. The first sensor we tried was the ultrasonic sensor, which could possibly sense the orientation of the seat. We found some online guides to follow; unfortunately, we still were not able to make it work, but we did get some pointers from other groups during class that will definitely help moving forward. We also want to try the microphone sound sensor next, to sense the sound of the flush.


Friday, October 19, 2018

Sensor Experimentation - Jacob & Tim

We broke the elements of our picnic table up into a few parts, and for this sensor experimentation we wanted to look into how we might sense pressure and translate that into the brightness of lights. This maps to sensing people sitting down at our picnic table and changing the brightness of the lights as more people sit down. Working off the Button example, we modified the code to constantly check the state of three different buttons. Looping through each button state, we counted how many buttons were on, then used that number to scale the brightness level of the LED.
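A minimal sketch of this logic, with assumed pin numbers; the real table would use pressure sensors instead of buttons:

const int buttonPins[] = {2, 3, 4};   // assumed pins for the three "seat" buttons
const int ledPin = 9;                 // PWM pin for the LED (assumed)

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int pressed = 0;
  for (int i = 0; i < 3; i++) {            // count how many buttons are held down
    if (digitalRead(buttonPins[i]) == LOW) {
      pressed++;
    }
  }
  analogWrite(ledPin, map(pressed, 0, 3, 0, 255));  // scale brightness by the count
}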




Thursday, October 18, 2018

Sensor Experimentation - Joo & Angela


We wanted to use the joystick to give the user multiple color options to turn on. We are thinking these colors would signal an emotion or other factor, letting the user send a signal with minimal effort as a soft indicator that someone was thinking of them.

We experimented with different sensor types, and this is our successful version, created with a joystick (potentiometers as the input). We have four different colors of LED lights as outputs, so each is connected to up, down, left, or right.

We started by following directions from Interfacing Joystick with Arduino, but we modified them to better fit our needs with the help of our classmate Jeremy! He taught us a way to simplify what the tutorial had us do.
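Roughly, our sketch ended up working like this (the pin numbers and joystick thresholds below are assumptions):

const int xPin = A0;                      // joystick X axis
const int yPin = A1;                      // joystick Y axis
const int upLed = 10, downLed = 11, leftLed = 12, rightLed = 13;  // assumed LED pins

void setup() {
  pinMode(upLed, OUTPUT);
  pinMode(downLed, OUTPUT);
  pinMode(leftLed, OUTPUT);
  pinMode(rightLed, OUTPUT);
}

void loop() {
  int x = analogRead(xPin);               // 0-1023, roughly 512 at rest
  int y = analogRead(yPin);
  digitalWrite(upLed,    y < 300);        // thresholds are rough guesses
  digitalWrite(downLed,  y > 700);
  digitalWrite(leftLed,  x < 300);
  digitalWrite(rightLed, x > 700);
}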

   






Sensor Experimentation (Cassie, Angela, and Mike)


This week we started experimenting with sensors that we think might be helpful for our final project (a sound-sensitive lamp).

Microphone Sound Sensor

The most important sensor we're going to need is one that can detect sound, so we started with a microphone sound sensor and followed the Arduino source code found here.

Microphone with LED off
It works by setting a threshold sound value at 0 (or LOW) for ambient silence; if a sound is sensed above that threshold, the sound value changes to 1 (or HIGH). We also hooked up an LED that turns on if the sound value is 1/HIGH and remains off if the sound value is 0/LOW. We're going to keep experimenting with additional sound sensors and lights to find the right combination for our final approach, but this was a useful proof of concept to keep us moving forward.
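A minimal sketch of that behavior, assuming the module's digital output is wired to pin 7 and the LED to pin 13:

const int soundPin = 7;     // digital output of the microphone module (assumed pin)
const int ledPin   = 13;    // assumed LED pin

void setup() {
  pinMode(soundPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int soundValue = digitalRead(soundPin);   // LOW (0) in silence, HIGH (1) on sound
  digitalWrite(ledPin, soundValue);         // LED mirrors the sound value
}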

Tilt Ball Switch

We also found an odd-looking tilt ball switch in a pile of Arduino components and thought we'd experiment with how to use it. After watching this YouTube video about how it works, we decided to modify the Button example to work with the tilt ball switch instead of a button. We also added an additional LED to the breadboard so we could have one green "On" light and one red "Off" light that toggle depending on the position of the tilt ball switch. One other modification we made was printing the output of the tilt switch to the serial monitor so we could see the value it was sending to the board. Here are two example images of the "on" and "off" states:
"On" state
"Off" state
We may use this functionality to mute or snooze our lamp, or we may try a different approach, but it was a useful way to start thinking about how our users will be able to interact with our final product.
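For reference, here is roughly what our modified Button example looked like; the pin numbers are assumptions:

const int tiltPin  = 2;     // tilt ball switch (assumed pin)
const int greenLed = 12;    // "On" indicator
const int redLed   = 13;    // "Off" indicator

void setup() {
  pinMode(tiltPin, INPUT_PULLUP);   // switch closes to ground when upright
  pinMode(greenLed, OUTPUT);
  pinMode(redLed, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int tiltState = digitalRead(tiltPin);
  Serial.println(tiltState);            // watch the value in the serial monitor
  if (tiltState == LOW) {               // ball is closing the switch
    digitalWrite(greenLed, HIGH);
    digitalWrite(redLed, LOW);
  } else {
    digitalWrite(greenLed, LOW);
    digitalWrite(redLed, HIGH);
  }
}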

Sensor Experimentation – Jack Sinclair and Alex Alspaugh

Work In Progress


To work towards our goal of answering "What if a public bench could interpret a moment?", Jack and I started experimenting with a few of the sensors included in our Arduino kit. The goal here wasn't to perfectly replicate the sensors that will be used in our final design, but rather to create something related to it. To do this, we built and tested two different types of sensors: an ultrasonic sensor and a thermistor.

Ultrasonic Sensor


LED Off

LED On

We decided to try the ultrasonic sensor because it could be a sensor we actually use in the final design. We need to be able to detect whether someone is sitting on the bench, and given the ultrasonic sensor's ability to detect proximity, it seemed like a good test run.

To begin, we looked to the internet and found a very helpful article that detailed how to set up the sensor with an Arduino. After we got it working, we decided to add on to it, so we connected an LED and wrote a little if/else statement that turns the LED on and off depending on the distance sensed.
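A simplified version of that sketch, with assumed pins and an assumed distance threshold:

const int trigPin = 9;     // assumed HC-SR04 pins
const int echoPin = 10;
const int ledPin  = 13;    // assumed LED pin

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // Send a short ping and time the echo
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  float distanceCm = duration * 0.034 / 2;

  if (distanceCm > 0 && distanceCm < 50) {   // "someone is close" (guessed threshold)
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
  delay(100);
}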


Circuit Diagram


Thermistor 


Arduino is cold

Arduino is hot


After tearing down the ultrasonic sensor build, we decided to try detecting temperature using the thermistor included in our kit. Although we plan to use a weather API to get the temperature in our final design, we thought this would be a good way to get a feel for how temperature behaves with the Arduino.

To start, we turned to the internet again and found a great article on how to set up the circuit and code. While the article was very clear about what type of resistor to use, being new to this whole Arduino thing, Jack and I struggled to find the correct one, resulting in wildly inaccurate temperature readings. Through trial and error, we finally found the right resistor and got everything up and running as it should. After that, we wanted to add something to make it our own, so we wrote an if/else statement that has the Arduino print "ouch hot" to the serial monitor if the temperature is above 75 degrees and "I'm cold" if it is below. To fake the temperature change, we just pinched the sensor and got a little laugh at the result.
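A simplified sketch of the final setup, assuming a 10k fixed resistor with the thermistor on the ground side of a voltage divider into A0; the resistor value and Beta coefficient below are assumptions, not the exact parts from our kit:

const int thermistorPin = A0;        // junction of the divider (assumed wiring)
const float seriesResistor = 10000;  // assumed fixed resistor value
const float nominalR = 10000;        // assumed thermistor resistance at 25 C
const float bCoefficient = 3950;     // assumed Beta value

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(thermistorPin);
  // Convert the analog reading to the thermistor's resistance
  float resistance = seriesResistor / (1023.0 / reading - 1.0);
  // Simplified B-parameter equation to estimate temperature
  float tempK = 1.0 / (log(resistance / nominalR) / bCoefficient + 1.0 / 298.15);
  float tempF = (tempK - 273.15) * 9.0 / 5.0 + 32.0;

  if (tempF > 75) {
    Serial.println("ouch hot");
  } else {
    Serial.println("I'm cold");
  }
  delay(1000);
}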

Serial Monitor Output



Circuit Diagram

Tuesday, October 16, 2018

Sensing Ideas: Angela Yung, Mike Cardarelli, Cassie Meade


This week we shared three different concepts in class, one for each type of furniture (table, chair, and lamp). After presenting our ideas to the class and thinking about which concept excites us the most, we've aligned on a sound-sensitive lamp that helps make people in multi-unit buildings aware that they're being too noisy.


In order to understand this situation more deeply, we’ll need to consider the following:
  • What is an acceptable volume level?
  • Should users be able to turn off the lamp or only use it during certain time periods (ex: from 11pm to 3am)?
  • What is the length of time that the response signal should stay activated?


As we continue to develop this concept, we also want to focus on how this product might affect our potential users socially and psychologically. How might we use this concept to help people be more considerate of those living around them? How might we solve an uncomfortable situation using technology?


For this concept to work, we’ll need to find a sound sensor that will be able to pick up ambient noise. Here are three microphone sensor options that might work:




Friday, October 12, 2018

Sensing Ideas

Based on previous discussion, we decided to pursue our interactive bar seating idea. On a basic level, we are seeking to create a bar seat that can detect when it is being occupied, even if the user has stepped away.

In order to achieve this we are planning to focus on detecting two things: weight and the presence of a cell phone. Currently, we are keeping the following sensors in mind:

Weight: this will provide the most immediate and obvious information, namely whether the user is physically sitting in the chair at that moment. This variable may affect the way the seat behaves (whether it is trying to save the seat or welcome a new customer).

Near field communication (NFC): almost everyone brings their phone to a bar, so this is a reliable signal to detect. However, we don't want to overcomplicate the naturally simple task of sitting in a chair. NFC would allow users to tap their device on a surface and then carry on with their experience.

Lastly, we are exploring the incorporation of Bluetooth capabilities. Also utilizing the user's phone, this option could allow us to create a social space. While we want the chair to detect and communicate whether it is occupied, we don't want to eliminate social interaction between customers. Because of this, we are exploring the possibility of optional games or conversation starters to encourage interaction.

Sensing Ideas - Alex Alspaugh and Jack Sinclair

Of the three directions, which one have you chosen? Please restate it, in case you need to be more specific, or it has changed.

Out of our three situations (getting ready for bed / waking up in the morning, feeling lonely when you live alone, and sitting on a public bench), we have chosen to pursue the question "What if a public bench could interpret a moment?" This question could apply to many different situations (e.g., sitting alone at a viewpoint, or sitting with a friend at a museum), and the bench could react to each differently, varying the output. We chose this concept because of how open-ended the output can be and the many ways a bench can interpret a moment.

What signals will you be sensing in order to detect your situation? What medium (light, vibration, sound, heat, radio waves, etc) does this signal move through?


To work toward a public bench that can interpret a moment, we have decided to use time and weather to help define the situation and place the bench is in, giving context to the moment. To activate the bench, we want to detect when a person has been sitting on it for some amount of time, using pressure or proximity sensors.


Please make a suggestion of 2-3 possible sensors for each signal you wish to detect. Try your best to link to actual sensors online.


Time

  1. Arduino Time Library
Weather

Thursday, October 11, 2018

Situation Experiment – Tim and Jacob

Backyard Table

Out of our original three directions, we have decided to create a backyard table that brings warmth to eating outdoors. The table would respond as people come to sit and eat. Specifically, as people come together, the eating experience would become physically more enjoyable as the lights become warmer and the heaters turn on. The more people sit down and interact with each other, the more enjoyable the experience. Finally, when the experience reaches peak warmth and enjoyment, the table will take a picture so that everyone at the table can remember the moment.

We will likely be sensing two signals: pressure, which would be transmitted through the benches around the dining table, and sound. As people come and eat together, the table will complement their conversations with more light and warmth. When a certain threshold is reached for both weight and sound, the table will respond by firing a Polaroid so that the moment at the table gets captured.

After some quick research into sensors, we have found a couple that would fit each of the signals we hope to detect. To detect how many people are sitting at the meal, we will use either a robust button trigger or a load sensor to activate the lights. Alongside these sensors, we will use multiple microphones or vibration sensors to detect the varying amounts of interaction at the table. Finally, to trigger the camera, we plan to set up a wireless receiver for the Arduino to connect to, with the capability to fire the Polaroid's shutter.

Sensing Ideas | Joo Oh + Angela Piccolo


The scenario we want to move forward with is "What if a table lamp can detect how much your friends and family miss/love you?" We thought this option would give us a wide variety of different inputs and outputs to brainstorm and play with. We were also excited by the feedback about building in the power of the placebo effect (how your brain can be convinced that a not-so-functional treatment is real) with our idea.

Here is a list of a few signals that we think would work well to detect our situation.


  1. Hugging or hovering over an object for a long time (heat) 
  2. Talking to an object (sound waves) 
  3. Blowing a kiss, with bubbles or fog as an outcome 
  4. Celebrating together by shaking something (vibration) or typing emojis (digital input), with confetti as an outcome 


Here is our list of the different sensors we are considering for each of our signals. We have linked to a potential option for each but are open to finding different ones.


  1.  Hugging or hovering 
    1.  Heat sensor 
    2.  Flex Sensor 
  2.  Talking to an object 
    1.  Vibration Sensor 
    2.  Proximity Sensor 
  3.  Blowing a Kiss 
    1. Conductive Pom Pom 
    2. Air Flow Sensor 
  4. Celebrating Together 
    1. Vibration Sensor 
    2.  Touch Sensor

Tuesday, October 9, 2018

Situation Options: Hannah and Netty

We explored three possible situations/ideas that we could pursue around the relationship between humans and the objects/furniture around the home.

Situation one: Remember Me
We often realize too late that we left something essential at home (keys, wallet, phone, etc.). This lamp would alert users if it doesn't detect those essential items on them before they leave their home.
Situation two: Put Me Down
It can be frustrating when individuals are not courteous enough to put down the toilet seat. This toilet seat serves to "teach a lesson" by emitting a loud and obnoxious scream if the seat is not put down upon flushing. This hopefully helps the individual develop a habit and consider the position of those who don't use the toilet with the seat up.
Situation three: Apologetable
Accidentally hitting a body part on furniture can be one of the most rage-inducing and painful experiences around the home. We often curse the inanimate object, but we hope to create an interesting relationship by having the table apologize and take the blame. Not only does that "humanize" the furniture, it also makes the human victim feel less angry.
Upon discussion with the class, we think situations two and three have the most interesting potential for exploring how what we consider to be static, inanimate objects can develop a relationship with their human users. There's also potential for each object to be "smarter" in terms of responding to multiple situations or giving a more complex response.

Situation Exploration







We began exploring possible situations in which furniture would be present. We then thought about the environment around the furniture and what it could potentially be sensing. When we brought our top three concepts to class to be critiqued, we got the most feedback on our third situation: a home setting in which a lamp helps make sure people are not too noisy. The feedback led us to decide to move forward with this concept, but also to think about its social and psychological aspects. As we continue to develop our furniture concept, we want to look more into how we might use it to help people be more considerate of those living around them.




Situational Options

Situation Options Assignment
K. Hirata | E. Dastournejad

As we rapidly move towards a "smart" future, we become more and more cautious about using screens and having our gadgets gather data. In this assignment, our team focused on some situations that (in our opinion) could be improved with some electronic awareness.

Situation One:
What if a car seat could respond to a baby’s behavior?

In this situation, we aim for the car seat to detect any unrest in the baby's posture and then properly address it with basic seat functions.









Situation Two:
What if a table could detect the arrangement of toys?


In this situation, we are looking for ways that a play table can enhance the playing experience for a kid. We're thinking about motion and color sensors working together with physical actuators and screens.






Situation Three: 
What if a chair could detect when it’s being occupied?


Here we want to let a user reserve a seat at the bar while they're not sitting there (i.e., while using the restroom). In this specific example, we are planning to use NFC and some timer mechanism on the seat itself.







After some fruitful dialogue with Prof. Muren, we decided to go forward with our third option while thinking about how to keep some human interaction.

Situation Options — Jack + Alex

This week we explored three different situations that we thought might be made more interesting or meaningful with some sort of technological intervention. 


1. What if your lamp could sense when you are feeling lonely?



We are interested in how a lamp could possibly help with feelings of loneliness or even depression, perhaps turning on when the weather has been bad to give you more light in your home.


2. What if your nightstand could detect when you were getting ready for bed or needed to wake up?



We thought it could be interesting if your nightstand could somehow aid in preparing you for sleep or getting you up in the morning. Possible inputs it could sense would be the time, or movement in your bedroom. 


3. What if a public bench could interpret a moment?



This is definitely our favorite idea of the three. We think it could be interesting if a public bench could sense when a person sat down, compile contextual data such as the time, weather, and location, and use that to return some sort of receipt, artifact, or memento, potentially adding meaning to the situation for the user.

Situation Experiment - Alea and Kelsey

Living Alone

We are interested in further exploring the situation where a piece of furniture acts differently with different groups of people, but has a special connection to the owner which only shows up when they are home alone.


Away from Home

This is not one of the ideas that we are likely to explore, but we thought it would be interesting to explore what a piece of furniture could do when nobody is home. It could interact with pets and non-owner people who might be in the house.


Waiting Room

We are really interested in this concept because it is a location with a lot of emotion. If a person is waiting to be seen at the ER, the chair could communicate where they are in line. This could ease their concern knowing that others have more pressing health issues. A piece of furniture could also be used to help non-patients who are waiting for their loved one to be finished. People who are alone could get support from the furniture and could be connected to other people in the room who are in the same scenario as them.

Situation Exploration - Tim and Jacob

We began our quarter-long project by brainstorming situations we might want to explore with interactive furniture. We then narrowed it down to the following three specific situations.

1. Coming Home After a Long Day

Images: bedside reading; relaxing in a chair after a long day




2. A Walk Through the Park

Image: friends on a park bench

3. The Dinner Party







Situation Options - Joo Oh and Angela Piccolo






1. What if a table lamp can detect how much your friends and family miss/love you?
We are interested in exploring how people can grow a deeper relationship with others through an object remotely. 

Signals: messages from texts or emails, movement or footsteps, audio

Feedback: We could design the object to lean on the placebo effect instead of always being data-driven, or maybe a combination of the two.




2. What if a chair can detect you are falling asleep in class? 
We think a chair that can detect if someone is falling asleep could alert the person or amuse them, kick them out of the classroom to eliminate the distraction, or blow fresh air on them.

Signals: slouched posture, lowered heart rate, little motion, neighbor feedback, teacher response 

Feedback: This could help the individual or collect data for the whole group.



3. What if a table can sense you are feeling overwhelmed at work? 
We would want to explore how we could help find a way to relieve stress for a person who is at work and might need emotional support. 

Signals: quantity of emails or phone calls, number of objects on the desk, facial recognition, pressure, increased heart rate

We are leaning toward Idea #1!


Sunday, October 7, 2018

Processing Experiment: Seeing Stars

We were tasked with choosing an example within Processing to experiment with and manipulate. I chose the Star example to work with. When you run the code, three stars appear on the screen, each with a different number of points. My goal with this experiment was to change the variables to see if I could manipulate the number of points on the stars, the rotation pattern of each individual star, and the background color.

Original Effect:



My results:




my code:

/**
 * Star
 *
 * The star() function created for this example is capable of drawing a
 * wide range of different forms. Try placing different numbers into the
 * star() function calls within draw() to explore.
 */

void setup() {
  size(640, 360);
}

void draw() {
  background(20, 8, 900);

  pushMatrix();
  translate(width*0.2, height*0.5);
  rotate(frameCount / 200.0);
  star(100, 3, 5, 70, 11);
  popMatrix();

  pushMatrix();
  translate(width*0.5, height*0.5);
  rotate(frameCount / 500.0);
  star(100, 0, 30, 100, 4);
  popMatrix();

  pushMatrix();
  translate(width*0.8, height*0.5);
  rotate(frameCount / -100.0);
  star(100, 0, 30, 250, 100);
  popMatrix();
}

void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}