Reading Connections

  • Chapter 3: Visual Interfaces
    • Layers & Transparency (p. 51)
      • The recipes on Cheffrey’s recipe selection screen are displayed as transparent projections.
    • Transparent Displays (p. 51)
      • The screens for Cheffrey are transparent displays projected onto the wall.
    • File Management Systems (p. 58)
      • Cheffrey has an option to save recipes that the user makes, and uses a file management system to organize these recipes.
    • Motion Graphics (p. 62)
      • Various screens on Cheffrey’s interface have secondary animations and motions.

  • Chapter 4: Volumetric Projection
    • The recipe selection projections for Cheffrey are made up of volumetric projections which display plates of food for the user.

  • Chapter 5: Gesture
    • Turn to Rotate (p. 98)
      • To scroll through recipes using Cheffrey, users must scroll through rotating volumetric projections of plates.

  • Chapter 6: Sonic Interfaces
    • Voice Interfaces (p. 115)
      • Cheffrey uses both visual and sonic elements to display information to users.  
      • Cheffrey uses elements of Limited-Command Voice Interfaces (p. 118) and Conversational Voice Interfaces (p. 120).  The user speaks to Cheffrey with a slightly more limited vocabulary, but Cheffrey also has a conversational element so that it doesn’t feel so robotic and stiff.

  • Chapter 8: Augmented Reality

    • Context Awareness (p. 165)
      • Cheffrey’s system is aware of the users around it and the available ingredients inside of the house.  Cheffrey uses motion activation to sense users, and scanning technology to recognize food.
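The mix of limited commands and conversational fallback noted under Voice Interfaces could be sketched roughly like this (all command phrases and replies are invented for illustration; this is a toy sketch, not how a real voice assistant is built):

```python
# Toy sketch: a small fixed command vocabulary with a conversational
# fallback, so unrecognized speech still gets a friendly reply.
# Every phrase and response below is made up for this example.

COMMANDS = {
    "sort by time":       "Sorting recipes by cook time.",
    "sort by difficulty": "Sorting recipes by difficulty.",
    "repeat step":        "Sure, here's that step again.",
}

def respond(utterance):
    """Match a limited command first; fall back to a conversational reply."""
    key = utterance.lower().strip()
    if key in COMMANDS:
        return COMMANDS[key]
    # Conversational fallback keeps Cheffrey from feeling robotic and stiff.
    return "Hmm, I didn't catch that. Want me to sort the recipes or repeat a step?"

print(respond("Sort by time"))
```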

Week 15B

In class on Tuesday, we decided to rework our idea to make it more futuristic.  We went to the middle room to create new ideations and brainstorm:

In the first image, you can see our new idea of how the recipes would be displayed.  Having everything on-screen made the design feel more like an app than something futuristic, so we decided to have volumetric projections of a plate of food on the cooktop so the user could swipe through a little more naturally and organically.

In the second image, we created a rough walkthrough of using the interface.

  • Step 1
    • You scan your tongue as a form of identification for the app
    • Your fridge, pantry, and any other place food is stored is scanned for any available food at this time as well
    • Once your tongue is scanned, an overview appears on screen that tells you about your taste buds & descriptions of the 5 major taste groups, allergies, etc.  After you’ve read this screen, you can choose to enter the interface
  • Step 2
    • The welcome screen
    • This screen gives you the option to view suggested recipes based on available ingredients & your tastebuds, or you can search other recipes to make/save for later
      • “Based on your available ingredients and current taste bud scan, we’ve found these recipes for you.”
  • Step 3
    • This screen describes the recipe selection screen
    • Each plate would spin on its own axis, and the user would swipe through naturally.  You would have the option to sort recipes by different categories, which would be voice-activated
    • The name of the recipe and a preview would be on the back screen, so users could get a preview of everything they’d need to make the recipe.  As you swipe through the recipes, the preview would change to match whichever recipe was directly in front of the user
    • To select a recipe, you’d place the “plate” onto the cooktop, and the next screen would come up
  • Step 4
    • This screen is the ingredient collection screen
    • A volumetric projection of each ingredient would appear on the cooktop and disappear when the ingredient is placed down
    • There would also be a utensil collection screen after this one is complete; we didn’t draw it because it would be exactly the same as the ingredient collection screen
    • Cheffrey would read the ingredients and/or utensils aloud to you
    • The counter would also identify each ingredient/utensil as it is placed onto the counter
  • Step 5
    • The instructions would be displayed on screen, one-by-one
    • Cheffrey would read them aloud to you, and you could ask for the instructions to be repeated if necessary
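The five steps above can be sketched as a simple screen flow (the state names and ordering are just illustrations of our walkthrough, not real implementation details):

```python
# A minimal sketch of the five-step walkthrough as a linear state machine.
# All state names are illustrative; nothing here reflects real scanning
# or projection hardware.

FLOW = {
    "tongue_scan":    "taste_overview",  # Step 1: identify user, scan food storage
    "taste_overview": "welcome",         # Step 1: taste buds / allergy summary
    "welcome":        "recipe_select",   # Step 2: suggested vs. searched recipes
    "recipe_select":  "ingredients",     # Step 3: place the "plate" on the cooktop
    "ingredients":    "utensils",        # Step 4: collect projected ingredients
    "utensils":       "instructions",    # Step 4: identical collection screen
    "instructions":   None,              # Step 5: steps read aloud, one by one
}

def advance(state):
    """Move to the next screen; None means the session is done."""
    return FLOW[state]

# Walk one whole session:
state = "tongue_scan"
while state is not None:
    print(state)
    state = advance(state)
```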

Some examples of techniques from the readings that we’d be using are:

  • Volumetric projection
  • Layering & transparency
  • File management systems
  • Turn to rotate


Week 15A

Today we talked to Professor Pannafino about our progress so far.  Below is the feedback we received:

  • Something based on specific ingredients
  • The visuals are working
    • Halfway there
    • Limited by the Mac OS style display
      • Could there be a “sort by” etc. option? Health, time, difficulty, etc.
      • Different shapes, different ideations
      • Think of Blue Apron sheets
        • Show all ingredients
    • Organization and structure is there, but try to have different ideations
    • Variation
  • Needs indication of temperature – not just low, medium, and high
    • Thermometer, more detail
    • Think about which colors mean what
  • Push some of the ideas of future design
    • Where do they use vocal commands, etc.
  • Be precise with the Cheffrey illustrations
    • Don’t be like Clippy – people didn’t really like Clippy

We went through and decided who is going to do revisions on which screens:

  • Alyssa
    • Heating screens (Gabi’s original files)
  • Erica
    • Cheffrey screens (Sara’s original files)
  • Gabi
    • Recipe screens (filtering, etc.) (Group original files)
  • Megan
    • My Cookbook screens (Alyssa’s original files)
  • Sara
    • Ingredients/utensils screens (Erica and Megan’s original files)


On Monday, I reformatted the cooktop screens that Gabi worked on.  Below is a comparison of the two:

I took Gabi’s general ideas and tried to combine them with some of the elements of the other screens.  I also wanted to work on the tongue scanning technology that would read taste buds and dietary preferences and suggest recipes based on them, but unfortunately did not have a chance to, so I plan on working on it tomorrow night.


Tonight, I worked on refining the volumetric displays with Cheffrey. Before I began, I referenced current voice recognition technologies. The main three I referenced in my designs were Siri, Cortana, and Google Assistant. Below are pictures I used for inspiration:

This is some inspiration I used for layering sound waves. There are different levels of sound, such as the voice speaking directly to the interface and other background noises. I thought this would help show that the interface recognizes these different levels but is smart enough to know which one to listen to.
Here is a picture of Siri’s home screen. The screen displays a simple prompt which reads “What can I help you with?” and an inactive sound wave, showing that the interface is waiting for you to respond.
Here is a picture of Cortana by Microsoft. Similar to Siri, Cortana also asks “What can I help you with?” with a prompt that gives the user a suggested question to ask.
Finally, this is a portion of Google Assistant that displays the word “Listening…” as you are talking to it. I thought this was smart to show some sort of indication that the interface is actively listening to what you’re saying as you are speaking.
This is the projection the user would see after activating the interface by placing his or her hand on the countertop. Cheffrey, the assistant, greets you and asks if there is anything he can help you with.
This design shows what the projection would look like as you are talking to it. Cheffrey changes slightly and the screen indicates that he is listening. Your words appear in quotes in the center of the screen and the sound waves also reinforce that the interface is being spoken to.
This is a prompt you would see displayed at the end of your cooking session. When you are finished, Cheffrey asks if you would like to add the recipe to your cookbook, and the sound waves are unmoving, waiting for the user to respond.


Today in class, because we were told that our interface was not futuristic enough, I created the cooktop/pot cleaner. Below is what it would look like:

This cleaner would use laser technology to clean even the toughest grime off of your cooktops, pots, pans, mixing bowls, and even utensils. As you can see, the cleaning laser moves from left to right, cleaning the pans and cooktop in its path.

Outside of class this week we were also asked to create some variation of what was done for this class. Below is a bit of a change from what Alyssa did before:

(Screen variation images)


Before I began working on my variations of Alyssa’s “my cookbook” screens, I did some research both from the book and online. From both, I wanted to focus my variations more on style and layout to try to give more of a futuristic look while still keeping key layout features, like layers.

Above, the left image is the original screen Alyssa made, and to the right is the image research I did. I pulled photos of volumetric projections to understand how elements worked with each other and to find commonalities. I also slightly adjusted the color palette. The reading includes a color swatch survey of the most prominent colors used in sci-fi interfaces for each year. I took a screenshot of the colors and pulled the one that I thought would work best. From there I created varying shades and tints to give a full palette (while adjusting the original as needed). The following are the screen variations I made.

For these first variations, I worked more with color. As you can see, each column is a different layout and each row has a different style. The style difference between the first and second rows is in the elements inside of the outermost rectangle. The first column is a layout that incorporates overlapping layers, the second is a simple organized layout, and the third is similar to what we already had, but with the new style.
For these, I tried the dark colors with a mix of lighter ones for more contrast. For the background, I found a lot of futuristic interfaces have a “light source” type of look where either the bottom is the lightest or the top and bottom edges are. I tried to incorporate this in my variations. The difference between rows is the amount of Gaussian blur on the outermost edge; the columns differ in layout.
For these I completely removed the Gaussian blur but kept the glow in the background. The top row is the same as the previous image, but without the blur effect in the background. The bottom has some line work on the top and bottom to add to the “extra” elements the book talks a lot about.
For these rows, I increased the opacity on the background to 100% as opposed to the 85% I was using for everything else. The top row has the light gradient in a different location while still keeping the line work. The second row is similar to the image before with the added line work, but I also added glow to the lines to make them pop more. The third row is the same as the second but has 100% opacity in the background.
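The shades-and-tints step I used for the palette (mixing one base swatch toward black and toward white) can be sketched like this; the base hex value below is a placeholder, not the actual color I pulled from the book’s survey:

```python
# Hypothetical helper: derive a five-swatch palette of shades and tints
# from one base color. The base hex value is an assumed placeholder.

def hex_to_rgb(h):
    h = h.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(rgb):
    return "#" + "".join(f"{c:02x}" for c in rgb)

def tint(rgb, amount):
    """Mix toward white by `amount` (0..1)."""
    return tuple(round(c + (255 - c) * amount) for c in rgb)

def shade(rgb, amount):
    """Mix toward black by `amount` (0..1)."""
    return tuple(round(c * (1 - amount)) for c in rgb)

base = hex_to_rgb("#1fb6c9")  # placeholder teal, common in sci-fi UIs
palette = (
    [rgb_to_hex(shade(base, a)) for a in (0.6, 0.3)]  # two shades
    + [rgb_to_hex(base)]                              # the base swatch
    + [rgb_to_hex(tint(base, a)) for a in (0.3, 0.6)] # two tints
)
print(palette)
```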
Lastly, I took a photo we had of Gabi’s apartment with the countertop and wall and tried to show one of the screens in context.


We were told to come up with a few more ideations for our UI designs, so we all switched screens and tried to design them slightly differently. I added the list of ingredients to the side, slightly changed how the counter reads the ingredients (eggs), and took out the directions and steps because I felt they might not be needed.

Erica’s original design
My ideation

Week 14B

Today, Professor Pannafino was unable to make it to class, so we met during class time to discuss our ideas, refine our storyboarding, etc.

As requested, above are our meeting pictures showing that we met during class.

We spent most of class coming up with a more detailed look at what the interface will look like.  We drew inspiration from the gestures used in Her, and from the Cover Flow list option that Apple provides in its OSX interface.

Based on this, we did the following interface sketches:

We went through the process of making an omelette.  Our interface will be mostly gesture-based with elements of audio integration as well.

Over the weekend, we made a plan to get major screen design done.  We also divvied up some smaller tasks between the 5 of us:

  • Flow chart refinement: Erica
  • Storyboard refinement: Megan
  • Final context photography: Gabi
  • Logo/name design: Alyssa (lettering) and Sara (illustration)

Week 14A

In class on Monday, we got some feedback about our project idea.  Afterwards, we decided to narrow down the exact features we’d like our interface to have, and worked on storyboarding, usability diagrams, and doing a green screen test.


  • Make sure the contaminant detection also covers the counter top and not just floor
  • Like the idea of having the instructions up (ex. on a wall in front of countertop) instead of on the surface
  • Make hand activation to turn on and also recognize your personal profile
  • Magnetic knife cool idea – very useful and effective
  • Maybe add a “self-cleaning” feature
  • “Cool” feature to allow the user to cool things quickly

On Tuesday night, all five of us met up and worked on filming the green screen test. Here is the video:

Here is the storyboard and usability diagram from the child’s perspective:

Here is the storyboard and usability diagram from the young adult’s perspective:


Here is a photo to prove we were all together (we couldn’t get all of us in the picture so Sara offered to take it):


Week 13A

Today in class we presented our ideas to the class (as seen in the last post). One suggestion we received was to create a smart surface instead of using a bunch of chips and such. We were also told that it might be cool to make use of the backsplash and that flat space – maybe the directions are displayed there. Another idea was to have a hologram over the island, coming down from maybe a hanging pot rack. And the last idea we were offered was to use various stations – a cooking station, mixing station, cutting station, etc.

After receiving this feedback, we agreed with getting rid of the chips inside of the pot. We discussed making a clear acrylic sheet that would go on top of the counter that would heat your pots, turn into a cutting station (cut safe), we like induction, so it would not be hot to the touch and might also be self sanitizing. We also discussed having the watch be self sanitizing. Below are some images that we were influenced by: