Week 16A

In class, we received feedback on our two UI animations from other teams as well as Pannafino.

  • Add emphasis to plate in front
    • Brackets, something like that
    • Opacity varying
  • Add secondary animations
  • Show the area of the stove subtly
  • Easy ease the rotation
  • Scan goes down, maybe have some kind of imprint or visual feedback
  • Tone down the blue in the projections
    • See link Tim sent

We spent the remainder of class laying out a game plan for the next couple of days, including writing out a script for the voice of Cheffrey and figuring out what still needs to be filmed.

We met up on Wednesday and worked on filming our final scenes, and spent the remainder of Wednesday night and all of Thursday working on putting our video together.

Week 15 Out of Class

Before we recorded our first volumetric projection, we referred to this video to get an idea of how to do it. Here is a gif of our test video:

test-animation

Alyssa

We met on Saturday morning to shoot some sample video footage, and we discussed creating the gifs for all the different omelettes.  After shooting the videos, I went to the lab and worked on creating the Photoshop files.  Below are 2 out of 4 finished gifs:

cheese_01.gif mushroom_01.gif

Erica

Screen Shot 2016-12-12 at 2.52.11 PM.png

This would be the screen you would see after your tongue scan. It includes the following:

  • a greeting with the date and time
  • a breakdown of your taste preferences based on the scan of your tastebuds
  • a chart tracking your taste preferences and how they change each month
  • current allergy information
  • a prompt from Cheffrey based on your taste preferences

There’s some negative space that could still be utilized and some things that could be rearranged, but for now I think we are on the right track.

Gabi

This weekend I went in and added a little more detail to Sara’s designs and also added two more. This is what we have for now:

screen-shot-2016-12-12-at-2-57-47-pm

Megan

I was tasked with taking the interface screens from Erica, Sara, and Gabi, as well as the gifs from Alyssa, and animating everything in After Effects. I used the same technique and video reference as the apple projection I made earlier this weekend (top of the blog post). For this, I animated both the volumetric plates and the tongue scan. In some areas it looks a little rough, and I am looking into ways to smooth it out to make it seem more realistic.

Recipe Selection

Tongue Scanner

Sara

Since we redesigned most of our interface, I worked on the recipe information that would show up on the wall when the user is swiping through the various plate holograms. Although I think this information should be plain and simple, I tried to add some boxes and glows around different elements in order to make the design look more futuristic. Since we no longer needed the omelette image on this screen, there was more negative space to fill with “futuristic design elements” (easier said than done).

I still think there is a lot of negative space, even inside the boxes, but I’m having trouble figuring out what to add without making the design so busy that it distracts from the basic information it’s displaying.

screen-shot-2016-12-10-at-2-43-29-pm

Week 12 Out of Class

We are creating an interactive interface that uses futuristic technology to teach people of all ages how to cook, and how to cook well. It incorporates audio, visual, and kinesthetic learning styles to make this possible for such a wide range of users. The technology offers the option to cook with multiple people, whether someone wants to join a cooking club or a grandmother wishes to cook alongside her young grandchild, while cooking alone is still an option.

The system we are creating pairs an intuitive watch with chip technology, working together to create a simple learning experience. Some of the most important features of this system include, as previously mentioned, the option to choose whether you are cooking alone, with a partner, or with a group. You can then decide how many servings you would like to make, and the watch will automatically scale the measurements to the appropriate amounts. If you are unsure which recipe you might like best, the watch includes tastebud technology: if you touch the screen and then lick your finger, you will be able to experience what the finished dish will taste like.
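To make the serving-size math concrete, here is a minimal sketch of the scaling step. It assumes a simple ingredient table; the recipe data and the scale_recipe function are purely illustrative, not part of our actual design.

```python
from fractions import Fraction

# Hypothetical base recipe: ingredient -> (quantity for BASE_SERVINGS, unit)
BASE_SERVINGS = 2
OMELETTE_RECIPE = {
    "eggs": (Fraction(4), "whole"),
    "milk": (Fraction(1, 4), "cup"),
    "cheddar": (Fraction(1, 2), "cup"),
}

def scale_recipe(recipe, base_servings, desired_servings):
    """Multiply every ingredient quantity by desired_servings / base_servings."""
    factor = Fraction(desired_servings, base_servings)
    return {name: (qty * factor, unit) for name, (qty, unit) in recipe.items()}

# Example: the cook tells the watch they are making 5 servings.
for name, (qty, unit) in scale_recipe(OMELETTE_RECIPE, BASE_SERVINGS, 5).items():
    print(f"{name}: {qty} {unit}")
```

Fractions keep the scaled amounts in cook-friendly form (5/8 cup rather than 0.625), which is how the watch would likely display them.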

Once the chips have been strategically placed in your pots, you’re ready to cook. When making the recipe, you can have it read aloud, or you may simply read it at your own pace. To simplify measuring and cutting, which are common struggles, the strategically placed chips will project measuring guide lines onto the sides of your pot and guidelines while you cut (to ensure that you don’t cut your produce too thin or too thick). For added ease of cutting correctly and safely, the chip placed on the knife (using body-heat technology) will warn you, via a gentle vibration, when you are holding your knife incorrectly.

The last, and potentially the most helpful, feature of this system is holographic-projection technology. While cooking, instead of continually stepping away from your pot to check the recipe for the next step, you can use a hologram projected in front of you, as well as the audio option mentioned previously.

Once you think that you are finished with your dish, you can use the intelligent-tastebud technology again to experience what the dish is supposed to taste like, compared to how it actually tastes. Spices and seasonings can then be added to taste.

Voila! You have made a delicious dish, you have gained cooking experience and you have used some awesome technology!