AR STUDIO Week 1: Recreate a memory by Maya Pruitt

A DAY AT THE RACES

For this assignment I recreated a day at the Wayne County Fair. As a native New Yorker, I always found this very rural event novel and exciting. My favorite part of the fair growing up was watching Rosaire's Royal Races, a tantalizing, extraordinary show where pigs race to the finish line. One year, my bet won the race - well done Christina Hoguilera, you really brought us home the bacon!

STORYBOARD:

storyboard_pigs.jpg

ASSIGNMENT: Our goal was to create a story within a single project in Maya. We could only use found models (no 3D modeling allowed!), but we could use camera angles or move things around within the setting to capture our scenes. This was my first time using Maya.

Designing Club Culture: Audio-Visual Sculptures by Maya Pruitt

I’d Like to Change the World by Ten Years After

This is a pretty literal visual, but I wanted to apply a distortion to an image over time. The guitar parts without percussion and the airy sound of the vocalist inspired me to create something with a gooey movement.


Don’t You Worry ‘Bout a Thing by Stevie Wonder

My visual interpretation of this song is very colorful to evoke an upbeat mood.

Link to p5.js sketch.


My Generation by The Who

For this last sculpture I struggled to execute what I imagined. The lyrics of the song struck me, as well as its call-and-response structure, which I feel lends itself to an interactive piece. I wanted to play with the idea of stopping the song after the lead sings, so that it would only continue after the user responds (clicks a button, maybe). I also think it would be interesting if words from user input appeared during the response segments; these would represent what users consider their “generation”.
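
As a rough sketch of how this interaction could work in p5.js with p5.sound - the file name and the pause timestamps are placeholders, not real song data:

```javascript
// Rough sketch of the call-and-response idea (p5.js + p5.sound).
let song;
let pausePoints = [4.5, 9.0]; // hypothetical times (s) where each call ends
let next = 0;                 // index of the upcoming pause point
let waiting = false;
let responses = [];           // words users consider their "generation"

function preload() {
  song = loadSound('my_generation.mp3'); // placeholder file
}

function setup() {
  createCanvas(600, 400);
  const input = createInput('');
  input.changed(() => responses.push(input.value()));
  song.play();
}

function draw() {
  background(20);
  // Stop at the end of each call and wait for the user's response.
  if (!waiting && next < pausePoints.length &&
      song.currentTime() >= pausePoints[next]) {
    song.pause();
    waiting = true;
  }
  // Display the collected words during the response segments.
  fill(255);
  textAlign(CENTER);
  responses.forEach((word, i) => text(word, width / 2, 60 + i * 30));
}

function mousePressed() {
  // The click is the "response" that lets the song continue.
  if (waiting) {
    next++;
    song.play();
    waiting = false;
  }
}
```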

my_generation_sculpture_wireframe.png


Data Art: Week One - Visualizing Hemlock Tree Data by Maya Pruitt

For this assignment we were asked to visualize a dataset about a hemlock tree that lived from 1579 to 2000. The data includes ring width in millimeters for each year, as well as the growth index.

Visualization #1:

This takes the in-class example and animates it to show change over time. By changing the for loop into an if statement, the points can be drawn one at a time, which gives a nice effect of information accumulating over the lifespan of the tree.
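
A minimal sketch of that technique - the data loading, canvas size, and 0-5 mm mapping range are my assumptions, not the actual in-class code:

```javascript
// The for loop from the in-class example becomes an if in draw(): one new
// point is added per frame, and the canvas is never cleared, so the plot
// accumulates over the tree's lifespan.
let ringWidths = []; // assumed filled from the hemlock CSV in preload()
let i = 0;

function setup() {
  createCanvas(842, 400);
  background(255);
  stroke(0);
  strokeWeight(3);
}

function draw() {
  // No background() call here, so earlier points stay on the canvas.
  if (i < ringWidths.length) {
    const x = map(i, 0, ringWidths.length, 0, width);
    const y = map(ringWidths[i], 0, 5, height, 0); // assumed 0-5 mm range
    point(x, y);
    i++; // advance one year per frame
  }
}
```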

Visualization #2:

This version creates rings based on the value of the ring width. It is laid out linearly like the in-class example to create a familiar timeline visual: the leftmost side is 1579 and the rightmost is 2000, and a larger ring outline indicates a larger ring width. This visualization is quite chaotic and hard to decipher.
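
Roughly, in p5.js - again assuming the data array and a guessed mapping range:

```javascript
// One circle outline per year along a timeline: 1579 at the far left,
// 2000 at the far right, diameter mapped to that year's ring width.
let ringWidths = []; // assumed filled from the hemlock CSV in preload()

function setup() {
  createCanvas(842, 200);
  background(255);
  noFill();
  for (let i = 0; i < ringWidths.length; i++) {
    const x = map(i, 0, ringWidths.length, 0, width);
    const d = map(ringWidths[i], 0, 5, 2, 80); // assumed 0-5 mm range
    ellipse(x, height / 2, d, d);
  }
}
```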

Visualization #3:

This visualization is meant to replicate the tree cross sections that show concentric circles of growth. It is a very literal interpretation, but it challenged me to truly represent the meaning of ring width. Each year, a new ring forms around the previous one; the ring width is the space between rings. Ultimately, the tree’s lifespan is represented by the sum of all the ring widths.
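
A sketch of that cumulative construction - the mm-to-pixel scale factor is a guess:

```javascript
// Concentric rings: each year's ring forms around the previous one, so the
// radius is the running sum of ring widths up to that year.
let ringWidths = []; // assumed filled from the hemlock CSV in preload()

function setup() {
  createCanvas(600, 600);
  background(255);
  noFill();
  const s = 0.5; // hypothetical mm-to-pixel scale to fit ~400 years
  let radius = 0;
  for (let i = 0; i < ringWidths.length; i++) {
    radius += ringWidths[i] * s;               // width = space between rings
    circle(width / 2, height / 2, radius * 2); // diameter = 2 x running sum
  }
}
```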

Visualization #4:

For this one I wanted to extract the other column of data, the growth index values, and represent how the index increases or decreases from year to year. I think the algorithm I used is a bit off, but it is meant to depict an increase as green circles and a decrease as red circles.
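
One way the intended algorithm could look in p5.js - the scaling constants are my assumptions:

```javascript
// Year-over-year change in growth index: green circles for an increase,
// red for a decrease, sized by the magnitude of the change.
let growthIndex = []; // assumed filled from the hemlock CSV in preload()

function setup() {
  createCanvas(842, 200);
  background(255);
  noStroke();
  for (let i = 1; i < growthIndex.length; i++) {
    const delta = growthIndex[i] - growthIndex[i - 1];
    fill(delta >= 0 ? color(0, 170, 0) : color(200, 0, 0));
    const x = map(i, 0, growthIndex.length, 0, width);
    circle(x, height / 2, 4 + abs(delta) * 20); // bigger change = bigger dot
  }
}
```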

Mass Extinction by Maya Pruitt

What? Mass Extinction is an interactive geological rock wall and augmented reality application designed for a museum exhibition setting. Each layer represents a different geological time period and each dark brown line, a different mass extinction in Earth’s history. Users are encouraged to step into the shoes of a geologist doing field work. Using a tablet, users can excavate the wall, searching for 7 clues of mass extinction that serve as AR object targets. When found, the targets trigger hidden AR content (3D models, animations, or diagrams) to appear on the tablet. Users can also read more written information corresponding to each object.

How? This 6’ x 6’ rock wall is carved by hand out of foam. The AR application built with ARKit in Unity features a scan mode with orange user interface elements to indicate users must search for clues. When an object target is found, the UI elements turn black, a score count begins, and users have the option to touch the screen to toggle and scroll for more information.

Why? Mass Extinction seeks to bridge the gap between the academic scientific community and the general public. It uses interactive technology to communicate complex research and make it more accessible, digestible, and fun. The exhibit aims to educate users about geology, Earth's history, the causes and impacts of mass extinctions, and about creatures that existed throughout time.

Created in collaboration with Dylan Dawkins, Mingna Li, and Emily Lin. Inspired by the work of Dr. Michael R. Rampino, professor of Biology and Environmental Studies at New York University. 

Mass Extinction was exhibited at the American Museum of Natural History and the ITP Spring Show 2019.

CataclysmVR by Maya Pruitt

Cataclysm VR seeks to recreate the most recent mass extinction in history - the destruction of the dinosaurs 66 million years ago. Experience a cataclysmic asteroid impact that we could never have witnessed firsthand.

How? This VR experience was designed in Unity. Users are encouraged to sit in a swivel chair to give them 360-degree movement. The mechanics work so that the user moves forward in the direction of their gaze when they push a remote button.

Why? CataclysmVR was an exploration of the balance between explanation and self-discovery. When illustrating a time in history, should it be a cinematic experience or an interactive narrative? We were really curious about choose-your-own-adventure style dynamics and how users might be affected by different possible endings. We played with ideas of eliciting emotion, granting/restricting agency, as well as creating suspense and surprise.

The experience incorporates technical applications of NavMeshes, AI, animation, transitional scenes, and spatial sound in Unity.

Link to presentation deck.


Production Process by Maya Pruitt

Since my focus was fabrication, the majority of the process documented here shows how the physical wall was created. However, sprinkled in are progress shots of the AR development and visual design.

WALL PLANNING

To get a sense of the scale, we used butcher paper on the wall to start planning the size of our rock wall. It was important that the interactive be large enough to feel realistic and immersive. We settled on making it 6 ft tall. However, we knew it would also need to be portable, so I suggested we fabricate it in sections that could be pieced together and taken apart at will. We considered using PVC pipe or other materials for structural integrity, but with some research and advice from a theatre set designer, I learned that EPS foam would be the perfect material for us. It is dense enough to hold itself up given a substantial footprint, light enough to transport, and most importantly, malleable. It can be carved and painted so well that it transforms into something that looks completely unlike foam.

We ordered 13 blocks of EPS foam at a size of 1’ x 1’ x 3’.

EARLY AR TESTS

Dylan began testing object tracking in AR. Instead of using an image target, the camera of a mobile device would be able to recognize 3D objects. We felt that with our wall being so sculptural, this would be the most effective way to spawn AR content.

IMG_1363.JPG

INITIAL FOAM CARVING TEST

While waiting for the final foam to arrive, I practiced carving on smaller pieces.

FOAM STACK ARRIVES!

IMG_1218.JPG

We made sketches to plan out which objects to carve into the wall and which layers of Earth’s history to represent. We wanted to make sure that the layers were in the correct order and that the fossils or artifacts embedded in each one truly came from the time period we were indicating. We projected a digital sketch onto the wall to get a sense of the sizing for each object.

projection_rockwall_plan

T-REX CARVING

I started with the T-REX skull, knowing it would be ambitious, and wanted to get a head start. It demonstrates the carving process well. I began by sketching onto the foam. Then I carved away using box cutters, heated wire cutters, sandpaper, and even my hands. Once the carving was to my liking, I applied Hot Wire Foam Factory all-purpose coating. This powder acts sort of like cement once water is added: it covers the foam and dries hard, giving it durability and a very rock-like texture.

VISUAL DESIGN

For one of the AR components we wanted to create a taxonomy tree to illustrate the lineage of the shrew that survived the extinction of the dinosaurs. All placental mammals descend from this ancestor. I created a traditional taxonomy tree but drew the animal silhouettes in the graphic style chosen by Emily. I envisioned the tree growing from a fossil of the shrew on the wall and extending beyond the physical layers to illustrate that these creatures exist in our current geological time period.

Mammal taxonomy tree.

Designed by Maya Pruitt

The mammal tree appears in AR. This panel can be toggled on if users want to read more information.

Designed by Emily Lin.

AR TEST

Dylan used the dino feet that I carved as an AR object target. It gave us an idea of how a device’s camera would pick up the material and colors. This test also allowed us to play with the scale of the taxonomy tree.

FINAL TOUCHES

Documented here are more photos of the wall over time. Every item was hand carved, as was the wall texture itself. Not all of the objects embedded in the wall would become object targets, but we wanted them to have just as much detail so users could not tell the difference. This encourages more of a searching action and was meant to mimic the fact that scientists do not find all the answers right away.

After carving, the wall was coated. Once dried, the pieces could be gessoed and then finally painted. We wanted the wall to feel realistic to the touch but decided on a more whimsical color scheme. The colors are based on real rock colors like clay and sandstone, but making them bold allowed the layers to feel more distinctive and catch the eye.

Syng. by Maya Pruitt

What? Named after the Greek word root syn meaning “with” or “together”, Syng is a synesthesia-inspired ear training web application designed to improve its users’ singing skills and note recognition by visualizing pitch.

How? Syng uses an open source machine learning model, ml5.js pitch detection, to identify a pitch from microphone input. The model reports the frequency of the sound in hertz, and Syng matches that frequency to its corresponding musical note (within a margin of error). The application guides singers through a three-part experience.
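
The core of that loop might look something like this - a sketch following the ml5.js pitchDetection examples, not Syng’s actual source; the './model/' folder (the CREPE model files) and the polling pattern are assumptions:

```javascript
// Detect a pitch from the mic, then snap its frequency to the nearest note.
const NOTES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];
let audioContext, mic, pitch;
let currentNote = null;

function setup() {
  createCanvas(400, 400);
  audioContext = getAudioContext();
  mic = new p5.AudioIn();
  mic.start(() => {
    // './model/' holds the pre-trained CREPE model files (assumed path)
    pitch = ml5.pitchDetection('./model/', audioContext, mic.stream, listen);
  });
}

function listen() {
  pitch.getPitch((err, frequency) => {
    if (frequency) {
      // Convert hertz to the nearest MIDI note number (A440 = MIDI 69).
      const midi = Math.round(69 + 12 * Math.log2(frequency / 440));
      currentNote = NOTES[midi % 12]; // note name, regardless of octave
    }
    listen(); // keep polling the model for new pitches
  });
}
```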

Part 1: Intro. The intro pairs each note, regardless of octave, with a particular colored circle. The objective of this introduction is to establish for the user that sounds have a pitch frequency and therefore correspond to musical notes. These note-color pairings remain the same throughout the entire experience to create consistency. For example, the note C will always be red.
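
In code, that pairing could be as simple as a lookup table. Only C = red is specified in the writeup, so every other color below is an illustrative placeholder (sharps omitted for brevity):

```javascript
// Note-to-color pairing, constant across the whole experience.
const NOTE_COLORS = {
  C: '#e63946', // red, as described: C is always red
  D: '#f4a261', // placeholder
  E: '#e9c46a', // placeholder
  F: '#2a9d8f', // placeholder
  G: '#264653', // placeholder
  A: '#6d597a', // placeholder
  B: '#b56576', // placeholder
};
```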

Part 2: Pitch Match. Pitch Match is the ear training mode. Users can play a tone and sing it back with a visual cue of whether they are sharp, flat, or perfectly matched. 
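
A sketch of the comparison behind that cue, measuring the gap between sung and target pitch in cents - the 10-cent tolerance stands in for Syng’s margin of error, whose real value isn’t documented here:

```javascript
// Classify a sung frequency against a target tone as sharp/flat/matched.
function pitchFeedback(sungHz, targetHz, toleranceCents = 10) {
  const cents = 1200 * Math.log2(sungHz / targetHz); // signed gap in cents
  if (cents > toleranceCents) return 'sharp';
  if (cents < -toleranceCents) return 'flat';
  return 'matched';
}

// e.g. pitchFeedback(446, 440) -> 'sharp' (446 Hz is ~23 cents above A440)
```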

Part 3: Perform. This final freestyle mode provides a more elaborate and expressive visual while singing. The color changes to follow the pitch mapping used throughout the whole piece, and the opacity changes with the vocalist’s volume.
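
Continuing the snippets above, the Perform mapping might reduce to something like this - mic, NOTE_COLORS, and currentNote come from the earlier sketches, and the 0.3 level ceiling is a guess:

```javascript
// Color follows the current note's mapping; opacity follows mic volume.
function draw() {
  const level = mic.getLevel();                   // mic amplitude, 0.0-1.0
  const alpha = map(level, 0, 0.3, 0, 255, true); // louder = more opaque
  const c = color(NOTE_COLORS[currentNote] || '#ffffff');
  c.setAlpha(alpha);
  background(c);
}
```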

Why? A common hurdle for novice singers and even advanced vocalists is learning to stay on pitch. An inexperienced singer may hear a note but be unable to reproduce it precisely. But if singers are not familiar with musical notes or even with hearing themselves sing, how can they identify their mistakes? Syng was created to provide approachable music education that uses the combination of sound and visualization to enhance learning.

The experience incorporates technical applications of dynamic web development, machine learning, JavaScript, and HTML/CSS.

See more about the research and design process:

syng_margot.png


Music Interaction Design: Final by Maya Pruitt

The time has come! After a whole semester developing this project, I’m proud to show how far it has come.

After user testing, I decided in this last week to retain a lot of the existing functionality because it actually went over quite well - why fix something that isn’t broken? I put my attention towards aesthetics to highlight its simplicity. It is a visual piece and should look the part!

I learned so much about HTML and CSS. It was amazing to feel that in control of the aesthetics of web design. I played with some different mappings in the performance mode but felt it already looked the best I could make it. I did successfully change the mapping in pitch match to be a bit smoother, which felt like the area that needed it most. Lastly, I was able to combine the pages so that it truly feels like a navigable website.

I decided to emphasize the order and make it feel more like a progression. Each mode builds on the previous one.

In future iterations, I would love to add a harmony mode with color mixing and experiment with multiple mic input so it can be a collaborative experience.

Visual Style Guide:

syng_home.png
syng_intro.png
syng_pitchmatch.png
syng_perform.png

Music Interaction Design: User Testing by Maya Pruitt

Accurate pitch detection was my biggest challenge in this project journey. However, once I got a handle on ml5 and its pre-trained pitch detection model, my concept got new life. I focused on creating three modes for the user test: an introduction to learn how the colors are associated with notes, a pitch match, and a free-form performance mode.

I was really curious how people would interact with the project and if it would actually feel useful. My paranoia has been that the concept is too simple, but I was pleasantly surprised and received great feedback.

FUNCTIONALITY

My top priority was making this work. Are the mappings informative? Does it accomplish the goal (visualizing singing) in a meaningful way?

Color notes:

“The first one is confusing, I’m not sure what to do”

“I’m not sure what’s going on”

For the first mode people seemed a bit lost about what to do. There was a natural tendency to look for a pitch match scenario. Someone wanted less explanation of the task, another person wanted more, and most people didn’t really read the instructions.

Pitch Match:

“Pitch match is more effective for learning, maybe you need more explanation or a visual cue.”

“I’d like more feedback when I’ve gotten the right note”

“It’s satisfying seeing the circles match”

“i’m not that bad”

“I need this”

Pitch Match came across as very straightforward, and most people enjoyed it as a challenge. I liked watching people play the note many times to recenter themselves and try again. This is exactly how I intended it to be used.

Performance:

“there is good initial feedback in performance mode. It’s a bit fast”

“that initial fill is very surprising, maybe that can be used more”

“I can see the range of the song in the performance mode, saturation is mapped well to volume”

As expected, this mode got a lot of feedback because it’s more expressive. People seemed to genuinely like the color-to-frequency mapping as well as the opacity changes based on volume. It was fun watching people get excited about singing a song when it was paired with the visuals - even their countries’ national anthems!

AESTHETICS

This project has a huge visual component so I took seriously any comments about improving its look.

“In the performance mode, maybe the graphics could be more feathery or painter-like”

“it should be as stylish as possible”

“circle buttons?”

“I like the color, its so beautiful”

My next step is to improve the navigation between these modes and make it resemble more of a website. I would love to add more modes, but will see how time allows. I feel better about it than I thought and am excited to make it polished.

Soft Robotics: Inflatables & Bio-Inspiration by Maya Pruitt

MYLAR EXPERIMENTS

The shiny silver quality of mylar called to my inner magpie. I knew it would be easy for me to get lost in making just aesthetic creations, so I tried really hard with my experiments to think about the functional quality of mylar and inflatables.

Experiment 1: Lift and Movement

For this experiment I made simple small circle shapes, but thought about how they would act underneath a flat surface. It is easy to see how inflating can lift a surface upwards. In future iterations I think this would translate well to programmable air. I couldn’t get my hands on tubing to make this work how I wanted, but I would be really curious to see if blowing up each of the circles at different times could cause the board to move. I predict either a sort of shimmy movement, or that it would remain stationary and just create different surface levels.

Underside of the flat surface.

Experiment 2: Claws

Still intrigued by the idea of movement, I loved looking at the inflatable gripping robots and wanted to experiment with mylar’s ability to curve.

The first claw uses a technique similar to the example in class: a long, slender shape with tacks along the middle, though I made the tack lines really thin and close together. The “fingers”, however, were too long and didn’t quite achieve the closure I was looking for.

The second claw has shorter fingers, with the tacks spread slightly further apart. This one closes much better, but the “fingers” still don’t touch. It definitely wouldn’t be able to pick anything up. I think length matters a lot in terms of how the fingers close, and the way the inner walls are constructed would make a big difference in how much the mylar can curve.

ezgif.com-optimize.gif
ezgif.com-optimize copy.gif

Experiment 3: Temperature retention

I read up on mylar and learned that because of its reflective surface, it is often used to keep things cool. I wanted to test this by wrapping inflated mylar around one of two glasses, each holding an ice cube. Which would melt first?

The inflatable lost air quite quickly, so I don’t think the mylar was touching the glass enough to create true insulation, but the ice in the non-mylar glass did appear to melt slightly faster.

IMG_8167.JPG
IMG_7866.JPG
IMG_9135.JPG
IMG_4386.JPG

BIO-INSPIRATION: PARASITIC FUNGI

Parasitic_fungi_on_a_dead_arthropod_(31916986775).jpg

I recently watched an episode of Planet Earth, which featured a segment on parasitic fungi. I apologize for the graphic nature of such a choice and for the photograph depicted here, but it was just so fascinating to me that I wanted to write about it. There exist many different types of fungi that prey on insects, infect their brains, kill them, and then grow out of their bodies. It is truly a horrific survival tactic, and almost so sci-fi it is hard to believe it exists in nature.

The reason I think it would be interesting to transfer over to technology is the idea that it can spawn from any “shell”. When you think about it, this sort of exists in terms of software, which we can install on different machines. Or perhaps computer viruses emulate this biological example as well. But what if we could do this with hardware too? I’m imagining something a step further than modularity, something that really emphasizes emergent growth. Wouldn’t it be cool to have electronics that attach to different kinds of bases and self-install/grow/learn/evolve? I’m not exactly sure how this would work or how to translate the tech from the bio-inspiration, but the fact that the fungi spreads by hijacking existing systems is really intriguing to me. It would be nice to transfer it to technology with a more positive connotation.