Phase 3 final project

Phase 3 paper

Visualization website   

Sentiment visualization

Motion trail plot

As discussed in the Viacom visit, our original goal for the visualization was to put the animation in three frames and attach the sentiment data to the body, which would demonstrate the different engagement levels between participants with different VR experience and the impact of different headsets. However, as we went further, we found it quite difficult to integrate the physiological data with the movement in three.js in a proper way, so we decided to visualize the physical motion on its own, without attaching the brainwave and heart rate data, alongside a motion trail dot visualization. We will also visualize the sentiment and heart rate data separately from the 3D animation in a more traditional way, using a pie chart, a real-time heart rate figure, and a sentiment heat map.

Screen Shot 2017-12-15 at 6.03.24 AM.png

Our prototype, which compares across headsets, experience levels and stories, with heart rate attached

One simplified visualization we have now is the motion trail dots in 3D space, built from the x, y, z values of three joints from the motion capture system. It does not contain a skeleton or the animation of the entire body, but the color represents the heart rate, and the user can change the view by rotating the cube.

Screen Shot 2017-12-15 at 6.19.59 AM.png

3D scattered motion trail plot, Github link is here
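For reference, below is a minimal sketch (assuming a recent three.js build) of how such a point cloud can be constructed: one dot per joint sample, colored by the heart rate recorded at that moment. The `samples` array and its fields are illustrative placeholders rather than the actual format in our repository.

```javascript
// Minimal three.js sketch of the motion-trail dot plot: one dot per joint sample,
// colored by heart rate. Assumes `samples` is an array of {x, y, z, heartRate}
// objects already parsed from the motion-capture export (hypothetical shape).
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

function buildTrail(samples, minHR, maxHR) {
  const positions = new Float32Array(samples.length * 3);
  const colors = new Float32Array(samples.length * 3);
  const color = new THREE.Color();
  samples.forEach((s, i) => {
    positions.set([s.x, s.y, s.z], i * 3);
    // Map heart rate to a blue (low) -> red (elevated) hue.
    const t = (s.heartRate - minHR) / (maxHR - minHR);
    color.setHSL(0.66 * (1 - t), 1.0, 0.5);
    colors.set([color.r, color.g, color.b], i * 3);
  });
  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
  return new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.02, vertexColors: true }));
}

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.set(2, 2, 2);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, renderer.domElement); // lets the viewer rotate the view

// scene.add(buildTrail(samples, 60, 120)); // add once the samples array is loaded

function animate() {
  requestAnimationFrame(animate);
  controls.update();
  renderer.render(scene, camera);
}
animate();
```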

Another motion visualization was done using the BVH file. In this visualization, the viewer can choose the physical movement across different stories; it provides a complete animation, but the comparison is not presented at this stage.

Untitled.png
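As a rough illustration of the approach, here is a sketch using the BVHLoader that ships with three.js's examples (not the exact bvhplayer code we are adapting); the story file path is a hypothetical placeholder.

```javascript
// Sketch of the BVH-driven skeleton animation using three.js's example BVHLoader.
import * as THREE from 'three';
import { BVHLoader } from 'three/examples/jsm/loaders/BVHLoader.js';

const scene = new THREE.Scene();
const clock = new THREE.Clock();
let mixer = null;

function loadStory(url) {
  new BVHLoader().load(url, result => {
    // BVHLoader returns a bone hierarchy plus an animation clip for it.
    const helper = new THREE.SkeletonHelper(result.skeleton.bones[0]);
    const boneContainer = new THREE.Group();
    boneContainer.add(result.skeleton.bones[0]);
    scene.add(helper, boneContainer);

    mixer = new THREE.AnimationMixer(helper);
    mixer.clipAction(result.clip).play();
  });
}

// The story picker in the page would call something like:
loadStory('motion/elephant.bvh'); // hypothetical path

// Inside the render loop, advance the animation each frame:
function tick() {
  requestAnimationFrame(tick);
  if (mixer) mixer.update(clock.getDelta());
  // renderer.render(scene, camera); // camera/renderer setup omitted for brevity
}
tick();
```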

Apart from the physical motion data, the heart rate and sentiment data over time are also a valuable source to visualize. Although heart rate is not the only indicator of sentiment and engagement level, it is interesting to visualize it to see how different headsets and experience levels affect it. Below is a graph showing the real-time heart rate trend for two participants wearing Cardboard and Samsung Gear; one participant is familiar with VR while the other has only tried it a few times. As can be seen from the graph, between two participants with different experience levels the heart rate reaction can be quite different, especially when the story they are watching is interactive and unconstrained. It can also be noticed that, for the same person with different headsets, the overall heart rate trend is very similar; the Cardboard curve is sometimes ahead of the Samsung Gear one, which suggests the participant generally responded more quickly in Cardboard. This might accord with the finding in the AP report that Cardboard elicited the highest level of stimulation, which is associated with individuals being more attentive than relaxed.

Screen Shot 2017-12-15 at 3.53.10 AM.png
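For context, a stripped-down d3 (v5+) sketch of this kind of two-line comparison might look like the following; the CSV file names and column names are assumptions, not the actual schema of the Multimer export.

```javascript
// Sketch of the two-line heart-rate comparison in d3 (v5+): one path per headset
// for the same participant and story. File and column names are assumptions.
const width = 700, height = 300, margin = 40;
const svg = d3.select('body').append('svg').attr('width', width).attr('height', height);

Promise.all([d3.csv('cardboard_hr.csv'), d3.csv('gear_hr.csv')]).then(([a, b]) => {
  const parse = rows => rows.map(r => ({ seconds: +r.seconds, bpm: +r.bpm }));
  const series = [{ rows: parse(a), color: 'steelblue' }, { rows: parse(b), color: 'tomato' }];
  const all = series.flatMap(s => s.rows);

  // Shared scales so the two headsets are directly comparable.
  const x = d3.scaleLinear().domain(d3.extent(all, d => d.seconds)).range([margin, width - margin]);
  const y = d3.scaleLinear().domain(d3.extent(all, d => d.bpm)).nice().range([height - margin, margin]);
  const line = d3.line().x(d => x(d.seconds)).y(d => y(d.bpm));

  series.forEach(s => {
    svg.append('path').datum(s.rows)
      .attr('fill', 'none').attr('stroke', s.color).attr('stroke-width', 1.5)
      .attr('d', line);
  });

  svg.append('g').attr('transform', `translate(0,${height - margin})`).call(d3.axisBottom(x));
  svg.append('g').attr('transform', `translate(${margin},0)`).call(d3.axisLeft(y));
});
```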

The sentiment change over time for several participants was recorded in the CSV file, so we decided to build a real-time figure showing how the sentiment changed during the story for three participants with different levels of VR experience. As the figure below shows, for the person with little VR experience, the sentiment changed far more often than for the other participants, and the majority of the sentiments are red and orange, meaning stimulating and powerful. This is understandable, because when trying equipment for the first time we are always excited and sensitive. For the participant who is very familiar with VR, the “fascinating” sentiment is dominant and the changes in sentiment are smaller than for the other two. This is probably because when people are familiar with the tools, they pay more attention to the story content, which leads to a higher attention level and fascination.

Screen Shot 2017-12-19 at 7.02.36 PM

We have also used pie charts to compare the sentiments between different stories and different participants; in this case, the percentage of a certain sentiment corresponds to how frequently it appeared during the recording. At this stage we have not added “participant B” yet, because we could not find another person who participated in the recordings for all three of the “Mosul”, “Elephant” and “New Orleans” data. Similarly, the HTC Vive was mainly used to test the “Into the Blue” story while the other two devices were never used for that story, so it is impossible to compare the sentiment across the three headsets for the same story or for people with the same experience level. This is the main limitation in making a complete, interactive sentiment comparison involving the three headsets and different participants.

Screen Shot 2017-12-15 at 6.02.42 AM.png
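A minimal sketch of how one of these pies can be produced with d3 (v5+): count how often each sentiment label appears in a recording, then let d3.pie() turn the counts into angles. The `sentiment` field name and the parsed `rows` shape are assumptions.

```javascript
// Sketch: sentiment frequency -> pie slices. Assumes `rows` is an array of
// records with a `sentiment` label, already parsed from the recording.
function sentimentCounts(rows) {
  const counts = {};
  rows.forEach(r => { counts[r.sentiment] = (counts[r.sentiment] || 0) + 1; });
  return Object.entries(counts).map(([label, count]) => ({ label, count }));
}

function drawPie(svg, data, cx, cy, radius) {
  const color = d3.scaleOrdinal(d3.schemeCategory10).domain(data.map(d => d.label));
  const arcs = d3.pie().value(d => d.count)(data);   // angles proportional to frequency
  const arc = d3.arc().innerRadius(0).outerRadius(radius);
  svg.append('g').attr('transform', `translate(${cx},${cy})`)
    .selectAll('path').data(arcs).enter().append('path')
      .attr('d', arc)
      .attr('fill', d => color(d.data.label));
}

// Example usage: one pie per participant/story, laid out side by side.
// const svg = d3.select('body').append('svg').attr('width', 600).attr('height', 200);
// drawPie(svg, sentimentCounts(rows), 100, 100, 80);
```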

 


Phase 3 update 2

This week I am trying to visualize the sentiment change over time between participants with different experience levels. This will be done using the .csv file captured by Multimer, which contains processed sentiment, brainwave and heart rate data. My group members Jade and Shuai are working on the visualization of the physical data using three.js. Our community partner Francesco also mentioned that animations and traditional plot-based visualizations will be helpful, especially when associated with the participants' different levels of expertise.
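As a starting point, here is a small sketch of how the Multimer .csv rows might be split per participant and headset before plotting; all field names are assumptions about the export, not its real schema.

```javascript
// Hypothetical sketch: group Multimer rows by participant and headset so each
// group can be plotted as its own sentiment or heart-rate series.
// Field names (participant, headset) are assumptions, not the real schema.
function groupByParticipantAndHeadset(rows) {
  const groups = new Map();
  rows.forEach(row => {
    const key = `${row.participant}|${row.headset}`;
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(row);
  });
  return groups;
}

// Example usage with d3 v5+ (file name is a placeholder):
// d3.csv('multimer_export.csv').then(rows => {
//   const groups = groupByParticipantAndHeadset(rows);
//   console.log([...groups.keys()]);
// });
```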

However, when dealing with the physiological dataset, I found many limitations in our current files. For example, there are only two people, David and Arsen, who are “very familiar with VR and frequently watch”; there is no recorded physiological data for Arsen, and David only used the HTC Vive and Samsung Gear, so it is impossible to build a graph that investigates how participants' different levels of experience affect the engagement level by visualizing sentiments.

Screen Shot 2017-12-19 at 6.09.04 AM.png

Screen Shot 2017-12-19 at 6.28.32 AM.png

Therefore, at the current stage, I decided to only visualize the sentiment change over time for the Samsung Gear while watching the elephant story; the data are from Kristin, Jason and David, who have low, high and very high experience levels respectively. Below is a “heat map” style visualization of the dataset from David, where the different colors indicate four sentiments. I will later present a figure that puts the different experience levels together.
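Roughly how the colored strip is drawn, as a d3 (v5+) sketch: one rectangle per recorded interval, with a fixed color per sentiment label. The four labels and the file/field names are my assumptions about the data, not the exact Multimer schema.

```javascript
// Sketch of the "heat map" sentiment strip: one rectangle per recorded interval,
// colored by its sentiment label. Labels, colors, and field names are assumptions.
const sentimentColor = {
  stimulating: '#d62728',
  powerful:    '#ff7f0e',
  fascinating: '#1f77b4',
  relaxing:    '#2ca02c'
};

const width = 800, rowHeight = 40;
const svg = d3.select('body').append('svg').attr('width', width).attr('height', rowHeight);

d3.csv('david_gear_elephant.csv').then(rows => {          // hypothetical file name
  const x = d3.scaleLinear().domain([0, rows.length]).range([0, width]);
  svg.selectAll('rect')
    .data(rows)
    .enter().append('rect')
      .attr('x', (d, i) => x(i))
      .attr('width', width / rows.length)
      .attr('height', rowHeight)
      .attr('fill', d => sentimentColor[d.sentiment] || '#ccc');
});
```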

 

Phase 3 update 1

Screen Shot 2017-12-19 at 5.34.51 AM

We found bvhplayer, which is able to load .bvh files, but we need to modify the code associated with the joints to construct a proper skeleton for our visualization.

Here are some detailed prototypes of our visualization, which show the animation of three participants with different headsets. A heart rate indicator is included to give both a numerical and a color representation. These will be used to demonstrate our ideas during the Viacom visit.

ezgif.com-video-to-gif-2.gif

compare-1024x731-2.png

Atlas of Emotions, an inspirational work for visualizing emotion data

Atlas of Emotions

One characteristic of emotion data is that different emotions are sometimes not totally opposite; instead, there are many internal relations, and they share common properties, each with its own intensity. In our case with the Multimer sentiment data, where several sentiments share a common range of heart rate, attention and relaxation levels, it could be interesting to construct a visualization showing their common properties and their intensity on each criterion.

The Atlas of Emotions is a great example of visualizing emotion data in this way: first it shows the overlap between the five universal emotions, then it delivers a figure showing how different intensities of each emotion result in sub-emotions.

Screen Shot 2017-12-20 at 12.32.58 AM

Screen Shot 2017-12-20 at 12.38.33 AM

Screen Shot 2017-12-20 at 12.38.09 AM.png

 

 

Good viz and Bad viz

 

Good visualization example:

The cause of death in the United States in 2005-2014

Although this is not the best looking data visualization in terms of color and design, it is very informative and interactive. The user can understand and explore the data without much instruction. After clicking the area of any cause in the figure, it transforms into a detailed view of that cause, which shows the trend over age.

In addition, the comparison of causes of death between different races is not presented directly in this visualization; instead, the user needs to click the bars on top, and there is a smooth transition animation when switching to another race. Thanks to the persistence of vision of the human eye, it is not difficult for the viewer to get a general sense of the differences between races and sexes. Although this is not the most accurate way of comparing, it avoids a lot of controversy because there is no need to assign different colors to ethnic and sex groups here.

Screen Shot 2017-12-19 at 4.30.11 AM

Screen Shot 2017-12-19 at 4.39.10 AM

Bad visualization example:

Biggest overachievers in the World Cup

Bloomberg did a visualization about overachievers in the World Cup. The overachiever ranking is calculated by dividing each country's population by its number of FIFA World Ranking points as of June 5, 2014, to determine the number of people per point.
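In code, the metric is just a division and a sort; the population and FIFA point figures below are illustrative placeholders, not the actual June 2014 values.

```javascript
// The "people per point" metric behind the ranking: population / FIFA ranking points.
// Numbers are rough illustrative placeholders, not the 2014 figures.
const countries = [
  { name: 'Uruguay', population: 3.4e6, fifaPoints: 1150 },
  { name: 'Brazil',  population: 2.0e8, fifaPoints: 1100 }
];

countries
  .map(c => ({ ...c, peoplePerPoint: c.population / c.fifaPoints }))
  .sort((a, b) => a.peoplePerPoint - b.peoplePerPoint)   // fewer people per point = bigger overachiever
  .forEach(c => console.log(`${c.name}: ${Math.round(c.peoplePerPoint)} people per point`));
```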

So here is the original ranking data, which shows the top 10 overachievers.

top-10-overachievers-world-cup-2014-2

However, the visualization done by Bloomberg is somewhat less efficient than this table. The interaction is too repetitive: the user needs to click the right button over 30 times to see which country ranked first. Also, despite making use of the map, the viewer can only “travel” from one country to another, and it is still quite difficult to discover trends in the ranking by continent or other units. Compared with the old-fashioned ranking table, it is even harder to understand the data. One way to improve this visualization would be to give the option to explore the map: instead of switching between single countries, the ranking could be represented with different colors, and the detailed ranking and population would appear when the user moves the mouse over a country. In this way, the viewer not only gets a broad heat map showing which continents are generally overachieving in the World Cup, they can also zoom in to compare neighboring countries.

Screen Shot 2017-12-19 at 3.43.47 AM.png

Screen Shot 2017-12-19 at 3.30.38 AM

 

NYSCI visit

The NYSCI visit was a great experience; it was good to see how technology has enabled much more interactive and entertaining ways of education. The Connected Worlds exhibition shows an immersive animated environment, which asks visitors to explore the interconnectedness of different environments, strategize to keep systems in balance, and experience how individual and collective actions can have widespread impact.

Hall of Science

The immersive environment is very realistic and responds to our gestures and decisions on the map quickly: by sowing a seed into the ground, a tree grows immediately. The immersive environment did make me feel that I am part of this virtual world, not to mention the kids, who are even more curious and take much more adventurous actions in this virtual world.

Overall the experience was fascinating and educational; it teaches kids the impact of various human behaviors on the ecosystem. One thing I noticed, though, was that although the kids were always very curious at first, they got bored of this installation much more quickly than the adults. Even if some of them spent quite a while playing around in the virtual world, they did not care much about the long-term impact. There is a big panel on the side which tracks some key numbers of the environment, including water level, population and food condition, but not many kids were looking at it, probably because there is too much text on the screen for most kids to pay attention to. So there may be some extra work for the exhibition designers to do for a better visualization of the environmental change.

GlobalView

In addition, the staff told me that in most cases the ecosystem does not end up well, mostly due to water shortage; kids tend to block the streams for some reason. This raises a question about the design of virtual environments: although kids are heavily immersed in the system, they are still aware that they are “playing” a game, rather than behaving in a realistic world.

 

For the Mathematica exhibition, there are lots of installations which gracefully explain mathematical and physical theories. One of my favorites is the gigantic installation demonstrating the probability distribution; I have seen this graph in books numerous times, but it is mind-blowing to see it in real life, especially at such a huge scale.

Another installation, which helps explain Kepler's laws, is a great visualization involving interaction. As a marble moves closer to the center, meaning it has a smaller orbit, its speed increases significantly, and the user can press a button to release a marble and create their own “galaxy”.

giphy