Prototypes – Xu (Week 11)

Syringe Pump

I made a syringe pump with the help of an open-source design I found online. A syringe pump pushes out small, accurate amounts of liquid, which makes the falling drops much closer to real human tears. The tears need to be controllable, and this machine can smoothly drive the liquid forward and backward using an Arduino.

 

The components of this machine:

Arduino Uno R3, 17V stepper motor, Adafruit motor shield, mounting rail, threaded rod, shaft coupler, smooth rod, linear bearing, syringe, tubes, and some nuts. I also 3D-printed four components. The motor turns the shaft coupler, the shaft coupler turns the threaded rod, and the threaded rod drives the plunger mount forward and backward.

Precedent:

Syringe Pump

Final Prototypes – Xu (Week 12)

Machine Learning
Machine Learning
For the general machine learning steps, I first needed training data to train the network. I took pictures of myself crying and with a normal expression, around 2,000 images of each; these were the training data. The main technique used in my machine learning part is a CNN (Convolutional Neural Network) (11), a widely used technique in deep learning. By studying the given dataset and the corresponding labels (crying or not), the network learns the underlying patterns of an image, i.e., whether it shows any characteristic of a crying person, for example the shape of the mouth or the eyes.

Then, with the trained model, I needed a new set of data to check whether it was accurate. I chose a section of the movie Les Misérables, the song "I Dreamed a Dream". It is an intensely emotional song in which the actress cries and sings at the same time. I wanted to use this song as my final new data to show how my facial classifier reads the movie. I saved each frame of the clip and ran them through the model; the result showed which frames were crying and which were normal.

Then I wrote the results into the Arduino stepper motor code, which expresses them by pushing the syringe forward or backward. If the current frame of the video is classified as crying, it triggers the syringe of the tears machine to push, so that the dummy head drips water and appears to cry.
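The bridge from classification results to the pump can be sketched as a small script. This is a hypothetical reconstruction, not the original project code: it assumes the model produces a per-frame label, and it sends the "on" command only when a crying frame follows a non-crying one, so the pump is not re-triggered on every consecutive crying frame. The serial port name and the pyserial call at the end are assumptions and shown only as comments.

```python
def frames_to_commands(labels, trigger_label="cry"):
    """Map per-frame classifier labels to pump commands.

    Emits "on" when a crying frame appears after a non-crying one,
    so the pump is triggered once per crying episode.
    """
    commands = []
    previously_crying = False
    for label in labels:
        crying = (label == trigger_label)
        if crying and not previously_crying:
            commands.append("on\n")  # the Arduino sketch runs its routine on "on"
        previously_crying = crying
    return commands

# Example: the model labeled five consecutive movie frames.
labels = ["normal", "cry", "cry", "normal", "cry"]
print(frames_to_commands(labels))  # ['on\n', 'on\n']

# To actually drive the Arduino (hypothetical port name):
# import serial
# with serial.Serial("/dev/ttyACM0", 9600) as port:
#     for cmd in frames_to_commands(labels):
#         port.write(cmd.encode())
```

Debouncing on rising edges keeps one long crying scene from emptying the syringe in a burst.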

Final Project Documentation-XU

Name: The language of tears

From a psychological perspective on human tears, and from my own crying experience, I tend to shed sad tears when I see something that has things in common with my own misfortunes. I also tend to weep when somebody weeps in front of me, which always triggers sympathy in my heart. People like to mimic each other's facial expressions when communicating. It is easy to laugh in a group of people who like to laugh, and it is also easy to cry in a sad group. Emotions really communicate between people and influence other people's feelings. I want to focus on this humanized interaction and build an emotional link between humans and machines.

Machines may not have real emotions themselves, but AI can help machines learn to do what humans teach them. I admit emotions are unpredictable and a key feature of being human, but facial expressions and physical states can always trace a person's emotion. For instance, we can conclude that a person is crying from their facial expression, or from two glistening traces of water moving under the eyes. So can AI: it learns what kind of data expresses people's emotions, such as facial expressions, and it can learn to mimic human reactions based on what appears on our faces.

Concept Statement:

By using machine learning to recognize tears and mimic crying as a behavior, I want to build a poetic sense of humanized communication between humans and machines.

Precedents: 

Tears Gun

Eindhoven graduate designs a gun for firing her tears

Artificial Shit Machine

Cloaca – Art(ificial) shit machine

Invisible Disparities

INVISIBLE DISPARITIES, 2011 – ongoing

Presentation slides and Process:

 

Future Iteration

Because of technical problems combining Python with Arduino, this project could not run the whole process in real time. Instead, I wrote the machine learning results into the Arduino code. I will keep trying to solve this technical problem in the future.

Also, after getting feedback from the Major Major show, I think it would be very interesting to put the tubes on my own face instead of using the dummy head, so that the tears machine helps me drop my tears. The project could become a wearable, artistic, personal device that builds a deeper connection between AI and humans.

I will keep exploring this topic, and my interest in language and human expression, in the future. I hope to explore these topics more deeply in my thesis.

Final Project Documentation

Concept + Goals: 

  • Constructing an artificial, small-scale, futuristic ecosystem under a scenario in which synthetic biotic organisms have largely penetrated human beings' daily life.
  • This ecosystem will trigger interactions between human and simulated biotic organisms to raise a discussion of individual bio existence and the ecosystem at a large scale.
  • A system that humans are party to, but not sovereign over.

Intended audience: I imagine this piece living in a gallery. The structures in the piece are seen not as inanimate, fixed objects, but as living entities capable of regeneration and growth. The intended audience is someone who is interested in immersive installation environments and synthetic forms.

Precedents: 

A breath of life by Fraser Ross | Atelier

Muscle wire projects by Jie Qi

7K: new life form

Description of the project:

A.Sketch

Before going into the implementation stage, I created sketches to illustrate the possible scale and appearance of the project (Fig 1.1). The installation should look organic enough to make people think of cells or life forms. Transparent glass and a customized base are considered important elements of the project, both for how it looks and for the setup details. Fig 1.2 is a diagram of how the installation will be set up, including the layout of the components: robotic plants, camera or sensor, and projector. The camera captures the audience's movements, and the projector creates the appropriate visual outcome. Since this is an interactive installation, the main input is the audience's movement, which triggers the robotic plants to respond. The robotic plants are influenced by the human factor but still move randomly, like real plants.

Figure 1.1. Project appearance sketch

Figure 1.2. Project setup and layout

B. Technique

To generate a movement that is intriguing enough, electronic sensors and multiple platforms are required. I connect three platforms, Unity, Arduino, and Kinect, to complete the interaction. The execution details: Kinect uses its infrared camera to detect the audience's body skeleton and find the x, y, z coordinates. Unity runs a C# program that enables a trigger function when the audience touches certain areas (five little yellow human models spread across the screen represent the different locations); when a model is triggered, Unity sends a byte over the serial port to the Arduino. The Arduino code contains a message handler that heats up the wire and rotates the motor; when "on" arrives over serial, the function loop starts to run. To conclude, by connecting these three platforms I can achieve an indirect interaction without using a dedicated sensor. For the final setup, the audience never sees the Unity interface; they just walk around, are captured by the Kinect, and the robotic plants move based on their movement without their noticing. Putting all these techniques together responds to the original idea of this project: a system that people can interact with in an indirect way.
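The Arduino side of this handshake is a simple line-buffered protocol: characters accumulate in a buffer until a newline arrives, then the buffered word is dispatched. A small Python model of that parsing logic (the function names are mine, not from the project code):

```python
def parse_stream(chars):
    """Mimic the Arduino loop: buffer incoming characters until '\n',
    then dispatch the buffered command and clear the buffer."""
    dispatched = []
    buf = ""
    for c in chars:
        if c == "\n":
            dispatched.append(buf)  # processMessage(buf) happens here on the Arduino
            buf = ""
        else:
            buf += c
    return dispatched

print(parse_stream("on\non\n"))  # ['on', 'on']
```

A trailing fragment without a newline is simply held in the buffer, which is why Unity must terminate every command byte sequence with '\n'.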

C. Prototype and Playtest

For the physical prototype, the main material of the robotic plants is radiant acrylic. This is an important design decision: the acrylic itself represents cheapness and mass production, like what we see in CDs or any plastic product. At the same time, the shimmering visual effect makes the robotic plants look futuristic, which fits the context of "largely penetrate into our daily life". I used laser cutting to customize each plant, and electronic components such as muscle wires and motors to create different movements.

Figure 1.3. The first physical prototype with radiant acrylic and stepper motor

After the very first prototype, I found that the movement of the motor was too stiff, and I needed other components that could create a natural movement. The muscle wire is what I chose in the end. It is a unique type of wire that acts like the muscles in our bodies: an extremely thin wire made from Nitinol (a nickel-titanium alloy), known for its ability to contract when an electric current is applied. I incorporated this component into my project, and there are some interesting precedents created with muscle wires that inspired me. For example, in A Breath of Life by Fraser Ross, made with Flexinol, recycled electronic components, and specimen jars, the "flower lamps" reside in a state of death until a human interaction (a breath) brings them to life for a brief moment. Blowing is an appropriate interaction, since trees and plants grow on carbon dioxide. Every living thing needs a home, and plants change themselves to survive in their habitat.

Figure 1.4. "A Breath of Life" by Fraser Ross, 2010.

For my own creation with muscle wire, I tested out several prototypes to make lifelike movement based on the shapes of real plants, such as a leaf, a flower, and a Venus flytrap (Fig 1.5). Using muscle wire was the most challenging part of this project: it is really difficult to manage the right amount of electricity, and there were countless wires to put together. However, after a series of tests, I put all the components together, added the light effect to see the real setup, and then conducted a primary playtest. At this stage, the primary form and aesthetics of the project had already emerged.

Muscle wire movement prototype – Venus flytrap

Based on the feedback I got from the playtest, two aspects needed improvement: first, the movement of the wires and plants was not clear enough; it was sparse and too gentle to notice. Second, the layout of the plants was not well organized; they looked like individuals rather than a whole ecosystem.

D. Finalize

Considering the setup for the exhibition, I made a customized stand to store the projector and all the electronic components inside. Another important decision was to use transparent acrylic to reveal the wires and circuits of the project, because I want people to feel the technology behind the piece, as if those wires were the origin of the robotic plants. Also, the depth camera used to capture people's movements is decorated with robotic plants to make it fit into the project.

Figure 1.5. decoration on the depth camera

Reflection

The criteria used to evaluate this project are mainly based on visitors' behavior during the Major Major exhibition. The most frequent feedback was that people enjoyed its aesthetics, including the lights and the handcrafted robotic plants. At the same time, people wanted to interact with it, see what would happen, and dig more into the concept behind it. The whole interaction takes about three minutes on average. Most of the feedback was positive, but one thing to improve: some people missed the movement, because it happens randomly and there is no clear light source to highlight each component. A future iteration will reassess the project's scale; maybe a large-scale living system can provide a more immersive experience.

 

List of components:

  • Arduino Mega2560

(There are more pins on the Mega, which enables me to connect 6 muscle wires, 1 LCD, 8 LED lights, and 1 stepper motor; 34 pins are needed in total.)

  • depth camera – a nice substitute for Kinect: lightweight, easy to set up, and compatible with the Mac environment. I was supposed to use a Kinect 2 to track people's skeletons, but my Windows laptop broke the night before the exhibition, so I switched from Unity to openFrameworks and from the Kinect to a depth camera to create the desired outcome.

  • muscle wires – I bought tons of them on SparkFun. There are different specs, and I think 0.012″ looks good, not too thin or too thick.

  • tip120
  • 330-ohm resistor
  • 12v stepper motors
  • motor shield for stepper motor
  • 12V 6A power source x2 – each for heating up 3 muscle wires

Presentation slides and Process:

    

Additional resource: For this project, I tried to gather all the useful information about using muscle wires, and I put them all in a google doc

Code: Code in Arduino on the GitHub

WEEK12-Further Iteration

The goal of this project is to try the basic setup for the final project form, with two robotic plants that move differently: one made with a stepper motor (ferns) and the other with muscle wire (flytrap). In this iteration, I want to test the light and visual effects of the project, and differentiate the random and systematic movements of the robotic plants. One important addition in this iteration is the interaction part: using Kinect and Unity to communicate with Arduino, audiences can interact with the set by walking around the room.

Main components:

Arduino UNO

Xbox Kinect 2

0.012″ muscle wires

tip120

330-ohm resistor

12v stepper motors

motor shield for stepper motor

12v 6a power source

How it works:

For a demo, see the playtest video below. Kinect uses its infrared camera to detect the audience's body skeleton and find the x, y, z coordinates. Unity runs a C# program that enables a trigger function when the audience touches certain areas (five little yellow human models spread across the screen represent the different locations); when a model is triggered, Unity sends a byte over the serial port to the Arduino. The Arduino code contains a message handler that heats up the wire and rotates the motor; when "on" arrives over serial, the function loop starts to run. To conclude, by connecting these three platforms I can achieve an indirect interaction without using a dedicated sensor. For the final setup, the audience never sees the Unity interface; they just walk around, are captured by the Kinect, and the robotic plants move based on their movement without their noticing.

Circuit and CODE:

The following circuit shows the details for connecting the muscle wire correctly. Note that a muscle wire has two ends: one end goes to the middle leg of the transistor (TIP120), and the other goes to the positive side, which is the 12V 6A (+) in this circuit. The other two legs of the transistor go separately to GND and, through the resistor, to an Arduino pin; the pin's output controls the transistor and closes the circuit so electricity can flow through the wire. Be very careful with the external power source: don't connect it to the Arduino unless you want to burn it. Just connect the positive and negative sides to the breadboard, like powering a circuit with a 9V battery.

CODE in Arduino:

#include <Stepper.h>

const int stepsPerRevolution = 200; // change this to fit the number of steps
                                    // per revolution for your motor

// initialize the stepper library on pins 8 through 11:
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

String buf;
char c;
int flower = 6;
int flower1 = 4;
int flower2 = 5;
int led = 2;

void setup() {
  pinMode(flower, OUTPUT);
  pinMode(flower1, OUTPUT); // flower1 and flower2 also need to be outputs
  pinMode(flower2, OUTPUT);
  pinMode(led, OUTPUT);
  // set the speed at 60 rpm:
  myStepper.setSpeed(60);
  // initialize the serial port:
  Serial.begin(9600);
}

void loop() {
  while (Serial.available() > 0) {
    c = Serial.read();
    if (c == '\n') {
      processMessage();
      buf = "";
    } else {
      buf += c;
    }
  }
}

void processMessage() {
  Serial.println("processMessage");
  if (buf.equals("on")) {

    // 0.005" wire
    digitalWrite(flower, HIGH);
    digitalWrite(led, HIGH);
    digitalWrite(flower1, HIGH);
    digitalWrite(flower2, HIGH);
    delay(3000);
    digitalWrite(flower, LOW);
    digitalWrite(led, LOW);
    digitalWrite(flower1, LOW);
    digitalWrite(flower2, LOW);
    delay(8000);

    for (int i = 0; i < 6; i++) {
      // step one revolution in one direction:
      Serial.println("clockwise");
      myStepper.step(stepsPerRevolution);
      delay(500);

      // step one revolution in the other direction:
      Serial.println("counterclockwise");
      myStepper.step(-stepsPerRevolution);
      delay(500);
    }
  }
}

Problem encountered:

I spent a lot of time calculating and testing the right amount of electricity for the wire. The wire's spec chart and the voltage calculation formula from the Computational Craft class slides really helped me a lot.
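The voltage calculation mentioned above is Ohm's law applied to the wire's spec sheet: V = I × R, where R is the wire's resistance per unit length times its length, and I is the activation current from the chart. A quick sketch with placeholder numbers (the resistance, length, and current below are illustrative assumptions, not the actual Flexinol specs I used):

```python
def required_voltage(resistance_per_m, length_m, current_a):
    """Ohm's law: voltage needed to push the rated current through the wire."""
    resistance = resistance_per_m * length_m  # total wire resistance in ohms
    return current_a * resistance

# Placeholder specs: 4.3 ohm/m wire, 0.5 m long, 1.5 A activation current.
print(round(required_voltage(4.3, 0.5, 1.5), 3))  # 3.225 (volts)
```

Running the real numbers from the chart this way also shows how many wires a 12V 6A supply can heat in parallel before it runs out of current.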

 

For the playtest plan, I recorded a simple demo video to show the interaction. I would like to know people's reactions to the movement of the plants: is it interesting to look at or not? I am also curious whether they are aware of what interaction they are adding to this system.

WEEK11-final project prototype

1. Physical object prototype with EL wire and 12v Stepper motor:

Using radiant acrylic as the main material, I laser cut the shapes of the robotic plants. To make them move, I simply used a 12V stepper motor for a quiet and smooth rotation. A hole in the customized base of the plant lets it sit on the motor while rotating. To make it look nice in a dark-room environment, I used EL wire to wrap the surface of the motor. Before adding the EL wire for a high-contrast look, the plant looked like this:

-Findings: After this prototype, I started to worry about how to create an interesting movement that would intrigue audiences. The stepper motor is quiet enough for a subtle movement, but I definitely need something more organic or mysterious.

2. Muscle wire test: 

I bought 1 foot of 1mm muscle wire from Tinkersphere and used a 10V 5A power source to heat it up through alligator clips. The movement of the wire is vigorous! I tried to build a circuit using a photocell and a MOSFET to heat the wire based on the amount of light the photocell receives. I found the reference in the book Arduino Wearables (Technology in Action), 1st edition, which has a chapter on shape-memory wires. However, the circuit didn't work because the 9V battery couldn't supply enough current. After a long stretch of research, I found that 1mm muscle wire is too thick to heat without a large amount of current. After this prototype, I started to seek the ideal components for my project, including 0.012″ muscle wires, a TIP120 transistor, and a 12V power source. The next step is testing the most stable circuit to generate the movement I want.

WEEK10-final project description and precedents

Final project description: 

  • I want to construct an artificial, small-scale, futuristic ecosystem under a scenario in which synthetic biotic organisms have largely penetrated human beings' daily life.
  • This ecosystem will trigger interactions between human and simulated biotic organisms to raise a discussion of individual bio existence and the ecosystem at a large scale.
  • The interaction between audiences and this project should be indirect, which echoes the key concept of this project: a system that humans are party to, but not sovereign over.

Interaction/systems diagram:

Timeline with milestones: 

week 1 (4/3-4/8)- basic functionality

  • Communication between Arduino and  unity
  • Projection mapping visual test
  • Decision making: materials, project scales
  • Modeling the possible form of plants

week 2 (4/9-4/15)- Prototype

  • robotic plants prototype
  • project set up prototype
  • User test

week 3(4/16-4/22)- Iterations

  • modify the detail based on feedback
  • building circuits

week 4(4/23-4/30)-  Finalize

  • check the detailed implementation of project based on the exhibition place, equipment
  • troubleshooting physical computing part
  • polish the final project form

Precedents: 

EPIPHYTE CHAMBER by Philip Beesley

UNMAKEABLELOVE by Sarah Kenderdine & Jeffrey Shaw

petting zoo by minimaforms

sensory desserts by Erika Marthin 

 Synthetic Polleniser  by Michael Candy

 

WEEK5 light up led with ultrasonic sensor and photocell

The goal of this assignment:

Create a circuit with ultrasonic sensor and photocell to fulfill the following requirements:

  • If the environment is bright, the LED won’t light up.
  • If the environment is dark but the distance is far away, the LED won’t light up.
  • If the environment is dark and the distance is less than__, the LED will light up.

List of components:

  • Arduino Uno
  • 220-ohm resistor x1 (LED)
  • 10k resistor x1 (photocell)
  • photocell x1
  • ultrasonic sensor x1
  • wires and breadboard

How it works:

A photocell works by changing the analog voltage based on the amount of light it receives: as the light level increases, the analog voltage goes up because the photocell's resistance goes down. The ultrasonic sensor uses its trig and echo pins to send a sound wave and calculate the distance to an object. Put together in this circuit, the photocell first detects the amount of light in the environment, and if it's bright enough (the threshold depends on each photocell; the number changes dramatically because the part is quite unstable), the LED won't light up. If it is dark and the distance between the object (a hand) and the ultrasonic sensor is less than 10 cm, the LED lights up. However, even if you cover the whole photocell to make it dark, the LED won't light up by itself because of the && condition in the code: you still need to be close enough to the distance sensor.

Circuit:

CODE in Arduino : 

const int trigPin = 13; // naming a constant value / giving a variable a name / #define acts as a find and replace
const int echoPin = 12;
const int yellowLed = 8;
const int photocell = A0;
int light;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);   // trigPin sends out the ultrasonic ping
  pinMode(echoPin, INPUT);    // echoPin receives the echo, therefore it is an INPUT
  pinMode(yellowLed, OUTPUT); // the LED is driven, therefore an OUTPUT
  pinMode(photocell, INPUT);  // the photocell is read, therefore an INPUT
}

void loop() {
  long duration, distance;
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = (duration / 2) / 29.1; // half the round-trip time, divided by the speed of sound (29.1 us per cm)

  light = analogRead(photocell);

  if (light < 700 && distance < 10) {
    // Serial.println("work");
    digitalWrite(yellowLed, HIGH);
  } else {
    digitalWrite(yellowLed, LOW);
  }

  // Serial.print(distance);
  // Serial.println(" cm");
  Serial.println(light);
  // Serial.println(distance);
}

Problem:

The photocell is really unstable. I tried connecting two different photocells at first and kept getting 1023 in my serial monitor. Only after I moved somewhere really bright and opened a third, brand-new photocell did I get reasonable numbers from the serial monitor.

 

WEEK 8 MOTORS

This week is about exploring different motors. The first part of the assignment is to briefly describe the differences between DC motors, servo motors, and stepper motors. Basically, a DC motor is good for continuous spinning; it runs at a high RPM (revolutions per minute) and can be used for something like a fan or a blender. A servo motor is easy to connect and can generate high torque. The advantage of a servo is control over the rotation angle, from 0 to 180 degrees; a standard servo cannot rotate beyond that range. A general use for servos is robotic arms, though their movement is rough and noisy. The stepper motor is famous for its precise control of rotation angles. Unlike a servo, it can go a full 360 degrees in small fractional increments. Stepper motors are good for steady, precise, and quiet movement, such as the mechanisms of 3D printers.

For my self-exploration with motors, I chose a DC motor to make a hand blender.

The goal of the project: use a potentiometer to control the speed of a DC motor and make a mini blender.

Components:

1) Arduino

2) D.C. motor

3) NPN transistor TIP120 X1

4) Rectifier Diode X1 (I used 1N4001)

5) 1* 220 ohms resistor

6) 10K Potentiometer

7) Breadboard

8) Jumper wires

How it works: 

Using the potentiometer (an adjustable resistor) to change the speed of the DC motor: as the value of the potentiometer goes up, the motor rotates faster. Circuit sketch: connect the 1N4001 side to pin 3 and the potentiometer to A0.

Code:

Very simple code: remember to scale the potentiometer value from 0-1023 down to 0-255 to fit the range of the PWM output pin. You can use the map() function, but I just roughly divided by 4.
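The scaling step described above can be sketched like this (a hedged reconstruction, not the original sketch): integer division by 4 maps the 10-bit analog reading onto the 8-bit PWM range, which is what Arduino's map(reading, 0, 1023, 0, 255) does to within rounding.

```python
def pot_to_pwm(reading):
    """Scale a 10-bit analog reading (0-1023) to an 8-bit PWM value (0-255)."""
    return reading // 4  # rough equivalent of map(reading, 0, 1023, 0, 255)

print(pot_to_pwm(0))     # 0   -> motor off
print(pot_to_pwm(512))   # 128 -> roughly half speed
print(pot_to_pwm(1023))  # 255 -> full speed
```

On the Arduino itself this value would go straight into analogWrite() on the transistor's base pin.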

Problem: the most difficult part was not the circuit but the blender itself. Because the rotation speed can be really fast, my handmade blender blade was thrown off by the motor after 3 seconds, and I had to tie the blade to the motor with iron wire several times.

 

 

Final Presentation

Humble Logistics: “Us” Behind A T-shirt

Concept:

The convenience created by the logistics system is diminishing the value added during the logistics process. The stories behind the valuable processes of production and supply are LOST. To address this issue, I am creating an interactive installation that uses playful, symbolic interactions and metaphoric visual imagery to trigger people's compassion and raise understanding of the human effort inside the logistics system.

Final Video:

 

Idea Iteration, Inspiration

1.0 Design Values

With regard to the outcome and user experience of the piece, I will deliver the following design values:

Theme:

This piece aims to inform people of the hidden values and human efforts behind the sophisticated logistics system, given the rocketing development of IoT in e-commerce. The piece serves as a model of the logistics system at a global scale.

Reflection:

The experience shall be playful, informative, and visually memorable. The piece uses symbolic interactions and metaphoric visual effects to generate compassionate, relatable feelings after players have interacted with it.

Perception:

The information within the piece is presented as a visual symbol of human effort: fingerprints. The aesthetics aim at a sense of industrial toughness and austerity, reflecting the stereotypes people hold about the logistics system, while light mapping builds a warm ambiance around the piece to generate a sense of compassion.

Application:

The target audience of this piece is e-commerce customers. The piece could serve as an entertainment or advertising installation for logistics in a theme park or museum.

 1.1 Precedents for Inspiration

I believe the best way to tell a logistics story is to use logistics itself. One of my first inspirations was a video of Fieldwork by Ryan Gander on Vimeo. In this piece, a series of objects and assemblages are placed on a vast concealed conveyor belt. As they pass through an aperture in the gallery wall, the viewer is invited to speculate on the stories they tell, as well as their relationship to one another.

Fig.3 Screenshot from video of RyanGander: Fieldwork on Vimeo, May 3, 2018

This piece gave me new ideas about using conveyor belts and transportation devices in design: not only tangible objects can be conveyed, but intangible stories as well.

The work Multiverse, in a long walkway hall of the National Gallery of Art in D.C., gave me another example of how to create an immersive visual experience along a conveyor belt. The piece uses LED mapping on the walls and ceiling to create an astonishing visual journey for people riding the walkway.

Figure 4 Screenshot from video of Leo Villareal’s Multiverse on Vimeo, May 4, 2018

So, based on these precedents, I became interested in exploring the possibility of using a conveyor belt for better storytelling and bringing this element and methodology into a design, which also laid the foundation for the design of my installation.

Process: Design upon Research

2.0 Precedents for Execution and Methodology

Particularly for this project, I want to use white and black to bring a sense of industrial toughness and rigidity into the design, aligning with the stereotype and impression that we, as customers, have of the logistics system, while keeping the piece aesthetically interesting and pleasing. For its shape and form, I created my initial draft (Figure 5) based on the work of the Nike CLC Installation (Figure 6).

 

Figure 5 Initial Draft                                 Figure 6 Nike CLC Installation

This example was created by the Super Nature Design studio, which used LED mapping and cargo boxes to build a digital 3D panel showing the work efficiency of the Nike company. This piece inspired me in how to use lights to set a tone for a work and how to use simple geometric shapes to create a sophisticated structure.

2.1 Prototypes and Tests

To test out the working mechanism of a conveyor belt, I laser cut plywood and paper and built the very first simple prototype, which uses manual force to rotate the wheels that drive the belt.

Figure 7 Prototype One – Mechanism feasibility

The success of this prototype allowed me to move on to testing the technological feasibility of the mechanism, for example how much motor power, or how many motors, I need to move an object of a given weight.

 

 

Figure 8 Physical computing

For this particular prototype, I used a 5V stepper motor to test whether one motor is powerful enough to drive the belt, and also used a potentiometer to test speed control. Moving forward, I need to calculate the friction and choose the right material for the belt.

Figure 9 Material Selection

Figure 10 Prototype Three – Components together

After combining the components, I was able to move a five-pound object (much heavier than a T-shirt) using a single 12V stepper motor or DC motor. The next step is jumping into building the actual installation.

Figure 11 Final Outcome Making in the Progress

2.2 Interactions

According to the article "Selected Activity Coordination Mechanisms in Complex Systems" by Katarzyna Grzybowska, a supply chain, as a sequence of organizations collaborating to provide the largest possible amount of a product or service for the customer, can create very complex interrelation networks at every stage. To simulate this feature of the logistics system, I created three interaction stages: transportation, assembly, and operation.

These decisions are based on the time and scale of the project and, most importantly, on the processes in the logistics system that most heavily involve labor. To enhance the empathetic meaning and endow the fingerprints that appear at the end with relatable personal feeling, I decided to make each activity a finger-based interaction.

      Figure 12.1 Transportation      Figure 12.2 Operation (Trellis)      Figure 12.3 Assembly

For the transportation stage, I used an analog sensor (a potentiometer) and a laser-cut acrylic sheet to make a mini steering wheel (shown in Figure 12.1) that players can use to simulate the activities of workers in the transportation session of the logistics system.

Trellis:

Example code of the logic of Stages (Operation):

The ultrasonic sensor turns on the stage light and the Trellis; entering the right commands on the Trellis lights up the green LED and activates the stepper motor.
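Since the Arduino sketch itself lives in the repository, here is a minimal Python simulation of that gating logic. The distance threshold is an assumed placeholder value:

```python
TRIGGER_CM = 15  # assumed: the shirt is 'at the stage' within this range

def stage_active(distance_cm, threshold_cm=TRIGGER_CM):
    """The ultrasonic sensor wakes the stage: when the mini T-shirt
    comes within the threshold distance, the stage light and the
    Trellis keypad switch on."""
    return distance_cm <= threshold_cm

def operation_step(distance_cm, keys_ok):
    """One pass of the operation-stage loop: the light and Trellis are
    on only while something is in range; the green LED and stepper run
    only once the correct Trellis commands have also been entered."""
    active = stage_active(distance_cm)
    return {
        "stage_light": active,
        "trellis_on": active,
        "green_led": active and keys_ok,
        "stepper_running": active and keys_ok,
    }
```

A shirt arriving (`operation_step(10, False)`) lights the stage but not the green LED; only correct input (`operation_step(10, True)`) starts the motor.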

The operation session (Figure 12.2) is the most complicated part of the logistics system, where many activities take place. I therefore needed to identify the activity most worth emphasizing in terms of human involvement. Beyond manual labor, the mental effort and decision-making that people put into supply chain management is a vital part that tends to be overlooked. Sending the right commands at the right intersection at the right time is the key to making the whole logistics procedure run smoothly without delay. With this significance in mind, I used a mini keypad to simulate the command operation panel of the logistics system, which requires players to push the right command buttons in the right order once the mini T-shirt has been transported to the operation stage.
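The "right command in the right order" rule can be modeled as a tiny state machine. The four-button sequence below is an invented placeholder, not the one used in the exhibition:

```python
class CommandPanel:
    """Tracks presses on the mini keypad: progress advances only while
    buttons arrive in the expected order; a wrong press resets the
    sequence, mirroring a dispatch panel where commands must be
    issued in order."""

    def __init__(self, expected):
        self.expected = expected
        self.progress = 0

    def press(self, button):
        if button == self.expected[self.progress]:
            self.progress += 1
            if self.progress == len(self.expected):
                self.progress = 0
                return True  # full sequence entered: run the motor
        else:
            # wrong button: restart (count it if it was the first button)
            self.progress = 1 if button == self.expected[0] else 0
        return False

panel = CommandPanel(["A", "B", "C", "D"])  # placeholder sequence
```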

The final interaction stage, Assembly (Figure 12.3), requires players to assemble the circuit themselves by pushing two open windows together to close a circuit. This is the station where logistics workers assemble everything and prepare to deliver products to their final destination: the front-end customers' hands.
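Detecting the closed circuit can be as simple as reading one digital input. A sketch of the idea in Python, assuming the two window contacts pull an input pin low when they meet (the active-low wiring is an assumption; with an Arduino internal pull-up it is the usual arrangement):

```python
def assembly_complete(pin_reading, active_low=True):
    """With an internal pull-up, the input reads HIGH (1) while the two
    window contacts are apart and LOW (0) once players push them
    together and close the circuit."""
    return pin_reading == 0 if active_low else pin_reading == 1

print(assembly_complete(1))  # windows apart: not done yet
print(assembly_complete(0))  # windows pushed together: circuit closed
```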

Github for all stages:

https://github.com/jiany457/PCom/tree/master/Humble%20Logistics

 

2.3 Storytelling and Fingerprints

The idea of using fingerprints to reveal human elements was inspired by the project Humans of the World, created by the designer Adrien Colombie. He left on a 365-day trip around the world and, on each day of his trip, interviewed a complete stranger to reveal their unique human perspective. He also asked each of them for a fingerprint and a signature. Every piece comes with the story of the human who carries that fingerprint, capturing the essence of their individuality. Using these, the designer created a 365-piece art series called "The Fingerprints".

 

Figure 13 Screenshot from Humans of the World on Vimeo, May 6, 2018

For me, a fingerprint indicates the uniqueness of individuality, which, I believe, is also a perfect epitome of human effort. The number of fingerprints shows the number of hands that have touched and handled the object. In addition, fingerprints can be obscure or clear, light or heavy, which helps to reveal the level of effort and time spent on each T-shirt.

Figure 14 UV Light Test Outcome

The need to hide the fingerprints and make them appear only at the moment I choose is why I used UV light and UV ink, instead of regular paint, to map fingerprints onto the conveyor belt.

Material:

  1. LED UV light
  2. UV ink from Amazon

Reflections from Major Major Exhibition

3.0 Final Outcome

Figure 15 Final Outcome

3.1 Player Test and Challenges

Based on feedback from user tests, the mapped fingerprints are informatively straightforward, visually powerful, and effective in expressing the human values and efforts in the logistics process. Moreover, watching the little objects move along the conveyor belt in response to their interactions is motivating and engaging, which also helps catch passers-by's attention and stops them to watch. On the other hand, some participants said they wished for an instructional board or graphics at the beginning to show the purpose of the interactions, as well as some context for the project.

Figure 16 Player Test in the Exhibition

There were two main challenges in the design process: finding the right materials for the best visual outcome, and creating an interpretable user interface. In terms of execution, organizing the circuit in a more aesthetic way also took time. The decision to leave the circuit and wires exposed was based on the consideration that they are part of the human effort and process of the logistics system; to make the story behind the system fully transparent, the wires and infrastructure are also crucial parts.

3.2 Evaluation

Overall, the project was effective at delivering the desired content (human efforts) and providing the expected user experience (playful and informative). The method of using UV fingerprints as a metaphor for human effort and value was also considered valid. In terms of time management, I did not spend enough time on building the user experience and prototyping the storytelling, because most of my time went into technology prototypes. In terms of the exhibition, because the room housing the installation was not dark enough, the final visual outcome was not as clear and conspicuous as expected. Participants appreciated the concreteness and relatable stories of the idea and the project. For further development, users suggested more comprehensive and logical interactions and more hybridized mediums: for example, letting the level of difficulty determine the effort, frequency, and time users must invest for a better experience, and combining physical interactions with a digital screen to give the project more interactive dimensions.

3.3 Future Application

I think using a conveyor belt and invisible UV images is a very useful and innovative method for delivering important messages through design, and it gives me new ideas that I can develop further in future projects. The project, Humble Logistics, which works as a model of the logistics system at large, leaves plenty of room for further application in domains such as marketing, data visualization, environmental protection, and energy consumption. For example, if I changed the T-shirt to cigarettes or drugs, the project could turn into an educational piece, one for raising health awareness, or a public-health advertisement.

Conclusion

We are living in a world that keeps pushing us to live at an extreme pace, where a click lets us travel, receive, and obtain. We seek causes and results, and everything in between becomes minimal and invisible. Before we become blinded by the confining world displayed in front of our eyes, it is crucial that we step back and give credit and compassion to those who make all-out efforts to support the world we live in. Their stories need to be told.

Based on feedback, the majority of participants acknowledged that they had not thought about this matter until this project informed them of it. They also agreed that this is an important message worth spreading, and they hoped I would push the project further to broadcast the statement more powerfully.

Final Presentation Slides: