10+11 week_Joyce

Project description

I am creating an interactive device to visualize people’s interaction with online information, using a plant as a metaphor. Through interacting with the installation, people can understand how our interactions influence data and can rethink or imagine our possible relationship with data.

I am researching Internet information because I want to explore how humans would react if data appeared to be a living life form, with a lifespan, biological characteristics and habits, in order to discover a new form of interaction that connects the two and encourages people to rethink their relationship with information.

Precedents

Bionic design


Because I will try to redefine information by adding biological characteristics to it and giving it life, I started to look at bionic design and to investigate how design can derive inspiration from nature. Bionic designs usually imitate the shapes and body structures found in nature, but some also learn from creatures’ special behaviors and their ecological relationships. Studio PSK designed a series of radio devices called “Parasitic Products”[4] that can interfere with electrical equipment, imitating what parasites do to their hosts in nature.

“Parasitic Products” can interfere with electrical equipment

The purpose of the design is to highlight the importance of deviance as a way to instigate paradigm shifts in design. It is a good example of making people rethink the function of products, showing that nature is not always a heroic discourse and reminding us of the aggressive, predatory, and often ruthless lifestyle typical of most organisms. From this project I discovered a possible way to express a product in a bionic form, and it also guided me toward a way of exploring new functions: imitating a kind of biological relationship found in nature. I started to imagine possible biological relationships between data and people based on reality, such as parasitism, symbiosis and saprophytism.

The natural metaphor of online information

Then I started to look at social media information, which has the closest connection with humans. It helps us build our public identity, and it relies on our input and attention to stay alive and active. I was inspired by Social Network Zoology by Chia-Hsuan Chou[5], which compares people’s behavior on social media to animals’ social behavior while hunting or courting, in order to define one’s social media personality as a certain kind of animal. This work tells its story by creating vivid metaphors for human behavior on the internet, which showed me that metaphor is a concise storytelling method. In my case, however, the focus is not on people’s social relationships online but on the interaction between people and the media.

“Social Network Zoology” compares people’s behavior on social media

Based on this project, I decided to use visual metaphor to visualize people’s interaction with social media information. In order to create a narrative based on people’s relationships with social media, I observed how people usually react to social media information and how these activities impact the information itself.

Interaction

I am planning to make a plant-like product that represents social media, for example Facebook. The posts you make on Facebook would be printed on a leaf. The growing speed of the plant would depend on the interaction of people passing by. The usual actions we take on social media, like sharing, liking and commenting, would each correspond to tearing off part of a post. Without people’s attention or interaction, the social media plant would die, which shows the relationship between social media and people: the former relies on input and attention from people, and the latter need it to build their identity with the public. Also, bringing virtual actions into physical ones vividly exaggerates the bond between the two.

The goal and description of this project:

The instruction for the wireless assignment was to choose one of the wireless topics covered in class and put it into practice for everyday use. I chose IR remote control as the topic to explore, and used the remote control to control two LEDs.

Components list:

  • Arduino Uno x1
  • Breadboard x1
  • Photodiode x1
  • LED x2
  • 220 Ω resistor
  • Wires

https://github.com/joycemolly/ir-remote-control-/tree/master
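
The full code is in the repository above; as a minimal sketch of the same interaction (assuming the common IRremote library’s 2.x API, an IR receiver on pin 11, and placeholder button codes that would need to be replaced with whatever your remote actually sends), it might look like this:

#include <IRremote.h>

const int RECV_PIN = 11;
const int LED1_PIN = 2;
const int LED2_PIN = 3;

IRrecv irrecv(RECV_PIN);
decode_results results;

void setup() {
  irrecv.enableIRIn();            // start the IR receiver
  pinMode(LED1_PIN, OUTPUT);
  pinMode(LED2_PIN, OUTPUT);
}

void loop() {
  if (irrecv.decode(&results)) {
    if (results.value == 0xFFA25D) {                   // placeholder code: button 1
      digitalWrite(LED1_PIN, !digitalRead(LED1_PIN));  // toggle the first LED
    } else if (results.value == 0xFF629D) {            // placeholder code: button 2
      digitalWrite(LED2_PIN, !digitalRead(LED2_PIN));  // toggle the second LED
    }
    irrecv.resume();              // ready for the next code
  }
}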

TransSense: Environmental Interconnectedness

The title of this work is TransSense: Environmental Interconnectedness. The exploration behind this project started by looking into how technology can influence not only how we interact with fellow humans across the world, but also how we interact with our environment. So much of our lives is dictated by family, friends, acquaintances, our surroundings, etc. How can we extract environmental data to create a sense of calmness in our lives?

Presentation

Concept + Goals + Audience:

I started this phase of the project by asking myself:

  • How can information be translated in real time to the body?
  • How can it be sensed through physical output?

Much of how we communicate with the world is through cellular devices. When too much time is spent communicating through this form, it can create a disconnect between us and our physical surroundings. The aim of this project is to better integrate us with the world through a different form, one that doesn’t rely on screens.

So, I began to look at different forms of interaction as well as what information we can globally receive from spaces. I have been interested in wearables for some time, and one of the hardest things I come across when developing wearable devices or garments is justifying the item when the same information can be derived from a phone or other mobile device. However, I find that certain forms of sensation and particular types of information lend themselves to the platform: the sensation being haptics, and the information being weather.

In order to sense a vibration, one must be touching the device that generates the feedback. Because clothing touches the human body for over 90% of the day, it lends itself to an intuitive connection. Clothing is also an item that changes based on the weather: if it’s cold, one may wear a sweater; if it rains, a raincoat; if it’s hot, a short-sleeve shirt. With this project I want to push that function even further. Why not allow our clothing to “speak” to us about the weather? In this case, I wanted not only to inform about wind patterns, but to replicate them.

There is also a poetic side to this project. Replicating wind on the body using information from anywhere in the world may allow one to feel more connected with a particular place in one’s past. For instance, by wearing a garment that replicates the live wind patterns from my hometown, I may be able to establish a mental connection with that location and fill a distant void. Of course, this is more poetic in nature, as I mentioned before; if it is even possible, much more testing and conversation with psychological specialists would be needed.

To clearly state my goals: I want to “shrink” the world for those who are away from the places they care about (my audience) by replicating the live wind patterns of that location, in order to provide comfort and a better communicative relationship with our environment, which is becoming more cut off by the use of mobile technologies.

I chose this particular audience because I am someone who misses home from time to time, as many do, but can always make it back. I am developing this product as a means for people to cope with being away from a place they care deeply about.

Precedent:

One precedent that I researched while working on this project was Rachel Freire’s Embodisuit. The suit is a wearable mesh, worn as an undergarment, that parses information the user is interested in. It uses heating, cooling, and haptic feedback to alert the wearer to various forms of changing data, such as the weather and person-to-person communication.

Description of the product:

This product is a wearable haptic garment that allows the user to receive up-to-date weather conditions via the internet (IoT). The garment is embedded with 10 vibration motors, each connected to an Adafruit Feather M0 WiFi microcontroller. When connected to a nearby accessible WiFi network, the microcontroller pulls weather data by location from the OpenWeatherMap API. It then parses the information needed, in this case only the wind information, and maps it to the vibration motors. Two pieces of information are being mapped: the wind speed controls how much power is given to each motor, and the wind direction determines which motors are activated.
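
As a minimal sketch of the data-fetching step, not the project’s actual code (it assumes the WiFi101 library used by the Feather M0 WiFi, plus placeholder network credentials, city and API key):

#include <SPI.h>
#include <WiFi101.h>

const char ssid[] = "your-network";        // placeholder credentials
const char pass[] = "your-password";
const char host[] = "api.openweathermap.org";

WiFiClient client;

void setup() {
  Serial.begin(9600);
  WiFi.setPins(8, 7, 4, 2);                // Feather M0 WiFi (ATWINC1500) pin mapping
  while (WiFi.begin(ssid, pass) != WL_CONNECTED) {
    delay(2000);                           // keep retrying until connected
  }
  if (client.connect(host, 80)) {
    // request current weather for a placeholder city and API key
    client.println("GET /data/2.5/weather?q=New%20York&appid=YOUR_KEY HTTP/1.1");
    client.println("Host: api.openweathermap.org");
    client.println("Connection: close");
    client.println();
  }
}

void loop() {
  // stream the raw JSON response; the wind speed and direction
  // ("wind":{"speed":..,"deg":..}) would be parsed out of this
  // before being mapped to the vibration motors
  while (client.available()) {
    Serial.write(client.read());
  }
}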

Because this garment is meant to replicate the wind, it was important to create a pulse effect across the body, as if a gust of wind were passing over the wearer. This meant that varying intensities of power had to be given to each motor, and as stated previously, the direction gathered from the API dictates where on the body the gust begins. For instance, if the wearer is facing north and the API states the wind is coming from the north, the front-facing motors are given full strength, the motors on the sides are given slightly less power, and the motors on the back receive none. How much power each section gets is modulated by the wind speed taken from the API.
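
A minimal sketch of how that mapping might look, not the project’s actual code; it assumes four motor groups (front, right, back, left) on placeholder PWM pins, and wind values already parsed from the API response:

const int motorPins[4] = {3, 5, 6, 9};     // front, right, back, left (placeholder pins)

void applyWind(float windSpeedMps, int windDeg, int wearerHeadingDeg) {
  // wind direction relative to the direction the wearer is facing
  int relativeDeg = ((windDeg - wearerHeadingDeg) % 360 + 360) % 360;

  // overall intensity: map 0-20 m/s of wind onto 0-255 of PWM
  int intensity = constrain(map((int)(windSpeedMps * 10), 0, 200, 0, 255), 0, 255);

  for (int i = 0; i < 4; i++) {
    int motorDeg = i * 90;                 // 0 = front, 90 = right, 180 = back, 270 = left
    int diff = abs(relativeDeg - motorDeg);
    if (diff > 180) diff = 360 - diff;     // shortest angular distance

    int level;
    if (diff <= 45)       level = intensity;       // facing the wind: full strength
    else if (diff <= 135) level = intensity / 2;   // side motors: reduced strength
    else                  level = 0;               // downwind motors: off

    analogWrite(motorPins[i], level);
  }
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(motorPins[i], OUTPUT);
}

void loop() {
  // example: 6.7 m/s wind from the north, wearer facing north
  applyWind(6.7, 0, 0);
  delay(1000);
}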

In order to make this a dynamic experience, a magnetometer (digital compass) was initially included. The compass would keep track of the user’s rotation and remap the “pulse” to a different part of the body, determined by the direction from the data set. I will speak to that further down.

Video Documentation:

 

Materials List:

Process + Prototypes:

This process started with coding. Before the making process began, I wanted to make sure I could do it code-wise with the electronics I had available. I started by coding the NodeMCU. I found this WiFi development board quick for connecting to the internet and grabbing information through the API; however, it was tedious when it came to programming the pins. The first thing I noticed was that there were no hardware PWM pins, and to create the breeze effect I felt it was vital to use PWM. In addition, the number of pins available as outputs to power all the motors was limited due to pin multiplexing. When WiFi mode was enabled in the code, several of the pins stopped outputting, and if I tried to connect one pin to another, the board would reset.

At that point I decided to use the Adafruit Feather, which was capable of PWM and WiFi connectivity without the multiplexing issue. Once I was able to map the API data to the LEDs I was using for testing, I added a digital compass to keep track of which direction the user was facing, to create a more dynamic experience of feeling the motors react to your rotation. However, the compass was prone to many disruptions, electrical interference in particular. When I placed it near my computer, the readings were quite sporadic and didn’t map well to 360-degree movement: readings could jump from 0 degrees to 230 degrees with no values in between. I believe this also has to do with the human body moving faster than the microcontroller can process. In the end, I decided it would be best to focus my efforts elsewhere and save the compass for a future iteration.

I then began crafting the circuit and determining the best placement on the body. I tested various fabrics and conductive materials. The Eeontex stretch fabric was not conductive enough, so I switched to a conductive fabric tape, connected the vibration motors and sewable LEDs (mostly for visual demonstration), and protected the circuit from shorts by placing a layer of insulating material on top.

Circuit Diagram:

Poetry, polity and power Instructable

Poetry, polity and power is the culmination of two disparate ideas that have intrigued me this past semester. The first is a new kind of interaction that is emerging between human beings and machines as technology gets integrated into everyday processes. The common perception of technology or computerized systems as objective is a myth: they embody the values and perspectives of the people who design them. The second is the power of art and poetry, and how they can be used as dynamic tools for resistance.

Poetry, polity and power is an optimistic poetry generator that can be fed biased text (hate speech, discriminatory policies, misogynistic statements) and removes words to create poetry that is hopeful and empowering. I wanted to create a computerized system that would automatically generate poetry from the source text, without human intervention. I see this project as a conceptual prototype that captures the essence and the value inherent in the idea, but needs further iterations to be fully realized.

In its current form, the generator would be more effective if it could respond to different source texts, by activating different heating pads depending on which text was fed in. Future iterations include programming a system that can operate on its own; a possible way to do this would be to train machine learning algorithms on many such blackout poetry examples.
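
As a minimal sketch of that heating-pad idea, not the project’s actual circuit or code: an Arduino selecting one of several heating pads (each switched by an N-channel MOSFET on an assumed pin) according to which source text is chosen; the pins and the dwell time for the thermochromic ink are assumptions.

const int padPins[3] = {9, 10, 11};   // one MOSFET gate per heating pad / source text

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 3; i++) {
    pinMode(padPins[i], OUTPUT);
    digitalWrite(padPins[i], LOW);    // all pads off to start
  }
}

void loop() {
  // choose a source text by sending '0', '1' or '2' over the serial monitor
  if (Serial.available()) {
    int choice = Serial.read() - '0';
    if (choice >= 0 && choice < 3) {
      digitalWrite(padPins[choice], HIGH);   // heat the pad for that text
      delay(30000);                          // hold long enough for the ink to clear
      digitalWrite(padPins[choice], LOW);    // let the ink fade back
    }
  }
}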

The challenges for this project were mainly working with an unfamiliar material that was inconsistent and would react differently on different days. It taught me the importance of experimentation. Powering the circuit using the wall wart was also challenging, mainly because I found very limited documentation on it.

I loved working on this project though because I realized how simple, basic materials, mechanisms and methods can be used to convey ideas.

Here is the pdf with the final slides.

Here is the link to my final Instructable. 

https://www.instructables.com/id/An-Optimistic-Poetry-Generator-Using-Thermochromic/

Fireflies Lamp – Objects and Memories – Final Project (Dario N)

Presentation Link:

https://www.dropbox.com/s/t2ac2prwip2iozi/Objects_%26_Memories_Final_Presentation_051018.pptx?dl=0

Concept and Goals:

Objects and Memories seeks to analyze and project the powerful relations between objects, humans and emotions, and how they connect to evoke memories, nostalgia and rituals, as influential axes for new experiences and associations. Design theorist Donald Norman highlights the importance of the “history of interaction”, the associations and values that people give to objects, and the memories they evoke, over appearance, aesthetics and utility/functionality, through the concept of “emotion rather than reason” (Norman 2005).

The theoretical framework is supported by the development of a lamp inspired by the magical tradition of catching fireflies in a jar, a playful and gestural ritual that allows users to ‘naturally’ control, illuminate, dim and turn off the light (see image below). This object/experience[1] is also meant to challenge the bias of treating objects as purely sculptural artifacts, so they become elements that fully engage the user, shifting the user’s role from static, contemplative “observer” to “active user” and experience.

[1] Object: Related to certain attributes such as materiality, physicality, form, functionality. Experience: Related to the attributes that are triggered by the human action.

Behavioral process of catching fireflies with the lamp

A really valid question was promptly asked at the beginning of the project by my fellow classmates and professors: What is this? Why are you doing this? An important decision taken in the initial phases was to identify the platform and the context in which the idea was to be located. When talking about “flying an LED” (image), the technical challenge set as my goal, it is easy to imagine the response framed as a sort of installation in a museum, or in another similar context. This is a very interesting path with a lot of potential, no doubt, because these contexts allow the spatial exploration that can configure a completely immersive experience; I totally agree. One of my goals as an industrial designer, however, is to change the established paradigms and premises of a purely sculptural profession focused on aesthetic decisions, and in addition to create an object that can be accessible and affordable, potentially by millions of people. As I said in the introduction of this document, the ground of this project can be extrapolated to other contexts and audiences: as a unique experience in a museum, as a tool for learning and nature consciousness, as a visualization of a dystopian future where there is limited access to nature and memories. The spectrum of execution and context manifestation is, at the moment of presenting this document, an item open for re-interpretation.

From these principles, the project started with an analysis of the relationship between a potential audience (mostly children from 9 years old up to adults), artifact and experience. The balance of these three elements results in a well-designed object, where the user (observer or operator) represents the axis of the experience and is the one with the right answers and insights to justify the decisions on the artifact (Human-Centered Design).

Precedents:

Multiple mood-boards gave a first formal approach to what was intended to be shaped as the final product. The original mason jars and old oil lamps were taken as inspiration. This was carried out along with an analysis of elements that intuitively transmit the action of “catching”, such as meshes, baseball catcher’s gloves and nets; these correspond to relevant archetypes for analyzing actions translated into forms (Figure 11). Other interesting references were the Infinity Mirror Rooms by artist Yayoi Kusama, a fascinating way of using mirrors and light to create an immersive experience of endless worlds, and the product “Dreamlights” by Fred and Friends, which showcases a similar experience of using light and movement (like a flying LED) and a clever way of hiding the mechanisms and LEDs with frosted surfaces.

Mood-boards. Formal Inspiration

Infinity Mirror Rooms by artist Yayoi Kusama and “Dreamlights” by Fred and Friends

Description of the Project (Process and Interaction):

‘Objects and Memories’ seeks to go beyond the ‘completed’. I’m not presenting a finalized lamp, not even a completed conceptual body; my intention is to keep the boundaries open for future iterations and explorations. This project was meant to be inconclusive: an exciting segue to future possibilities.

The design process model followed over the course of the project covered 5 phases, developed around 3 main axes: 1. the achievement of a design concept that supported the experience and the artifacts; 2. a technical exploration based on the premise of how to make an LED fly; and 3. a detailed development of the artifact, which required formal and material exploration and the realization of the 3D parts that assemble into the object. Each of these phases had several technical, conceptual and human-experience challenges. The intention of the project goes beyond an academic exercise; it aims to explore new interventions and experiences, as well as engaging in exciting technical experiments.

It is important to clarify that user testing was a recurrent process throughout this development. The 3 axes named in the previous paragraph were developed concurrently, due to time constraints. Likewise, each phase fed and responded to the others simultaneously, so that progress was made along all of the axes.

Process + Prototypes:

Initially, in the research phase, the boundaries related to the experience of catching fireflies were analyzed by describing specific objectives, actions and consequences. The objectives in this phase arose from observing children and adults catching fireflies in a ‘playful’ context, and the different ways in which they would catch them. The other important objective was to understand how users would interact with an artifact that has no instructions. In summary, in the process of catching fireflies we can identify 3 different paths. It is important to note that this experience is also cultural and depends on many other factors that go beyond the act itself: in other cultures, the archetypes used to catch fireflies range from nets to baskets. All of these icons, part of a vast objectual domain, have repercussions on the effectiveness of the memories, which are highly visual. For this particular creative exercise, I centered the analysis and results on the Western way of catching fireflies.

Behavioral User Tests

In this phase, 2D sketches were made and different formal languages were explored in response to the references from the research phase (mood-boards + inspiration + archetypes). The final result is very similar to the mason jar, since this form invites the experience of catching and containing fireflies, has a base, and has neutral colors that do not compete with the light the insects generate; additionally, the lid is simple to use and communicates directly with the product’s operating system. In the same way, 3D models were created, which aimed to test scale and to support technical and functional exploration.

2D Sketches. Formal Exploration

The intention of the first prototype was to quickly visualize the idea and the concept, and to test the interest and reaction of the audience to the overall experience. The prototype is screen-based, made from a series of 120 “modified” images in which it is possible to see the intended behavior and response of the hardware to the user’s inputs.

First Prototype. Look and Feel. GIF

In this phase it was necessary to approach the project from a technical exploration, considering different ways to resemble the light emitted by a firefly in a mason jar.

Possible Technical Approaches

The first technology explored was a matrix of LEDs, which consists of making a cloud of LEDs (soldered one by one) until the desired effect is achieved. This matrix was discarded since it requires a lot of space for wiring and hardware (hard-points), and it is complex to build and test inside the desired final object.

Fiber optics was another technology tested to create the effect of fireflies flying in a jar. This was not a good direction since the intensity of the light was not enough, and it also required a complex matrix of LEDs at the base of the object. Lasers, projectors and a mechanical system were other alternatives that were evaluated, but in the end the LED strip achieved the desired effect with variations in the speed and tonality of the light. This option has an important challenge: the programmable LEDs drain a lot of current, so a very large battery pack was required, and it has to be assembled inside the artifact without breaking its form or compromising its operation. Something important to consider is that I wanted to avoid the use of external wires, since this could negatively affect the nature and freedom of the traditional activity.
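
As a rough estimate of why the battery pack ends up so large (assuming WS2812-style programmable LEDs, which can each draw up to roughly 60 mA at full white): max current ≈ number of pixels × 60 mA, so even a strip of 50 pixels can peak near 50 × 60 mA = 3 A at 5 V, far beyond what an Arduino’s onboard 5 V regulator can deliver.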

Parallel to this process, 3D development plays a very important role in achieving tight scales, tolerances and dimensions that are close to reality, and in accounting for the hard-points imposed by the selected mechanisms.

3D Development. Working to package all the components

I also tested different effects corresponding to the different behavioral actions from the experience. Each effect entailed a different level of complexity: one effect with one strip, then one effect with two strips, and finally multiple effects with multiple strips (Figure 17). The alignment between the different sensors and the lights, and the response and feedback, were also an important technical challenge. Problems with accelerometer reliability, problems with the light sensor, RAM issues due to controlling multiple effects on numerous LEDs, and canvas and brush allocation problems (RAM) in the LED library were some of the many issues and challenges faced during the development process.

 

 

Code Logic Illustration
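
As a minimal sketch of the kind of logic illustrated above, not the project’s actual code: a reed switch in the lid gates a “firefly” twinkle effect on a NeoPixel strip. It assumes the Adafruit NeoPixel library, a reed switch on pin 2 and a strip on pin 6, and it leaves out the accelerometer and light-sensor behavior.

#include <Adafruit_NeoPixel.h>

const int STRIP_PIN  = 6;
const int REED_PIN   = 2;      // reed switch closed by the magnet in the lid
const int NUM_PIXELS = 60;

Adafruit_NeoPixel strip(NUM_PIXELS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  pinMode(REED_PIN, INPUT_PULLUP);   // switch pulls the pin LOW when the lid is closed
  strip.begin();
  strip.show();                      // start with all pixels off
}

void loop() {
  if (digitalRead(REED_PIN) == LOW) {          // lid closed: fireflies are "caught"
    int p = random(NUM_PIXELS);                // pick a random pixel to glow
    for (int b = 0; b <= 120; b += 8) {        // fade up in a warm yellow-green
      strip.setPixelColor(p, strip.Color(b, b, 0));
      strip.show();
      delay(15);
    }
    for (int b = 120; b >= 0; b -= 8) {        // fade back down
      strip.setPixelColor(p, strip.Color(b, b, 0));
      strip.show();
      delay(15);
    }
  } else {
    strip.clear();                             // lid open: no fireflies
    strip.show();
    delay(100);
  }
}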

Construction Process:

I built most of the 3D components in 3D software, and they were then manufactured by CNC.

I used 2 clear acrylic tubes, one for the outside and one for the inside, where I wrapped the LED strips. I sandblasted the outside tube to hide the internal components.

I shaped the lid from a wooden block on the lathe. The lid has magnets to activate the reed switch and to close the cap tightly against the body.

Materials:

For the Electronics:

  • x1 Arduino Uno
  • x1 Breadboard
  • x1 ADXL345 Accelerometer
  • x1 Reed switch (magnetic switch)
  • x3 NeoPixel RGB strips
  • x3 330 Ω resistors for the strips’ data lines
  • x1 1000 µF, 6.3 V or higher capacitor
  • x1 Battery pack, 5 V 2 A
  • Jumper wires

For the Jar:

  • Wood block of 2 in x 6 in x 6 in for the lid
  • 2 clear cast acrylic tubes, one 6″ OD and one 4″ OD, for the exterior and to wrap the LED strips
  • ABS: most of the 3D components were sent out to be CNC machined

Circuit Diagram:

 

Final Project (Week 12 and 13)_Alyssa

Final:   Time’s Up 

Presentation: https://docs.google.com/presentation/d/1O3Tbw3HGmVE0klQ0oouqTND-hTB1slgr_E4iCs59kCk/edit?usp=sharing

Concept + Goals: With this project, I sought to create an interactive exhibit that complements the ‘Me Too’ and ‘Time’s Up’ movements. These were my guiding design questions:


Twinkle Stare – Final Presentation Documentation

  • Your presentation
    Online Link:
    https://docs.google.com/presentation/d/1hC8NEC84CYO7_m8aihOJW1nSprMXHYKTWJMagTQmW_o/edit?usp=sharing or
    PDF link:
    Pcomp Twinkle Stare_2
  • Concept + Goals.
    My concept was to create an IoT device that reconnects me in New York with my dog in Taiwan, so I can feel her presence and recreate a moment we share together. My goal was to be able to feel my dog’s presence over a long distance and to spend more time with her somehow. This project is very close to me; I decided to make a long-distance device because my dog back in Taiwan is sick with cancer, so I wanted to build something that would actually work for us.
  • Intended audience.
    This project is mainly for myself, so I can spend more time with my dog and feel her presence over a long distance. However, I feel this device could also be for other dog owners who are in a long-distance relationship with their dog and would like to feel more of their dog’s presence.
  • Precedents.
    Pillow Talk, by Little Riot, is a device that lets you hear the real-time heartbeat of your loved one over a long distance; it really inspired the sense of presence in my device. SoftBank in Japan also created a series of devices called Personal Innovation Act, Analog Innovation, which helps connect the older generation to the younger generation by translating the new technology we use into older forms, such as printing your social media updates as a newspaper in the mailbox every morning so your grandma can read updates about you.
  • Description of the project.
    After confirming my concept, I began to think about the technical aspects of my project. My device has two parts with different interactions. One part is my dog’s end: her doll, embedded with a pressure sensor she lies on, and a speaker. The other part is the model of my dog in my room in New York, which has a face-tracking camera, a button, and a pressure sensor built into it. As I searched online for the details of how to make these interactions work, I was also introduced to two tools that could help with the long-distance IoT connection. One is the MESH sensors, a set of 7 block sensors, each with a built-in function such as tilt, LED, button or motion, that make prototyping and building Internet of Things projects easy. The other is IFTTT, a free web-based service for creating chains of simple conditional statements, called applets, that connect different applications and services over the internet. Both tools are extremely helpful for my IoT device; however, I wanted to make all my interactions work properly offline first.

    First of all, I decided to figure out how to get face detection to work with the camera. I used an Arduino to control a servo motor and connected it to OpenCV face detection in a program called Processing, to track a face as it moves from left to right on the screen. Thankfully, after a few days of studying open-source code examples online, I got face tracking to work with no problem. Next was to connect a pressure sensor as a trigger that starts the face-tracking camera when it is pressed. This is where I was stumped and frustrated, because I could not get the code to work on my own. With some help from my peers, I was able to get one pressure sensor to open the camera and start face tracking when pressed, and another pressure sensor to turn it off.

    However, there was another problem with my code: it could only run once. If I pressed the pressure sensor to turn it on again, the sensor could not read my pressure values anymore, and I would have to rerun both the Arduino and Processing sketches for it to work again. After some debugging, I discovered that part of the Arduino code made the instructions appear to get stuck in a loop. After modifying my code it worked, though it was at times unstable (a minimal Arduino-side sketch of this trigger logic appears after this list). With the time I had, I decided not to make the IoT connection happen but to focus on the interactions and the physical doll of my device. My whole framework for making this IoT device work is displayed in the diagram below; what I’m focusing on is the interactions on the left. Ideally, I would incorporate this with the MESH sensors and use the IFTTT webhooks service to communicate with my MESH sketch.
  • Video documentation


  • Materials list

    For the face tracking part: 

    Software Required
    Firmware Required
    Hardware Required

    For the physical making of the doll –

    • Fluffy Socks – http://amzn.to/1FwCmTk, http://amzn.to/1NWUkhD
    • Polyester stuffing – http://amzn.to/1Ke62Wy
    • Sewing Kit – Amazon
  • Process + Prototypes.
      After making the technical part of my device work, I quickly moved on to the physical enclosure, which consists of two parts: the doll that my dog lies on, and the model doll that sits on my desk. I completely underestimated how hard it is to put anything physical together, even something cute and furry like a doll. Somehow the thought of making a doll seemed simple to me; not until I had made a few attempts did I realize how much practice I needed. I thought of taking apart an actual doll, but I also wanted my doll to be customizable, so I decided to make one myself. To start, I used fluffy socks as my main material and stuffed them with polyester fiberfill. I made a koala doll as the doll that my dog lies on; this was easier to make since it is a small doll with no body.

      Then I moved on to making the model doll of my dog. With my limited experience, it was hard for me to make a model that looks exactly like my dog. One of my first versions was unable to stand up properly, so I made a stand fixed onto an acrylic board and put it inside the doll so the model could stand by itself. I then glued the servo onto the board, covered it with polyester fiberfill, and pulled the sock fabric over it to form the model’s head. After getting the body to sit properly on a flat surface, I moved on to the head. I put a web camera inside and cut a small hole for it to peek out through the sock fabric.

      The web camera, however, did not work well hidden inside. One problem is that it is very sensitive to lighting, distance, and the height at which you stand. When testing with the web camera inside the head of the doll model, the camera had a hard time detecting faces and would jump to different shadows on the screen, causing spasms of quick movements. Another factor contributing to the unstable image was the fur from the sock material, with a few fibers sticking out along the edge of the hole I cut and disrupting the clarity of the image. Therefore, I made the hard decision to connect my device to my computer’s camera to ensure the most stable and accurate face tracking.

      In the end, I was able to put together a functional model doll of my dog. The face-tracking camera is triggered by the other doll when you press on it (when my dog lies on it), and you can turn it off by pressing on the model’s head. You can also press a button (one of the MESH button sensors) to turn on music that plays through the speaker inside the doll.

      Prototypes:
           
      Playtesting:

      Challenges:

      • The pressure sensor did not work as well as I thought
      • Complexity of the code
      • The webcam did not work well behind the fabric, plus distance issues
      • Time management
      • Aesthetics: the doll has many wires sticking out

      Future Iterations:
      • Refinement on the design/look of the doll with no wires with webcam built inside the doll
      • Making it work wirelessly over a local distance with Bluetooth
  • Circuit diagram.
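
A minimal Arduino-side sketch of the pressure-sensor trigger logic described in the process notes, not the actual project code; it assumes an FSR on A0 to start tracking, another on A1 to stop it, and a servo on pin 9 receiving face positions from the Processing sketch over serial. The edge detection is what keeps the trigger reusable instead of running only once:

#include <Servo.h>

Servo neckServo;
const int startFsrPin = A0;
const int stopFsrPin  = A1;
const int threshold   = 300;     // assumed pressure threshold

bool tracking = false;
bool startWasPressed = false;
bool stopWasPressed  = false;

void setup() {
  Serial.begin(9600);
  neckServo.attach(9);
}

void loop() {
  bool startPressed = analogRead(startFsrPin) > threshold;
  bool stopPressed  = analogRead(stopFsrPin)  > threshold;

  // edge detection: react only on the transition from "not pressed" to "pressed",
  // so the trigger can fire again and again instead of running only once
  if (startPressed && !startWasPressed && !tracking) {
    tracking = true;
    Serial.println("START");     // tell the Processing sketch to begin face tracking
  }
  if (stopPressed && !stopWasPressed && tracking) {
    tracking = false;
    Serial.println("STOP");
  }
  startWasPressed = startPressed;
  stopWasPressed  = stopPressed;

  // while tracking, Processing sends back a servo angle (0-180) as a single byte
  if (tracking && Serial.available()) {
    int angle = Serial.read();
    neckServo.write(constrain(angle, 0, 180));
  }

  delay(20);
}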


Final Documentation

Title of this project

Storytelling Soundsystem of “Silence Breaker : One Survivor’s Story”

Presentation

https://docs.google.com/presentation/d/e/2PACX-1vS1Z4WoHKKeZttYAWRaZlF0HmT32XRFfxQtOlJhQp30_M3232goVM0RB9MDam6KEYw5Mc1xT9OmMIeu/pub?start=false&loop=false&delayms=3000

Concept

In the midst of the current #MeToo movement, I’d like to focus particularly on domestic abuse for this design project. We are facing significant moments for women’s rights against sexual abuse, and there is a need to bring about more positive effects for society than mere attention. I would like to address domestic abuse against women and children in my design project because sexual abuse in the domestic sphere may be its most hidden and overlooked part.

One of the common myths associated with DA (domestic abuse) is that victims of DA are helpless, passive and fragile. However, survivors are often strong and use a number of coping strategies to manage their situation. But society’s traditional responses to survivors, such as victim blaming, stigmatization and pathologizing, create obstacles that keep them from speaking out. Survivors also face risks such as retaliation from perpetrators and losing their life foundations. Through this design project, I’d like to call on society to change, to the point of taking action.


Servo Motor – Xu (Week 8)

 

I want to use a potentiometer to control a servo motor: rotating the potentiometer anticlockwise moves the motor.

Core components

Potentiometer
Servo Motor
Wires & jumper wires
Arduino board
Breadboard

Circuit

How it works
Rotating the potentiometer anticlockwise moves the motor; when the potentiometer is turned fully clockwise, the motor stops.

Code

#include <Servo.h>

Servo myservo;                 // servo object to control the servo motor

int potpin = A0;               // analog pin connected to the potentiometer wiper
int val;                       // variable to read the value from the analog pin

void setup() {
  myservo.attach(9);           // attach the servo signal wire to pin 9
}

void loop() {
  val = analogRead(potpin);            // read the potentiometer (0-1023)
  val = map(val, 0, 1023, 0, 180);     // scale it to the servo's range (0-180 degrees)
  myservo.write(val);                  // move the servo to the mapped position
  delay(15);                           // give the servo time to reach the position
}