Enertation: The Energy Behind Meditation

My project will dive into the actual energy created during meditation and how that impacts the individual and the world around them.
When you think of energy, it usually conjures up images of electricity racing through wires lighting up circuits, an athlete performing rigorous exercises, or your favorite artist rocking at a concert. Meditation is not usually what comes to mind, and beyond that the measurement of these “electrical fields” unfortunately puts meditation into the box of pseudoscience that people label as “hokey” and “woo-woo”. But what if there was a way to accurately study the energy levels created by an individual during meditation? How does it change their minds and bodies? How does it affect the people and things around them?
A thermographic camera forms an image using infrared radiation, much as a common camera forms an image using visible light. Instead of the 400–700 nanometre range of the visible-light camera, infrared cameras operate at wavelengths as long as 14,000 nm (14 µm). Their use is called thermography. These devices have been used to visualize a person’s aura.
One such device uses the Gas Discharge Visualization (GDV) technique: computer registration and analysis of the gas discharge glow (GDV-images) of any biological object placed in a high-intensity electromagnetic field.
The GDV method is based on the stimulation of photon and electron emissions from the surface of the object whilst transmitting short electrical pulses. In other words, when the object is placed in an electromagnetic field, it is primarily electrons, and to a certain degree photons, which are ‘extracted’ from the surface of the object. This process is called ‘photo-electron emissions’ and it has been quite well studied with physical electronic methods. The emitted particles accelerate in the electromagnetic field, generating electronic avalanches on the surface of the dielectric (glass). This process is called ‘sliding gas discharge’. The discharge causes glow due to the excitation of molecules in the surrounding gas, and this glow is what is being measured by the GDV method. Therefore, voltage pulses stimulate optoelectronic emission whilst intensifying this emission in the gas discharge, owing to the electric field created.
Many of these modalities challenge the dominant biomedical paradigm because they cannot be explained by the usual biochemical mechanisms. One possible influence of biofield phenomena is that they may act directly on molecular structures, changing the conformation of molecules in functionally significant ways. Another influence is that they may transfer bioinformation carried by very small energy signals interacting directly with the energy fields of life, which is more recently known as the biofield (Rubik et al., 1994).
Moreover, other mysteries in biology and medicine exist that appear to involve interacting energetic fields, including the mystery of regenerative healing in animals, sometimes associated with innate electromagnetic energy fields that have been measured (Becker, 1960, 1961) and sometimes actually stimulated with external low-level energy fields (Becker, 1972; Smith, 1967). Another mystery is that living organisms respond to extremely low-level nonionizing electromagnetic fields, displaying a variety of effects ranging from cellular and subcellular scales to the level of brain, emotions, and behavior. These fields may be beneficial (therapeutic), deleterious (electromagnetic pollution), or neutral. Then, the mystery of embryonic development from the fertilized egg to an organized integral animal should be considered, which may also involve innate energy fields, starting with the initial polarization of the fertilized egg.

– Lyn Freeman 
http://www.faim.org/measurement-of-the-human-biofield-and-other-energetic-instruments




Solar Powered Self Watering Plant

For our solar project, Dimos and I wanted to create a ‘self-watering plant’ system after being inspired by a few similar projects we found online. We wanted to take these ideas a step further: we wanted our system to be completely solar powered, making it an eco-friendly fixture in the domestic plant ecosystem.

 

Materials Needed

  • An enclosure (1)
  • PC Board (1)
  • 5VDC SPDT micro relay (1) **
  • Solar Panel (1)
  • Lithium-Ion Battery (1)
  • Toggle switch (1)
  • 10K resistor (1)
  • Size M coaxial DC power plug
  • Red and black 22AWG wire
  • 12AWG black wire
  • Electric water pump (1)
  • Water storage container w/ lid (1)
  • 8-32 x 2.5″ nuts and bolts (2)
  • 4-40 x 1″ nuts and bolts (8)
  • 4-40 x 3/8″ nut and bolt (1)
  • 1/4″ spacers (4)
  • Wire nut (1)
  • 3′ – 5′ plastic tubing (2)
  • #8 Terminal Ring (1)
  • House plant to water (1)

 

Making Our Own Water Pump

We wanted to be as DIY as possible with this project, so we decided to create our own water pump. This did not turn out as well as we had hoped.

This did not work because the enclosure did not provide enough suction to draw water in from our reservoir. We then bought a small water pump from Tinkersphere.

 

Battery

We used a 3.7V 650mAh battery, which provided enough power to run the water pump long enough for the water to be drawn into the plant’s soil.

Sensors


 

 

Code

This is the code we used for the Arduino. It was inspired by Randolfo’s version of the code.

 

// Analog input pin that the soil moisture sensor is attached to
const int analogInPin = A1;

// value read from the soil moisture sensor
int sensorValue = 0;

// if the readings from the soil sensor drop below this number, then turn on the pump
int dryValue = 700;

void setup() {
  // pin 12 drives the relay that switches the water pump
  pinMode(12, OUTPUT);

  // initialize serial communications at 9600 bps:
  Serial.begin(9600);
}

void loop() {
  // read the analog in value:
  sensorValue = analogRead(analogInPin);

  // Turns on the water pump if the soil is too dry
  // Increasing the delay will increase the amount of water pumped
  if (sensorValue < dryValue) {
    digitalWrite(12, HIGH);
    delay(10000);
    digitalWrite(12, LOW);
  }

  // print the sensor to the serial monitor:
  Serial.print("sensor = ");
  Serial.println(sensorValue);

  // slow your roll – I mean… slow down the code a little
  delay(100);
}
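The threshold logic in the loop can also be sketched in plain Python, which is handy for sanity-checking the `dryValue` cutoff before flashing the board. The function name and the sample readings below are our own invention, not part of the Arduino sketch.

```python
# Plain-Python sketch of the sketch's threshold logic. DRY_VALUE mirrors
# dryValue in the Arduino code; the readings list is made up for testing.
DRY_VALUE = 700

def should_pump(sensor_value, dry_value=DRY_VALUE):
    """Return True when the soil reading indicates the plant needs water."""
    return sensor_value < dry_value

readings = [812, 745, 699, 520, 701]
decisions = [should_pump(r) for r in readings]
print(decisions)  # pump only for the two readings below 700
```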

 

 

 

 

Draw Your Face in a Movie Poster Robot

Our group has had an incredible journey piecing our robot together over the last 6 weeks. After building the CNC machine from scratch, we had to develop an idea that changed the nature of the CNC and used it in a completely new way from what its creators intended.

Concept: Create a robot that takes a picture of the user and then draws that user’s face as a character on a movie poster.

 

Our team had various design challenges we had to overcome. Some of the considerations we had to think about included:

  • Designing a method by which to hold a writing utensil. This is important, as we needed the unit to flex just enough to give the robotic gantry a natural artistic stroke.
  • Designing the CNC bed to provide ample support, pressure and texture to the pen from above.
  • Determining the proper writing utensil
  • Converting an image from its original format to SVG, making it black and white, then editing it so that the image is composed only of outlines.
  • Making sure the camera recognizes the user’s face
  • Determining where to place the cropped and converted image of a user’s face into the appropriate area on the movie poster
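The black-and-white conversion in the list above boils down to thresholding each pixel. Here is a minimal sketch on a made-up grid of grayscale values; our real pipeline would use an image library such as Pillow and then trace the result to SVG with a tool like potrace, so the grid and threshold below are purely illustrative.

```python
# Minimal thresholding sketch: the black-and-white step of the image
# pipeline, applied to a made-up 2D grid of 0-255 grayscale values.
THRESHOLD = 128

def to_black_and_white(gray):
    """Map each grayscale pixel to 0 (black) or 255 (white)."""
    return [[255 if px >= THRESHOLD else 0 for px in row] for row in gray]

sample = [[30, 200], [127, 128]]
print(to_black_and_white(sample))  # [[0, 255], [0, 255]]
```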

 

Method to Hold the Pen

We experimented with various types of ways to hold the pen using the original router holder. We ended up designing our own plate and 3D printing our pen holders for maximum flexibility and precision.

 

 

Designing the CNC Bed

We wanted to provide the proper surface for the pen to write on. We created a bed out of acrylic to make sure that the holes were covered. We then placed a thin sheet of foam and thicker poster paper on top to provide a firm, cushioned, smooth surface for the paper, so the pen would glide easily while drawing.

 

Deciding the Proper Writing Utensil

We tested various types of pens and markers. The way our machine is set up, marker lines came out far too thick, so we opted for a ball-point pen, which glides easily along the paper.

 

 

Computer Vision, Image Conversion and Process Flow

Our process flow is the following:

1) Select a movie poster and title it ‘poster.jpg’

2) Take an image of the user

3) A Python script detects the faces on the movie poster

4) Select the face you want to swap with and swap the faces

5) Convert the file to black and white and then to SVG
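The flow above can be laid out as a skeleton pipeline. Every function name below is our own invention and every body is a stub standing in for the real step (face detection, face swapping, and the black-and-white/SVG conversion), so this only shows how the stages chain together.

```python
# Skeleton of the process flow. Each stub stands in for a real step:
# face detection on the poster, pasting in the user's face, and the
# black-and-white + SVG tracing pass.

def detect_faces(poster_path):
    # Stub: would run face detection on the poster image.
    return ["face_0", "face_1"]

def swap_face(poster_path, user_path, target_face):
    # Stub: would paste the user's face over the chosen poster face.
    return f"swapped:{target_face}"

def to_svg_outline(image):
    # Stub: would threshold to black and white and trace to SVG.
    return image + ".svg"

faces = detect_faces("poster.jpg")
result = to_svg_outline(swap_face("poster.jpg", "user.jpg", faces[0]))
print(result)  # swapped:face_0.svg
```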

 

 

Design Expo: Week 3

A first pass at our vision, mission, and product statements:

Vision

To empower people to live healthier lives.

Mission

We physicalize accurate, real-time data to create a more engaging fitness experience.

Product

We are creating an MR experience that adds an informative, social, and interactive layer to running.

What does that mean?
We physically contextualize data and training goals to make training more personal than racing a clock.

———–

Research

Google form: LINK

Channels: Facebook friends, random runners, Running forums

We are following the customer development mindset in actively reaching out to those one degree away from our network to really get a sense of the problem with running tech today. By actively trying to seek out the problem, we can accurately plan and design our AR solution to meet the needs of our potential users.

We have already had conversations with active runners and plan to continue our efforts here.

———–

Project Plan

Our project plan can be found here: https://docs.google.com/spreadsheets/d/1Eotplzvp4JShx_T5ZoxSc9RbJ7bnM4kGGKWRImXxccA/edit

———–

Next Steps

1) Consistently perform customer development on a weekly basis

2) Develop user stories guided by our customer development efforts 

3) Organize features and develop low-fidelity wireframes to then test on users. We have an idea of how we might be able to accurately and efficiently prototype and test on users

4) Develop a story board for our commercial complete with requirements needed for the shoot 

Light With Capacitive Touch

I decided to work on revising the light from earlier this semester. I wanted to incorporate what I learned from writing about the Captouch sensor into this project.

I first began wiring the Cap Sense sensor by following Adafruit’s examples. This provided a good framework to then expand upon.

I really struggled with adapting the code to reading a byte of information, which Tom Igoe helped me think through. Even though I had it coded, it would not print the sensor values in the Serial Monitor. 
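The “byte of information” from the CAP1188 is a touch-status register where each of the 8 bits maps to one capacitive pad, so decoding it is plain bit-masking. A sketch of that decoding in Python, with a made-up status value (on the Arduino the byte would come from the library’s `touched()` call):

```python
# Bit-masking sketch for the CAP1188's touch-status byte: each of the
# 8 bits corresponds to one capacitive pad. The status value here is
# made up; on the Arduino it would come from the library's touched().
def touched_pads(status_byte):
    """Return the list of pad numbers (0-7) whose bit is set."""
    return [i for i in range(8) if status_byte & (1 << i)]

status = 0b00100101  # made-up reading: pads 0, 2 and 5 touched
print(touched_pads(status))  # [0, 2, 5]
```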

I originally wanted to use NeoPixels, but there seemed to be some conflict between that library and the CAP1188 library. After discussing with a few of my peers, they reported similar experiences dealing with both libraries, most notably with the Rotary Encoder.

XYZ Progress Update

Our team had three goals this week:

1) Refine and stabilize the pen plotter so that the pen is more secure

2) Draw a complex image and figure out file conversions from PNG to SVG

3) Experiment with various pens/markers
One thing we were impressed by was the level of precision of the pen. As you can see below, the lines are incredibly close together. We chose the dog image for its relative simplicity, and then the image of the girl to see if we could draw something that had a few layers in the outline. The girl image took roughly 15–20 minutes to complete, so we can expect the movie poster to take at least that long.

We need to figure out whether we can use the G-code to speed up the drawing process.
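One G-code-side lever is the feed rate: motion lines carry an `F` word, and scaling it up makes the plotter move faster. A hedged sketch of that rewrite, assuming plain-text G-code; the sample line is made up, and in practice the machine’s maximum feed rate caps how far this can go.

```python
import re

# Sketch: scale the feed rate (the "F" word) on G-code motion lines to
# speed up drawing. Assumes plain-text G-code; the sample line is made up.
def scale_feed(gcode_line, factor):
    """Multiply the F value on a line by factor, if one is present."""
    return re.sub(r"F(\d+(?:\.\d+)?)",
                  lambda m: "F%g" % (float(m.group(1)) * factor),
                  gcode_line)

print(scale_feed("G1 X10 Y20 F600", 1.5))  # G1 X10 Y20 F900
```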
