
Project: Greenhouse eXactum

Greenhouse-eXactum concept / by Timo Hyyppä

Code: https://github.com/Hakkarainen/greenho

HW-Components:

[Image: greenhouse_components_v01]

[Images: IMG_0166_p, Screen Shot 2016-06-29 at 22.54.02]

[Images: greenhouse_schema_v01_bb3, IMG_0148_p]

Application:

The application has six parts:
1. Sensor handling and configuration management
2. Alarm generation and processing
3. NTP time-stamping
4. Local statistics and reporting
5. Sensor analytics on a remote web dashboard
6. MQTT-based network communication

1. Sensor handling and configuration management

Sensors are connected to the analog inputs A0-A5 of an Arduino Mega. There are three kinds of sensors: photocells, humidity sensors and temperature sensors. The sensors are calibrated using the measurement values of a reference plant pot as reference levels. If the conditions of the reference plant change drastically, new reference values are read automatically and averaged to form reference levels for the sensors in the other plant pots. Each plant type has two kinds of profiles (soil and air), defined relative to the reference levels of the reference plant.

[Screenshot: 2016-06-29 at 23.53.38]

All measurements are based on several readings taken with a suitable delay in between, which are then averaged. If measurements are repeatedly out of range at many sensors, a re-calibration of the sensors is executed automatically. The time-dependent parts of the sensor measurements are parametrized and can thus be adjusted as needed.
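
As a rough sketch of this averaging scheme (in Python for readability; the project itself runs Arduino C++, and read_raw, the sample count and the delay are illustrative assumptions):

import time

def averaged_reading(read_raw, samples=5, delay_s=0.05):
    # Take several readings with a short delay in between and average them.
    total = 0.0
    for _ in range(samples):
        total += read_raw()
        time.sleep(delay_s)
    return total / samples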

The number of sensors can be set, and their functions and connections can be managed. An example configuration application is implemented in the demo; it also prints the current and modified configurations on the console (per sensor and for all sensors) for user review.

[Screenshot: 2016-06-29 at 23.06.25]

All sensor measurements can be sent to the backend cloud via a publish/subscribe protocol (MQTT) for the pot-specific dashboard application. The system can also be configured, per sensor, into a mode that keeps only local statistics, with no need for network communication with the remote dashboard.

[Screenshot: 2016-06-29 at 23.10.37]

2. Alarm generation and processing

Each plant type has specific profiles that define its optimal living conditions (measurement ranges). There are both soil and air profiles for each plant type, and three relative zones (green, yellow and red) are defined above and below the reference plant levels. After each measurement, the value is analyzed and classified by the alarm application, and cumulative alarm statistics are maintained and printed on the local console.
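
The zone logic can be sketched as follows (Python for illustration; the zone widths are assumptions, since the real profiles are defined per plant type and separately for soil and air):

def classify_zone(value, reference, green_band=0.10, yellow_band=0.25):
    # Deviation from the reference plant's level, as a fraction of it.
    deviation = abs(value - reference) / reference
    if deviation <= green_band:
        return "green"    # optimal range: no sound
    if deviation <= yellow_band:
        return "yellow"   # warning zone
    return "red"          # serious alarm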

Alarms are reported to the user via a traffic-light-like LED display and a beeper. The number of beeps and the pitch of the ending tone tell the user which zone the measurement belongs to and how serious the alarm is. The green zone does not generate any sound, and the beeper can be switched on or off by the user. Alarms are also sent to the backend cloud via the publish/subscribe protocol (MQTT) for the pot-specific dashboard application.

On the local serial monitor, alarms are printed as plain console text that tells the user how serious the alarm is and proposes some corrective care actions.

[Screenshot: 2016-06-29 at 23.10.37]

3. NTP time-stamping

The system keeps its time stamps correct via the NTP protocol, fetching the official Internet time (Unix time, in seconds) whenever needed. Unix time is also converted into local time for console reporting.
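
For example, converting a fetched Unix timestamp to local time (a Python sketch for illustration; the timestamp and UTC+3 offset are example values, and on the Arduino this is done with integer arithmetic rather than a datetime library):

from datetime import datetime, timedelta, timezone

unix_seconds = 1467237600                    # example value only
local_tz = timezone(timedelta(hours=3))      # assumed local UTC+3 offset
local = datetime.fromtimestamp(unix_seconds, tz=local_tz)
print(local.strftime("%Y-%m-%d %H:%M:%S"))
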
4. Local statistics and reporting

The system classifies the measurements into a selected number of classes, configured during system installation. The most recent classified measurements and cumulative statistics are stored locally for console reporting. Measurement statistics and histograms covering longer periods are available. Alarms are reported likewise, per alarm zone.
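
That bookkeeping amounts to binning each measurement and bumping a counter (a Python sketch for illustration; the value range and class count are assumptions, since the real values are set at installation):

def class_index(value, lo=0, hi=1023, n_classes=8):
    # Clamp into range, then map onto one of n_classes equal-width bins.
    value = max(lo, min(value, hi - 1))
    return (value - lo) * n_classes // (hi - lo)

totals = [0] * 8
totals[class_index(742)] += 1   # cumulative counts feed the histograms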

[Screenshot: 2016-06-29 at 23.17.08]

[Screenshot: 2016-06-29 at 23.42.42]

5. Sensor analytics on the remote web dashboard

The remote web dashboard application supports creating sensor- and alarm-based publish/subscribe information feeds and configuring a set of UI components that process the data feeds for the user. The dashboard UI components can draw graphs in real time and at selected time-interval scales. New feeds and UI components can be created, and existing ones updated or removed. If a new subscription feed is created at the dashboard end, its publishing counterpart must be defined at the Arduino end, and vice versa when a new publisher is defined at the Arduino end.

In the demo, the pot-specific sensor measurements are displayed as real-time sensor graphs and as cumulative class histograms. The sensor-specific alarms are displayed in real time as pot-specific lists. The dashboard allows flexible configuration changes and information presentation.

https://io.adafruit.com/dashboards

[Screenshots: 2016-06-29 at 22.55.52 and 23.01.59]

 

6. MQTT-based network communication

Why the MQTT IoT protocol?
https://learn.adafruit.com/mqtt-adafruit-io-and-you/why-mqtt

MQTT IoT protocol: Frequently Asked Questions


6.1 This Greenhouse project uses
https://learn.adafruit.com/mqtt-adafruit-io-and-you/intro-to-adafruit-mqtt

Publish & subscribe protocol (Adafruit concept):

With MQTT, the Greenhouse system can publish data to the MQTT broker and also subscribe to data from the MQTT broker.

Adafruit CC3000 wifi + MQTT:

https://www.adafruit.com/products/1469
https://github.com/adafruit/Adafruit_MQTT_Library/blob/master/examples/mqtt_cc3k/cc3000helper.cpp

[Screenshot: 2016-06-29 at 23.32.36]

Libraries needed:

#include "Adafruit_MQTT_CC3000.h"
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

/*** Adafruit.io Setup ***/

#define AIO_SERVER "io.adafruit.com"
#define AIO_SERVERPORT 1883
#define AIO_USERNAME "...your AIO username (see https://accounts.adafruit.com)..."
#define AIO_KEY "...your AIO key..."

// Store the MQTT server, username, and password in flash memory.
// This is required for using the Adafruit MQTT library.
const char MQTT_SERVER[] PROGMEM = AIO_SERVER;
const char MQTT_USERNAME[] PROGMEM = AIO_USERNAME;
const char MQTT_PASSWORD[] PROGMEM = AIO_KEY;

// Setup the CC3000 MQTT class by passing in the CC3000 class and MQTT server and login details.
Adafruit_MQTT_CC3000 mqtt(&cc3000, MQTT_SERVER, AIO_SERVERPORT, MQTT_USERNAME, MQTT_PASSWORD);

Greenhouse project implementation using io.adafruit:

Greenhouse-project libraries for connectivity:
#include <Adafruit_SleepyDog.h>
#include <Adafruit_CC3000.h>
#include <SPI.h>

#include "utility/debug.h"
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_CC3000.h"
#include <ccspi.h>

/*** Feed examples used in Greenhouse project ********************/

// Setup a feed called 'lightsensorA0' for publishing.
// Notice MQTT paths for AIO follow the form: <username>/feeds/<feedname>
const char LIGHT_A0[] PROGMEM = AIO_USERNAME "/feeds/LIGHT_A0";
Adafruit_MQTT_Publish lightsensorA0 = Adafruit_MQTT_Publish(&mqtt, LIGHT_A0);

// Setup a feed called 'humiditysensorA2' for publishing.
const char HUMID_A2[] PROGMEM = AIO_USERNAME "/feeds/HUMID_A2";
Adafruit_MQTT_Publish humiditysensorA2 = Adafruit_MQTT_Publish(&mqtt, HUMID_A2);

// Setup a feed called 'temperaturesensorA4' for publishing.
const char TEMPE_A4[] PROGMEM = AIO_USERNAME "/feeds/TEMPE_A4";
Adafruit_MQTT_Publish temperaturesensorA4 = Adafruit_MQTT_Publish(&mqtt, TEMPE_A4);

// Alarm-feeds

// Setup a feed called 'lightsensorA0alarm' for publishing.
const char LIGHT_A0_alarm[] PROGMEM = AIO_USERNAME "/feeds/LIGHT_A0_alarm";
Adafruit_MQTT_Publish lightsensorA0alarm = Adafruit_MQTT_Publish(&mqtt, LIGHT_A0_alarm);

// Setup a feed called 'humiditysensorA2alarm' for publishing.
const char HUMID_A2_alarm[] PROGMEM = AIO_USERNAME "/feeds/HUMID_A2_alarm";
Adafruit_MQTT_Publish humiditysensorA2alarm = Adafruit_MQTT_Publish(&mqtt, HUMID_A2_alarm);

// Setup a feed called 'temperaturesensorA4alarm' for publishing.
const char TEMPE_A4_alarm[] PROGMEM = AIO_USERNAME "/feeds/TEMPE_A4_alarm";
Adafruit_MQTT_Publish temperaturesensorA4alarm = Adafruit_MQTT_Publish(&mqtt, TEMPE_A4_alarm);

// TOTALS FOR HISTOGRAMS FEEDS

// Measurement totals-feeds:
// Notice MQTT paths for AIO follow the form: <username>/feeds/<feedname>
const char A0_LIGHT_CLASS_TOTALS[] PROGMEM = AIO_USERNAME "/feeds/A0_LIGHT_CLASS_TOTALS";
Adafruit_MQTT_Publish A0lightClassTot = Adafruit_MQTT_Publish(&mqtt, A0_LIGHT_CLASS_TOTALS);

// Measurement totals-feeds:
const char A2_HUMID_CLASS_TOTALS[] PROGMEM = AIO_USERNAME "/feeds/A2_HUMID_CLASS_TOTALS";
Adafruit_MQTT_Publish A2humidClassTot = Adafruit_MQTT_Publish(&mqtt, A2_HUMID_CLASS_TOTALS);

// Measurement totals-feeds:
const char A4_TEMPE_CLASS_TOTALS[] PROGMEM = AIO_USERNAME "/feeds/A4_TEMPE_CLASS_TOTALS";
Adafruit_MQTT_Publish A4tempClassTot = Adafruit_MQTT_Publish(&mqtt, A4_TEMPE_CLASS_TOTALS);

// END OF FEEDS DEFINITION *****************************************************

 


Greenhouse monitoring system

A monitoring system based on Light Blue Bean, node.js/Johnny-Five and d3.js

Overview

IoT is a huge trend nowadays, but it doesn't necessarily guarantee real value for users. This project is basically a simple system for monitoring plants' well-being. The main focus was to present the collected data in a clear, readable format and to offer real value (usefulness) to users. One design aspect was to offer user interaction instead of just presenting monitoring data.

Features of the system:
  • Current temperature
  • Temperature trend
  • Current light level
  • Current soil moisture value
  • Previous soil moisture value
  • Alarms:
    • Temperature – adjustable low and high limit values
    • Soil moisture – adjustable low limit value
  • E-mail notifications based on user-defined alarms and measurement data

[Image: arc]

Light Blue Bean and sensors

The Light Blue Bean is an Arduino-compatible microcontroller board that uses Bluetooth Low Energy for communication. The Bean has five digital pins and two analog pins that can also be used as digital pins. It also has a built-in temperature sensor, a three-axis accelerometer and an RGB LED.

The Bean includes a small prototyping board where you can solder your sensors. I soldered a photoresistor and wires for a soil moisture sensor (a basic conductivity sensor), plus external power supply wires for a small battery pack. The power source for the Bean needs to be between 2.6 V and 3.6 V; the nominal voltage is 3.3 V. I used an old battery case with two 1.5 V AA batteries, which gave around 3.2 V when the batteries were full. The Bean ran for several days with this setup.

I installed the Bean in a small glass jar. One great advantage of the Light Blue Bean is its relatively small size and freedom from wires (if you don't need external sensors or power).

To start using the Bean, you first need to upload your program (sketch) to it. A sketch is first loaded from the Arduino IDE into the Bean Loader: open the Arduino IDE and Bean Loader, select Bean from the board list in the Arduino IDE, and upload the sketch to the Bean Loader. Then, in the Bean Loader app, simply connect to your Bean and upload the program. You might also need to update the Bean firmware, depending on the version your device is running.

I found that loading a sketch (firmata-bean in my case) onto the Bean was unreliable: sometimes it succeeded and sometimes not. I didn't do any serious testing, but I began to suspect that Bluetooth signal strength had something to do with it.

[Image: IMG_0833]

Back-end 

Johnny-Five is a JavaScript framework for controlling microcontrollers and robots. It works (almost) straight out of the box with common Arduino boards, and other boards can be used via dedicated IO classes.

The Light Blue Bean is a pure BLE device, so you can't connect it with a USB cable. Fortunately, there is an easy way to connect the Bean to Johnny-Five with the Bean-io class.

The Bean's built-in sensors are independent of the Arduino side; for polling them, Ble-Bean is a convenient solution.

You need to include all of these plug-ins in your Node.js script, and it is mandatory to load the firmata-bean sketch onto the Bean. After that it is pretty straightforward to use the Bean with Johnny-Five.

One thing to take into account when programming the Arduino side of the Bean is the different pin mapping: when using analog pins A0 and A1, you need to address pins 4 and 5.

All data was stored in Firebase, with simple user-account authentication. The data from the Bean was read and stored every two hours. When the Bean data is read, the script also checks whether the values are within the user-defined alarm limits stored in Firebase; if a sensor value is outside the limits, the Node script sends a notification e-mail to the user with NodeMailer.

Sensor readings and scaling the values 

I used an old soil moisture sensor that was already heavily corroded. I needed to adjust the value range to suit the sensor readings, which peaked at approximately half of the theoretical maximum, so I scaled the values before storing the data in Firebase.

The light level sensor was a simple photoresistor that came with the Arduino Starter Kit, and its readings also needed rescaling. The raw readings may be linear, but I decided to cut the floor at around 45% of the maximum value, which was already a very dim level. This way the displayed values were more in line with how humans perceive brightness; plants probably also need some minimum amount of light before anything happens.
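
A minimal sketch of both scalings (Python for illustration, though the project does this in the Node.js back end; the constants are assumptions standing in for the measured values):

MOISTURE_MAX = 512   # assumed observed maximum, ~half of the 1023 full scale
LIGHT_FLOOR = 0.45   # readings below ~45% of full scale count as darkness

def scale_moisture(raw):
    # Stretch the corroded sensor's usable range back to 0..100 percent.
    return 100.0 * min(raw, MOISTURE_MAX) / MOISTURE_MAX

def scale_light(raw, full_scale=1023):
    # Map readings so the assumed 45% floor becomes the zero point.
    floor = LIGHT_FLOOR * full_scale
    return 100.0 * max(raw - floor, 0) / (full_scale - floor)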

Front-end

The front end was made with D3.js and Angular.js. I first looked at ready-made reusable charts, but soon concluded that I needed to build all the graphical presentations from scratch with D3.js. If you want easily customisable charts with Angular, check out Angular-nvD3 or NVD3.js. I also included Ionic for easy platform conversions (iOS and Android); Ionic made it easy to serve the app at a given IP address so I could test the layout and usability on touchscreen devices such as an iPad.

[Image: gh_view]

The D3 part is implemented as Angular directives. The Angular controller and directives listen for changes in Firebase, and the data is updated in real time. When the user changes an alarm setting, the Angular code updates the Firebase values, which are also monitored by the back end.

Detected problems / difficulties

I already mentioned that there were occasional problems when I tried to load sketches onto the Bean.

I also noticed that polling data from the Bean's Arduino side didn't work well at high polling frequencies. The shortest interval at which the data updated reliably seemed to be around two minutes; below that, readings were usually duplicates of previous values even when they should have changed. This problem didn't affect the Bean's built-in sensors controlled by Ble-Bean, which worked just fine. There might be an easy fix, but I didn't find one. In any case, my final version read the data every two hours, and at that frequency the data seemed reliable.

Future possibilities and reflections

The user-adjustable threshold values could be used for new hardware features such as watering the plants or controlling ventilation and shading. The back end (Node.js) could easily run on a Raspberry Pi or some other low-cost device.

This project was clearly a prototype, and the hardware installation could be done better. I thought about using 3.5 mm jacks for the power and moisture sensor connections but never bought them. The glass jar, however, worked well with the photoresistor. At the code level there is also room for structural improvement, but considering the scope of this course I'm pretty happy with the results as a whole.

Code examples can be found here.

Automated Lights and Curtains

Overview
Automation is key to making life easier. In this project, we have implemented a simple Internet of Things (IoT) system that automates lighting and curtains based on triggers from the environment. The automation system can turn the lights on or off and adjust their brightness if needed, and it can also lower or open the curtains. Different sensors, such as luminosity sensors, could be used as triggers, but in this project the main trigger is network traffic.

Hardware
Philips Hue bridge + 3x lights
Intel Edison + Intel Edison Kit for Arduino
28BYJ-48 5V DC 4-Phase 5-Wire + ULN2003 Stepper Motor Driver Board
Dell D630 laptop with Netwjork W522U USB WLAN dongle
Easton Carbon One 660 arrow

Wiring the Curtains
[Image: Edison-wiring]
The stepper motor board was wired to the following pins on the Arduino kit:
IN1 -> 3
IN2 -> 5
IN3 -> 6
IN4 -> 9
GND -> Digital GND
+ -> 5V

Software
The following software and libraries were used in the project
Mraa: Low Level Skeleton Library for Communication on GNU/Linux platforms [3]
phue: A Python library for Philips Hue [4]
Scapy: the python-based interactive packet manipulation program library [6]
Dnsmasq [1]
Hostapd [2]
Google Protobuf [5]

Description of the Prototype
As the full prototype includes a lot of moving pieces, a brief description of each piece is given here.

[Image: IMG_20160630_123652]
Access Point
The central part of the system is a Dell D630 laptop. The laptop acts as a wireless access point and router for the entire system, running Ubuntu 14.04 LTS, dnsmasq [1] and iptables. The AP is created using a Netwjork W522U USB WLAN dongle, which supports master mode, and hostapd [2]. The Philips Hue Bridge is plugged into the AP using a generic USB Ethernet dongle. The Edison uses its onboard WLAN NIC to connect to the AP.

IoT Hub
The IoT Hub is a small central server which listens for commands from different monitoring elements. These elements can be, for example, sensors (with relevant software) monitoring luminosity, or network monitors. In the end, it does not matter what kind of trigger or combination of triggers is used; something needs to gather the signals and translate them into relevant actions.

The IoT Hub listens for commands through a protocol created with Google Protobuf [5]. In general, the signals are notifications of events. In an IoT environment, a signal could be a luminosity change or, in a somewhat more general setting, a network event such as a stream to a specific address or a DHCP request.

In the end, the hub takes inputs and translates them into relevant outputs for the different IoT devices, in this case the curtains and the lights.

Intel Edison and Curtains

[Image: IMG_20160630_123728]
The stepper motor is wired to the Arduino board, including power and ground. On the Edison, mraa, a low-level skeleton library for communication on GNU/Linux platforms with Python bindings [3], is used to change the state of the pins and drive the stepper motor.

A small UDP socket server listens for simple, unauthenticated UDP datagrams. When the server receives an "open" or "close" command, it runs the stepper motor in full-wave mode, clockwise or counterclockwise, for a specified number of steps. Experimentally, 700 steps with a 0.01-second interval between steps seems about right for the current gearing ratio to roll the curtain up or down.
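
A condensed sketch of that controller (Python with mraa's bindings; the pin numbers and step timing come from the text above, while the UDP port and the wave-drive coil pattern are assumptions, not the project's exact code):

import socket
import time
import mraa

coils = [mraa.Gpio(pin) for pin in (3, 5, 6, 9)]   # IN1..IN4 from the wiring
for coil in coils:
    coil.dir(mraa.DIR_OUT)

# One common single-coil wave-drive pattern.
PATTERN = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]

def run_motor(steps=700, interval=0.01, reverse=False):
    sequence = PATTERN[::-1] if reverse else PATTERN
    for i in range(steps):
        for coil, bit in zip(coils, sequence[i % 4]):
            coil.write(bit)
        time.sleep(interval)
    for coil in coils:
        coil.write(0)        # de-energize the coils when done

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("0.0.0.0", 9999))                     # port is an assumption
while True:
    data, _ = server.recvfrom(64)
    command = data.decode(errors="ignore").strip()
    if command == "open":
        run_motor(reverse=False)
    elif command == "close":
        run_motor(reverse=True)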

Philips Hue

[Image: IMG_20160630_123747]
To control the lights, an off-the-shelf Philips Hue set is used; it contains three lights and a bridge that controls them. The basic setup, i.e. associating the lights to the bridge and other initial configuration, was done through an Android app, but the actual communication with the bridge was done using the phue library [4]. A small server listening for UDP datagrams was implemented on the laptop. This time, instead of plain-text open/close messages, Google Protobuf was used to create a small protocol for controlling the lights. Currently only ON/OFF and brightness commands for all lights are supported, but finer control would be straightforward to implement.
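
In phue terms, the bridge side of those commands reduces to a few calls (the bridge IP, light IDs and brightness value below are placeholders for illustration):

from phue import Bridge

bridge = Bridge("192.168.10.2")   # press the bridge's link button on first run
bridge.connect()

lights = [1, 2, 3]
bridge.set_light(lights, "on", True)
bridge.set_light(lights, "bri", 128)    # brightness, 0..254
bridge.set_light(lights, "on", False)   # and off again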

Network Sniffer
In this prototype, network traffic sniffing was used to trigger the IoT side, i.e. the lights and curtains. For sniffing, Scapy [6], a very powerful network sniffing, packet manipulation and dissection tool, was used. The sniffer monitored network traffic, and when it detected relevant packets, it sent a Google Protobuf message to the IoT hub to trigger the corresponding actions.
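
A sketch of such a trigger with Scapy (the Chromecast address and the notify_hub stand-in are assumptions mirroring the demo described below; the real sniffer sends a Protobuf message instead of printing):

from scapy.all import IP, TCP, UDP, sniff

CHROMECAST = "192.168.10.50"   # placeholder address

def notify_hub(event):
    # Hypothetical stand-in for the Protobuf message to the IoT hub.
    print("notify hub:", event)

def handle(packet):
    if IP not in packet:
        return
    if UDP in packet and packet[IP].dst == CHROMECAST:
        notify_hub("stream_active")              # lights off, curtains closed
    elif TCP in packet and packet[IP].dst == CHROMECAST \
            and packet[TCP].dport == 8008:
        notify_hub("stream_ended")               # lights on, curtains open

sniff(filter="host " + CHROMECAST, prn=handle, store=0)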

In the demo, a UDP datagram stream to a Google Chromecast was used to trigger turning the lights off and closing the curtains. When the stream ends, the sender makes an HTTP GET request to port 8008, which was used as the trigger to turn the lights on and open the curtains.

Integration
To make things slightly simpler, most of the components either run on the AP laptop or are connected through the network. The laptop acts as a home gateway, i.e. a NAT device and DHCP server. The IoT hub, the bridge controller and the sniffer run on the laptop. The curtain controller runs on the Edison, which in turn is connected to the laptop over WLAN. The Hue bridge is connected to the laptop via an Ethernet cable.

Since all relevant communications are IP-based, the components could be spread around the network. The only device-bound pieces are the curtain controller and the network sniffer: the curtain controller has to run on the Edison to drive the motor, while the sniffer has to see the relevant traffic. Of course, multiple sniffers and curtains can be added to the system.

Problems and Future Work
Surprisingly few problems arose during the course. Some, such as the elastic drive band slipping off its wheel when opening and closing the curtains, would be trivial to fix; others are not so easy.

The heuristics used by the network sniffer are currently trivial, mainly matching packet headers and payloads against predefined rules. While basic functionality is relatively straightforward, more fine-grained detection and reactions need more work.

The main unsolved problem is the Intel Edison crashing. When the Edison is connected to mains power with a charger, it crashes with quite high probability when driving the 5 V motor. If the Edison is also connected to a laptop through USB (charging only), the crashes do not happen. One likely explanation is voltage fluctuation, but that is hard to prove.

As with many other IoT platforms, this platform currently suffers from a lack of authentication in its protocols and messages. A straightforward way to mitigate this, however, is the network setup: all network components, i.e. the controller, the Hue Bridge and the Edison, are isolated in their own network and IP address space. Other devices can at best see WPA2-encrypted WiFi traffic, and packets are not routed between the IP address spaces.

Code is available at: https://github.com/shatonen/eg16

[1] http://www.thekelleys.org.uk/dnsmasq/doc.html
[2] https://w1.fi/hostapd/
[3] https://github.com/intel-iot-devkit/mraa
[4] https://github.com/studioimaginaire/phue
[5] https://developers.google.com/protocol-buffers/
[6] http://www.secdev.org/projects/scapy/

The Pigeon Handler

From the land of Angry Birds, you can now get a whole new level of entertainment by making those birds angry yourself! If being the good guy of the story does not suit you, then join the dark (and fun) side of the story.

[Image: IMG_0269]

This prototype aims to scare unwanted pigeons off the rooftop of the Kumpula campus. The main idea emerged from wondering what I could do to help the plants in my family's garden. Initially, I thought of a common problem in our front yard, where random dogs come into the garden and pee on the grass or on our plants. Spraying water every time they come close might scare them off and keep the garden safe from unwanted visitors. At the same time, I was exploring other ideas; one involved trapping small animals or taking pictures of them, which sounded interesting but not convincing enough. After a chat and many jokes about the project, Samu Varjonen mentioned that trapping animals could be interesting, since they have a problem with pigeons on the rooftop of the campus.

That talk led to the creation of this quick summer project, though many small details were still missing. Trapping pigeons might not be the best way to get rid of them, or at least it sounds heartless, so spraying water to scare them off seemed a better option … as long as they do not start taking summer showers up there. I started looking for similar projects and found one using an Arduino and a motion detector to shoot water in a random direction once the sensor detects movement. That project contributed an important piece of information: using a windshield-washer water pump makes it possible not only to spray water but to aim at the target. You can see this project here.

With a more concrete idea and the previously mentioned project in mind, I wanted to extend it. Since I had no motion detectors at hand and was in a hurry to start, I decided to use a camera, which let me not only detect motion but also approximate the location of the target. This became the final prototype for this summer project.

Prototyping

Requirements

The software and hardware required for the project includes:

  • Raspberry Pi 2 with Raspbian
  • A camera
  • OpenCV library for the camera
  • Windshield water pump
  • Servo motor
  • Diode
  • TIP122 Transistor
  • Two 1k Resistors
  • Power supply for the motor, in this case a small laptop charger (12 V, 3 A)

Extra (Not used but could be easily extended)

  • 2k resistor
  • Extra 1k resistor
  • URM37 ultrasonic sensor

Development

First, we have to install OpenCV on the Raspberry Pi. There are tutorials available online on how to do it: the original library page can be found here, and one of the tutorials I found really useful can be found here.

Second, we have to wire up the hardware. The following diagram contains three of the four basic parts of the project. On the right side is the servo motor, which is plugged directly into the Raspberry Pi and takes only an input signal that modulates its direction. In the middle is the control circuit for the pump; its power supply was a small laptop charger that Samu helped me build. On the left side, not included in the project code but easy to add, is the ultrasonic sensor, intended to avoid shooting at distant targets and to improve aiming accuracy; this feature was emulated using the camera and is explained in the following paragraphs. The fourth part is the camera, which I plugged in via USB, so it is not included in the diagram.

[Image: Greenhouse_bb]

Once OpenCV is installed, we can use the library to create a motion detection system. In a nutshell, the system first has to choose a background image. To avoid problems with the first frames received while approximating the background, the code reads consecutive frames in grayscale, smooths them with a Gaussian blur, and computes the mean squared error between them until a sequence of frames has similar values. This calibration happens only at the beginning of the program, but I think it should be repeated once in a while to adapt to small changes in the environment. Once calibrated, the system keeps reading frames and computing the absolute difference against the background. The difference indicates motion, since it marks changes relative to the background; to ignore noise and insignificant changes, only the biggest change is considered, and it must exceed an empirical threshold to count as movement. The movement is enclosed in a circle using the library, and from the circle's position on screen and its size we can approximate the position to aim for and a naive distance to the target.
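
The core loop looks roughly like this (a Python/OpenCV sketch; the thresholds and the single-frame background are simplifications of the calibration described above):

import cv2

camera = cv2.VideoCapture(0)

def grab_gray():
    _, frame = camera.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (21, 21), 0)

background = grab_gray()   # the project averages frames until they stabilize

while True:
    delta = cv2.absdiff(background, grab_gray())
    mask = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
    found = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[0] if len(found) == 2 else found[1]   # OpenCV 2/3/4
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(biggest) > 500:                 # empirical threshold
            (x, y), radius = cv2.minEnclosingCircle(biggest)
            print("aim x=%.0f, naive distance ~ radius %.0f" % (x, radius))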

The servo position corresponding to the location of the movement in the frame is approximated with basic algebra. I measured the width of the camera's field of view at a given distance, giving a triangle with two known sides from which the aperture angle can be calculated. This angle can then be given to the servo to aim at the target.
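
That triangle gives the angle directly (a sketch with placeholder calibration values, not the project's measurements):

import math

DISTANCE = 2.0       # metres at which the view width was measured (assumed)
VIEW_WIDTH = 1.6     # metres of camera view at that distance (assumed)
FRAME_WIDTH = 640    # pixels

def aim_angle(target_x):
    # Horizontal offset from the frame centre, converted to metres,
    # then to an angle off the camera axis.
    offset = (target_x / FRAME_WIDTH - 0.5) * VIEW_WIDTH
    return math.degrees(math.atan2(offset, DISTANCE))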

In a similar fashion, we can calculate the required duty cycle for each angle: we can easily find the values that give the servo's rightmost and leftmost positions and then interpolate the rest from those.
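
That is a straight linear interpolation between the two measured endpoints (the duty-cycle values below are typical hobby-servo numbers, not the project's):

DUTY_LEFT, DUTY_RIGHT = 2.5, 12.5    # duty cycle (%) at the two extremes

def duty_for_angle(angle, lo=-90.0, hi=90.0):
    fraction = (angle - lo) / (hi - lo)
    return DUTY_LEFT + fraction * (DUTY_RIGHT - DUTY_LEFT)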


Finally, the pump only requires sending a high signal to the GPIO pin to shoot, then returning it to low after a given time. This completes the required code and the system.
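
As a sketch (assuming the RPi.GPIO library; the pin number and pulse length are illustrative):

import time
import RPi.GPIO as GPIO

PUMP_PIN = 18                         # assumed BCM pin driving the TIP122 base
GPIO.setmode(GPIO.BCM)
GPIO.setup(PUMP_PIN, GPIO.OUT)

def shoot(duration=0.5):
    GPIO.output(PUMP_PIN, GPIO.HIGH)  # transistor conducts, pump runs
    time.sleep(duration)
    GPIO.output(PUMP_PIN, GPIO.LOW)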

The source code can be found here.

Eventualities

During development I ran into some small setbacks that delayed the work, and I add them here as warnings since some of them might be crucial. First, beware of the power supply you use for the Raspberry Pi: it was the cause of many of my problems, which showed up as random errors or, in the best cases, low performance. Second, beware of the use and abuse of threads on the Raspberry Pi. Threads can improve performance in many cases, but the Raspberry Pi has its limits, so decide carefully what should run in a thread. In this project I left only the frame-capturing part in a thread, and only the capturing, since adding any calculations to the same thread stalled the system completely. Finally, while divide and conquer across tasks works great for developing and testing, keep in mind that the Raspberry Pi will struggle once everything runs at the same time, due to its limited power. This last issue made me rework half of the code to improve its performance.

Testing

Demo

Author

Angel Gallegos