I’m working on my DIY Smart Greenhouse …
- Automated heater control ✓ check
- Automated ventilation control ✓ check
- Remote monitoring and configuration ✓ check
Now that the greenhouse has a comfortably controlled atmosphere, it is time to move the tomatoes and chillies outside and into their new smart home. I prepared the earth and filled the pots; now the first plants can follow.
After soaking the seeds for two days, I transferred them to two tiny greenhouses. As soil I used compressed peat pellets, which expand when watered.
The oregano and basil are already showing leaves!
Thanks to all those organisers who made this awesome event possible!
After we set up our stand in the morning, we were busy until the very end. There was a constant buzz of kids around who wanted to try the ROV and steer it into one of the bucket-caves we had set up in the pool. Just around closing time I managed to take a quick tour of the other stands. So nice to see all this creativity around; it reminds me of the Chaos Communication Camp in 2011.
Heading out to the after-party now 🙂
PS: talking about the Chaos Communication Camp, who from Trondheim region would be interested in going there next year? I set up an etherpad here: https://pad.fnordig.de/p/trdccc15
Last preparations for the upcoming Trondheim Maker Faire 2014. In order to display the Raspberry Pi / Teensy 3.1 based submersible ROV, I bought a used 3 m diameter inflatable pool. This should give us enough space to demonstrate the little buddy.
The ROV got a slight face-lift with proper motor pods and casings for the cables. The ballast stones tied to the bottom were replaced by a rusty old chain in casings. Better solution pending …
See you at the Trondheim Maker Faire booth 10a: Map of the Faire
For the Maker Faire coming to Trondheim later this month, I’m currently building a simple submersible remotely operated vehicle. The concept is based on a plumbing tube used for the housing and submersible electric pumps as motors. A Raspberry Pi with Camera Module will deliver the live video feed via an Ethernet cable. The remote control is managed via a serial link to a Teensy 3.1 with sensors and motor controllers. The communication will be implemented using MAVLink, which enables the use of the QGroundControl station.
After initially testing whether the tube could sustain the pressure at up to 12 m water depth, pieces are now falling into place. The serial communication via MAVLink works and just needs a little performance tweaking. It can transmit the manual controls from a game-pad down to the ROV controller and the telemetry back up to the ground station. Telemetry data consists of air pressure and temperature in the body (MPL115A2) as well as orientation data gathered from the IMU/AHRS (MPU-9150). The ground station visualises the temperature and pressure as line graphs and uses the orientation information for an artificial horizon.
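For the artificial horizon, the orientation data has to end up as roll and pitch angles on the ground station side. As a minimal sketch of that step, here is how a unit quaternion (the kind of attitude representation the MPU-9150's on-board fusion can provide) maps to Euler angles. The function name and the quaternion input are my own illustration, not the actual ground-station code:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians (ZYX convention)."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp the asin argument to guard against tiny numerical overshoots
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion: a level attitude
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

The roll and pitch values can then be fed straight into the artificial-horizon widget.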
The plumbing tube of the housing is sealed with acrylic-glass windows: a 6 mm one in front of the camera and three 4 mm layers in the rear end. The three layers form tunnels for the cables going out to the motors and up to the ground control station. I hope that with plenty of silicone this will be watertight.
If anyone has a good idea how to set up a basin/pool to demonstrate the ROV in, let me know.
Four weeks to go! 🙂
As a little biology experiment, I picked random things from the forest behind the house, put them into two small plastic aquariums, and sealed everything airtight with silicone. I placed this biosphere on the window and installed a Raspberry Pi with a PiCam NoIR above it. This was in late March, and ever since (with a few brief pauses due to lack of space on the SD card) the Raspberry Pi has taken a photo from above every 15 minutes.
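The capture loop itself does not need much. Here is a sketch of how the 15-minute interval could be driven from Python by calling the stock `raspistill` camera tool; the filename scheme and the helper names are my own choices for illustration, not necessarily what runs on the actual Pi:

```python
import subprocess
import time
from datetime import datetime

INTERVAL = 15 * 60  # seconds between shots

def next_filename(now):
    """Timestamped name so the shots sort chronologically on the SD card."""
    return now.strftime("biosphere_%Y%m%d_%H%M.jpg")

def capture_forever():
    while True:
        name = next_filename(datetime.now())
        # raspistill is the standard Raspberry Pi camera CLI; -o sets the output file
        subprocess.run(["raspistill", "-o", name], check=True)
        time.sleep(INTERVAL)

if __name__ == "__main__":
    capture_forever()
```

A cron job calling `raspistill` directly would work just as well; the script form just makes it easier to add cleanup when the SD card fills up.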
Using mencoder and HandBrake I compiled a time-lapse video of the past months. It is amazing how alive these plants are and how long the balance in this isolated environment remains stable. (Don’t be fooled by the brownish appearance; this is just due to the colour shift of the infrared camera. The plants are still nearly as green as in the picture above.)
I recently ordered a bunch of I2C breakout boards to tinker around with. The first thing I implemented, using the Pi4J library, was a simple eight by eight version of Conway’s Game of Life using an I2C controlled bicolor LED matrix from Adafruit.
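The Game of Life logic itself is independent of the LED matrix driver. The post's version used Pi4J in Java; as a language-neutral sketch, here is one generation on an 8×8 wrap-around grid in Python (the grid representation and helper names are mine):

```python
SIZE = 8  # matching the 8x8 bicolor LED matrix

def step(alive):
    """One Game of Life generation on a SIZE x SIZE toroidal grid.

    `alive` is a set of (row, col) tuples of live cells.
    """
    counts = {}
    for (r, c) in alive:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                # Wrap around the edges so the matrix behaves like a torus
                key = ((r + dr) % SIZE, (c + dc) % SIZE)
                counts[key] = counts.get(key, 0) + 1
    # Survive with 2 or 3 neighbours, be born with exactly 3
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A blinker oscillates between a horizontal and a vertical bar
blinker = {(3, 2), (3, 3), (3, 4)}
assert step(blinker) == {(2, 3), (3, 3), (4, 3)}
```

Each resulting cell set maps directly onto the matrix's pixel buffer; on the real hardware the new frame is then pushed out over I2C.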
Next I built a simple two-wheeled robotic platform (RPi2C) to test the 9DOF MPU-9150 breakout board from SparkFun. This little chip integrates an accelerometer, gyroscope, and magnetometer into a single package and also includes what InvenSense calls a digital motion processor (DMP) for on-board sensor fusion.
The two wheels are driven by basic Parallax continuous rotation servos and controlled by the same I2C based PCA9685 breakout board from Adafruit that I already used on the RaspberryPylot. As the MPU-9150 DMP requires firmware to be uploaded via I2C, which I still need to implement, I’m currently using just the raw gyro data and a PID controller for very basic stabilisation. The low rotational speed of the servo motors also limits the ability to recover from disturbances. Fusing the accelerometer and gyro data, an improved controller, and maybe stronger motors should soon deliver better results.
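To make the "raw gyro plus PID" idea concrete, here is a textbook PID loop sketched in Python, with the gyro rate as the measured value. The gains and class layout are illustrative only, not the actual controller running on the robot:

```python
class PID:
    """Textbook PID controller; dt is the loop period in seconds."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Target zero rotation rate; the gyro reports 0.5 rad/s of unwanted rotation
pid = PID(kp=2.0, ki=0.0, kd=0.0)
correction = pid.update(setpoint=0.0, measured=0.5, dt=0.02)
print(correction)  # -1.0 (pure P term: 2.0 * -0.5)
```

The correction value would then be turned into differential servo commands via the PCA9685. With only the P term active the platform tends to oscillate, which is exactly where the planned accelerometer fusion and better tuning should help.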
The platform also includes a basic breadboard, a bunch of potentiometers, switches, and buttons, as well as an ADS1115 16-bit ADC and an SX1509 16-output I/O expander for happy tinkering. 😉