Sensoring Aquaponics


As part of my interest in aquaponics I’ve always intended to add some kind of electronic monitoring, and various thoughts on what to monitor and how have been in my head for a while. Recently I’ve been wondering whether the temperature of the water is high enough, so this was as good a place as any to start.

An Arduino was my obvious starting choice. I built various circuits, but soon realised that I wouldn’t be running Ethernet anywhere nearby, which kind of ruled out re-using existing kit. Instead I decided to use one of the NodeMCU devices I’m halfway through another project with (I’ll just buy another couple eventually).

Having previously used Lua scripts on these devices, this time I decided to use the Arduino toolchain instead, mainly to investigate the OTA update options now available. I’d already built some perf-board shields housing a connector for a DS18B20 temperature sensor, so I just added some female headers to one of the waterproof sensors I have lying around and hey presto.

I have power near the tank, so I’ve opted for an always-on solution with a permanent supply, rather than batteries and the worry of charge cycles. The WiFi connection works perfectly outdoors too.

I will be publishing the sketch on GitHub shortly (watch here for an edit); it just needs a little tidying. Essentially it reads the temperature every 10 seconds and publishes it to an MQTT topic. Fairly simple for now, but I do plan on adding more sensors for humidity, greenhouse temperature and possibly others such as luminosity, barometric pressure and pH. For now though I have enough to get going, along with the data I’m interested in.
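Once the sketch is publishing, the readings can be watched (or faked for testing) from any machine with the Mosquitto clients installed. The broker hostname, topic name and payload shape here are my assumptions for illustration, not taken from the actual sketch:

```shell
# Assumed topic layout; adjust to whatever the sketch really publishes to
TOPIC="aquaponics/tank/temperature"

# Build the kind of JSON payload the sketch might send: reading plus epoch time
make_payload() {
  printf '{"temp_c": %s, "ts": %s}' "$1" "$(date +%s)"
}

# Watch readings arrive:
#   mosquitto_sub -h broker.local -t "$TOPIC" -v
# Simulate a reading from the shell:
#   mosquitto_pub -h broker.local -t "$TOPIC" -m "$(make_payload 21.4)"
```

The commented commands need the mosquitto clients package and a reachable broker; the helper itself is plain shell.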

The data itself just goes to an MQTT stream, but I’ve developed a simple web front end using jQuery to show a gauge. I have also used a script to generate rrd data and produce rrd graphs, which can actually be seen in the sidebar on this site.
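As a sketch of the rrd side, assuming the stock rrdtool command-line tools and illustrative file and data-source names (the real script wasn’t published here):

```shell
# One-off database creation: one GAUGE data source sampled every 10 s,
# keeping a day of raw samples (8640 x 10 s)
#   rrdtool create tanktemp.rrd --step 10 \
#     DS:temp:GAUGE:30:U:U RRA:AVERAGE:0.5:1:8640

# Helper that builds the update command for a reading (handy for dry runs)
rrd_update_cmd() {
  printf 'rrdtool update tanktemp.rrd N:%s' "$1"
}

# Feed the MQTT stream straight in (mosquitto clients assumed):
#   mosquitto_sub -t aquaponics/tank/temperature | while read -r t; do
#     rrdtool update tanktemp.rrd "N:$t"
#   done
```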

Great data for now.

Here’s a couple of pics of the device and the probe in the water.


Aquaponics update

Thought I’d post a quick progress report on the aquaponics setup, as the build is pretty much complete and the growing has commenced. I left the last post with a lovely clean tank into which I had just introduced 20 goldfish. About a week later I introduced a further 20, and I’m happy to say that around 3 weeks on the fish are thriving; initially they seemed to cower in a corner, but they are now making full use of the space.

We’ve had a good mix of weather, both heavy rain to top up the tank and blistering sunshine to reduce it again. The sunshine has helped bring on a fair amount of algae growth inside the tank, though I’m assured this will subside once the nitrogen cycle really develops. It’s assisted by the fact that the tank is made of white, slightly opaque plastic, which allows sunlight to penetrate and promote the growth of the algae. This was one of the main reasons, alongside the promise of making it look nice, that I created a wooden surround for the tank.

I created the surround using two or three old pallets a friend acquired for me (he asked permission first). I think the result is pretty good, if I do say so myself.


The other update is the progress of the plants: we now have carrots, cucamelons, pak choi, broccoli, spring onions, cucumbers and lettuce growing nicely. The lettuce has bloomed immensely, mostly before the nitrogen cycle properly took effect. The rest are now starting to show progress as we see the early signs of the cycle starting.


You can see the lettuce has really come on, along with early shots of the cucamelons. These are a new fruit to me: small, grape-sized fruits which look like watermelons but apparently taste of cucumber and lime.

I’m due to retest the water in the next day or so, but last week’s tests showed good signs of the expected ammonia. I’m hoping to see signs of nitrites, or ideally nitrates, next.


Left to right: pH nice and stable at the expected value of around 7.6, ammonia showing good early signs, nitrites still showing zero, and nitrates maybe showing a trace. I’m still really enjoying the aquaponics way of growing, and am now looking at ways to expand.

My foray into aquaponics

A few weeks ago, a friend of mine and I were sharing YouTube videos around water harvesting, hydroelectricity generation and the like. As these YouTube journeys usually go, I ended up going off on a tangent and watching a video on aquaponics. I found it really fascinating, so I started looking deeper and deeper, and finally decided to give it a go. I’d always planned on getting a greenhouse for our new garden and a couple of planter boxes for growing veg, but this looked great.

One of the videos I ended up watching was Building an IBC aquaponics system. From there, and from reading various blogs and forums, I decided I would build a system based on this method. A quick search on eBay later, I found a local chap who was nice enough to deliver an empty IBC crate. These are 1000 litre containers, roughly a metre square.


The idea with IBC aquaponics is that you cut approximately a quarter off the top, remove a further quarter, and leave half. Turning the top section upside down leaves a large tank for fish and a smaller tank to grow in.



I cut mine using a metal blade on my Black and Decker Scorpion saw, which went through the frame and the plastic nice and easy. Many people use an angle grinder too. Anyway, I had mine cut and all nicely cleaned up from the stinky milk that was previously in the container. Now on to putting the pipework together.

The idea behind aquaponics, at its simplest, is that the fish fertilise the plants and the plants clean the water for the fish. With this in mind, we need some way of getting the water from the fish tank into the grow bed, and of course back again. I fashioned a frame of 21mm pipe around the grow bed with 6mm holes at 15-20cm intervals, and attached this to a 1500 litre per hour submersible pump sitting in the fish tank.


You can see the pipe roughly dropped into place in the image above. Now we had water going in, there needed to be a way for it to drain out. There are a few ways people do this; I opted for what is known as the flood and drain method, where the grow bed fills with water then almost completely drains using a bell siphon. In the photo above you can see a pipe sticking up in the centre, in preparation for the siphon.

I built my siphon based on Affnan’s design, as most people do, although I struggled to find certain parts in the UK so I just made those bits up as I went along. I used the lid of the IBC and cut a hole big enough to fit a 21.5mm push-fit tank connector through; I used a 90 degree bend version so I wouldn’t have to mess around with one further down. This fitting left me with a screw-fit connector on the topside, and I couldn’t find a suitable adapter which would take 21.5mm push-fit pipe, as the inner diameter was just too small to take the pipe directly. I ended up heating the pipe and coaxing it into the end; the result was sealed tight and seemed to work quite well.


The next part of the siphon was to cut the standpipe to size. This should be roughly an inch below the grow medium, which is itself roughly an inch below the top of the container, therefore 2 inches below the top. Over the standpipe goes another, larger pipe, in my case a piece of 68mm drain pipe with a top cap on and pieces cut out at the bottom to allow water to flow through. It’s also useful to surround this with a further, larger piece of pipe to prevent roots or grow medium clogging up the siphon; for this I used a piece of 110mm soil pipe with slits cut into it.


Here you can see pipework, standpipe and grow medium all in the top bed ready for a test run. The fish tank is also full at the bottom.

The grow medium you can see is made up of small clay balls, a firm favourite among aquaponics system builders due to their consistency and the guarantee that they won’t lower the pH of the water. It took roughly 210 litres of medium to get to the right level in the grow bed.

I gave the system a test run without the siphon fully built, to ensure the clay balls had any excess dust removed. Even though we had cleaned them thoroughly before putting them in the system, they were still quite bad, which ended up with me draining the whole system, re-cleaning the fish tank and filling it back up again. While I was at it I also emptied the grow bed and gave that a clean out too; the main reason being that I had spotted a nice idea on a forum where someone had painted the top frame black. It looked great, and now so does mine :).


I was much happier with a fully clean system, so I proceeded to refill the grow bed with medium and the fish tank with water. At this point I also had some good-sized lettuce seedlings (well, a bit bigger than seedlings, but ideal), so whacked them into the bed.


After this was done I left the pump going to start the flood and drain cycling process. I absolutely love the siphon, it’s just genius.


The bed takes around 10 minutes or so to fill up. I probably need to work on the timings there, but I have a few ideas for expanding the system slightly which should increase that to a better time of around 15 minutes. Once the siphon kicks in, full flow is achieved in a matter of seconds, and a drain usually takes around 7 or 8 minutes, which I’m really happy with.
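As a back-of-envelope check on those timings: at the pump’s rated 1500 litres per hour, a 10 minute fill implies roughly 250 litres of water per flood. The real figure will be lower once pump head and the water escaping down the drain holes are accounted for, so treat this as a sanity check rather than a measurement:

```shell
pump_lph=1500          # rated pump flow, litres per hour
fill_minutes=10        # observed fill time

# Litres delivered per flood at the rated flow
litres_per_flood=$(( pump_lph * fill_minutes / 60 ))
echo "$litres_per_flood"   # → 250
```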

I then left the system to cycle for a few days, then measured the pH of the water; this came out at around 7.4, which is pretty much spot on for goldfish.


I was thinking of leaving the system cycling for a week or two, as that seems to be the unwritten rule, but seeing as the pH was OK I also measured ammonia levels just to make sure, with those coming out at 0. So, what the hell, let’s get some fish in there. I went to the local aquarium, picked up 20 small goldfish and introduced them to the tank later that evening, taking care to acclimatise them to the tank’s water temperature by introducing small amounts of its water into their transport bag.


They have been in for a few days now and seem reasonably happy.

So there we go, quite a bit different from my usual type of project, but I’m completely engrossed. I keep meaning to build a small Arduino based device to monitor various elements of the setup, and I’m also looking at putting a time-lapse Pi alongside to watch the plants grow, so I will definitely be including some geeky tech in this somewhere along the line.

Stay tuned for future progress on the system.

LightwaveRF Comfy Review

The LightwaveRF Comfy smart home heating starter kit comes with 3 radiator valves and a LightwaveRF Link unit. All looked fairly straightforward out of the box; fortunately my radiators came equipped with Honeywell TRVs, which made attaching the LightwaveRF ones really simple. That said, a myriad of conversion attachments are supplied, so no worries there.


Being a true geek, however, I didn’t start with the valves, I went straight for the network connection. I plugged the LightwaveRF Link box into my router and power, and watched as the lights flashed in all their glory. With no idea what they were doing, it was time to consult the rather straightforward manual. There was no mention of any web interface; it seemed the first step was to download the smartphone app. No problem, most vendors head this way, after all it’s fairly easy to discover a device on a network with an application but not via a web browser (hmm, why hasn’t someone developed that?). App installed, I was ready to configure the gateway, but oh what? I was presented with a login screen. One of the first rules of IoT broken: I can’t control the device, or even set it up, without an internet connection. Not a great start, especially for those who value privacy. Later in my tests I checked whether internet connectivity was required to control the valves once the initial setup was completed; it wasn’t, but an annoying red light would flash on the gateway.

After signing my life away I quickly followed the simple on-screen instructions to pair the device with the app, and was presented with a message afterwards that a local link had been detected and all commands would be sent locally, which was kind after having to sign my life away. The brief introduction within the app is painless and useful, guiding me through the basics of getting going, and the app itself looks pretty straightforward and pleasing to the eye too.


I then ventured to my first radiator, about 3ft from where the gateway was positioned, screwed the device on and inserted a couple of batteries. The valve went through the motions of calibrating the motor, presumably some kind of learning exercise to find the length of the pin it was to control on the radiator’s tap; this took around a minute to complete. Having previously tinkered with a standalone programmable valve and been disappointed by how noisy the motor was, I wasn’t overly surprised to find that the motors in these valves were much the same. That probably can’t be helped, but having small children I would be reluctant to have a programme move the motors at any point during the night. Maybe I’m being over-cautious and further testing is needed. That said, I’m quite excited about the prospect of having the whole house heating be specific to the rooms in use, and balanced based upon current temperatures.

Pairing the devices with the gateway couldn’t be simpler: from the Android app I selected the heating tab and clicked the plus sign, which then told me to press the link button on the valve. Hey presto, I have a display. OK, hold on, it wasn’t quite that simple. Earlier I said I set up the Link unit before the valves, set up the app, then had a mooch around. The first thing I was presented with was the “Rooms” tab in the app. Thinking that a radiator would be part of a room, I added all the rooms which would contain a valve for this test. I then proceeded to add a valve to a room, but no. I knew that LightwaveRF was previously known for lighting, and it seems heating is a later addition to the portfolio, currently just bolted on to the application to “get it out there”, as it is completely separate within the application (at least on Android). Still, the configuration is all under a tab on the main screen, so it’s not difficult to find or configure.

During the pairing process a name for the device is requested; once this is entered and pairing is completed, the device is listed under the given name with a nice gauge and informational display. Pressing on this entry takes you into a screen which allows dragging of the gauge to select a temperature, after which, about a second later, the valve begins to turn to the position it requires. Manual operation within the app is pretty slick and the feedback is also very good. Underneath the manual selection tools is a calendar-looking section, albeit very cramped. This calendar, days on the left and times on the right, allows you to set specific schedules for the valve, with temperatures for different times of the day. A really good idea, just not brilliantly executed within the app, even on a large 6″ display; I believe this is where a web interface would be perfect. Back to the manual, which gives an address to visit for the web app. This turns out (after reading up on where the PIN is displayed) to only work with previous incarnations of the Link device. Disappointing.

Happy with the installation of the first valve, I ventured from the study into the lounge, literally across the hall, and fitted the second valve. I proceeded to pair the device but had absolutely no luck; I swapped batteries and valves, and even took the valve off completely to make sure it would pair closer to the gateway. My router isn’t positioned centrally in the house, and as a result neither is the Link box, but I presumed that if it were, pairing would be less of an issue. Taking into account that the majority of households have a router near the TV, or at least in the lounge, this would be an issue for most. I experimented around the house, finding that it wouldn’t pair anywhere on the same side of the house as the lounge; this is where a mesh network such as ZigBee or Z-Wave would triumph. Relegated to the west wing (OK, my house is nowhere near as big as you are thinking right now) I proceeded to pair the remaining two valves, again without issue. Interestingly, the furthest away valve seemed to refuse to report battery information back to the gateway. Valves were positioned in the study, the hallway and a bedroom above the study.

Setting a simple schedule with varying temperatures, low in the hallway, high in the study and normal in the bedroom, I found the valves responded really well. Unfortunately I could tell they were responding, as I could hear them slowly moving the motors to the desired setting. Once set, however, they would only move should the room temperatures vary enough. What I was unsure of was how accurate the temperature readings were, given the sensors are so close to the radiators themselves, although a manual compensation of a few degrees could probably be added without issue. The other scenario which bothered me was that my house already has a thermostat: if it were taking readings from a room with a valve which closed at a particular temperature, causing the thermostat to report a lower temperature than desired, the two could end up fighting it out. This would need to be seriously calibrated, or (as I’m sure is the plan) the LightwaveRF boiler switch introduced. This made the starter kit experience feel a little incomplete, but I suppose being a starter kit it’s the gateway drug.

Looking at the prices of the other elements of the setup, the boiler switch, wireless thermostat (would this be needed?) and more valves, they all seemed reasonably priced, but given that I struggled with signal strength across my house I would be reluctant to invest further.

All in all the LightwaveRF Comfy experience was pretty reasonable. Initial setup was extremely simple, although I felt the need for a login was a serious failure; taking the Philips Hue system as an example, you only need an account to benefit from the online features, and this should be the standard approach for connected devices. The fact that the lighting and heating parts were obviously separate in the app made the system feel less joined up, and I couldn’t find a hint of zoning or grouping of heating devices, where this seemed to be in place for lighting and would surely be beneficial. The included manual was extremely simple to follow and was only let down by the false information about the web app; the majority of modern devices tend not to include a written manual, but in this case it was useful. It feels like this device suite is in its early infancy, and it will be interesting to see whether the system improves in functionality; the consolidation of heating and lighting within the app is a must too. Fortunately all the device’s shortcomings are in software, not hardware, and therefore could be resolved, but the question is: will they be?

Pros:
Decent looking
Simple configuration
Simple pairing of devices
App navigation
Useful, straightforward manual

Cons:
Noisy motors
Control only via smartphone app
Needs internet for setup
Annoying red flashing light if no internet is available
No local web interface to control or configure via
Manual contained wrong information about the web app



Maplins weather station fun

[Images: weather station control unit; weather station sensor pole]
A while ago I bought one of the Maplin weather stations, just for fun really, with no plans for it. The device came with a pole to which several sensors could be attached: temperature, humidity, wind speed, wind direction and rain level. The other important bit in the box was the control station, a large LCD-based display for the information gathered by the sensors, which also attempts forecasting. Areas of the screen are pressure sensitive, allowing a touchscreen-style interface for changing views etc.

All pretty cool, and it looks great on the worktop displaying weather info. But the geek inside of me wants more!!! Along with the aforementioned components comes a USB cable and a CD-ROM; the brains of the unit also does data collection, which can be extracted using the software on the CD.
[Image: weather station in situ]
That’s also cool, but unfortunately no Linux software is included on the disk. Not to worry, pywws from Jim Easterbrook steps in here. I won’t go into detail on how I set this up as there is a good tutorial available, but what it gives you is a way to extract the weather data from the brains of the weather station, store it on a PC, then display it through a pretty web interface. It gives all sorts of graphs and fancy data tables.

This was quite cool, but it got me thinking about what to do with this data. Could it be uploaded to some weather data aggregation service for the greater good? That’s where I found Weather Underground, or Wunderground, and their API. Since finding it I’ve been using it to both upload my data and retrieve data for various projects (Nagios monitoring, temperature monitoring). Pywws has the ability to post data to Wunderground built in. Great, I had a nice solution: the head unit connected via USB to a server in my garage. I can’t see the unit, but I can see the data via pywws.
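For anyone wanting to script an upload without pywws, a PWS upload boils down to a single HTTP GET. The endpoint and parameter names below follow the Wunderground PWS upload protocol as I understand it; treat them as assumptions and check the current documentation before relying on them:

```shell
# Historical Wunderground PWS upload endpoint (assumption; verify before use)
WU_BASE="https://weatherstation.wunderground.com/weatherstation/updateweatherstation.php"

# Build the upload URL: station ID, password, temperature in Fahrenheit
build_wu_url() {
  printf '%s?ID=%s&PASSWORD=%s&dateutc=now&tempf=%s&action=updateraw' \
    "$WU_BASE" "$1" "$2" "$3"
}

# Fire a reading off:
#   curl -s "$(build_wu_url MYSTATIONID secret 68.5)"
```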

It was all going well, that is until the batteries in the head unit ran out and the project headed back onto the todo pile. That was until I was working on a separate project, trying to retrieve the RF signals of 433 MHz home plugs, and saw some random traffic. After thinking it must be a neighbour or a car’s remote locking, I realised the data was being sent on a regular basis. After a bit of googling it eventually dawned on me that the data might be from the sensors on the weather station, as the batteries were still active on that side. A little more googling later, and I found that I was able to not only retrieve that data but also decode it into readable form.

To explain: I am using an RTL DVB dongle, popular in software defined radio circles as its frequency can be set to pretty much anything. I haven’t delved into the SDR side of things, but I’ve been using it to scan the 433 and 868 MHz frequencies to try to retrieve data from RF control devices within my home. Using the dongle and the RTL-SDR software I can achieve this.

My aforementioned googling around the weather station data led me back to a project I was already using, ook-decoder. OOK (on-off keying) is the form of modulation used in many RF devices, and seemingly in the weather station. The ook-decoder project also comes with another executable, wh1080, which it turns out matches my weather station.

The project was written by a chap who has a remote weather station, so he was sending the data over a multicast network, receiving it locally and then decoding it. This makes it a bit overkill for what I’m doing, but it works. Essentially, a machine sits within the vicinity of the weather station sensors with the DVB dongle attached, running ookd, which tunes to the 433 MHz frequency by default and streams the raw data over the network. On the receiving end, a PC either runs ookdump to output the stream of OOK data raw, or runs the wh1080 binary, which takes the stream of OOK data and outputs a JSON formatted text file.


The reason it’s overkill for me is that I run both ookd and wh1080 on the same box. I’ll probably figure out combining the two at some point to reduce the load, but for now it works.
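With everything on one box, other scripts only need to read the JSON file wh1080 writes. The field names below are illustrative (inspect your own /tmp/current-weather.json for the real ones); a minimal extraction with sed looks like this:

```shell
# Illustrative sample of the sort of flat JSON wh1080 writes;
# your field names may differ
cat > /tmp/current-weather.json <<'EOF'
{"temperature": 18.2, "humidity": 64, "windSpeed": 3.1}
EOF

# Pull one numeric field out of the flat JSON
get_field() {
  sed -n "s/.*\"$1\": *\([0-9.]*\).*/\1/p" /tmp/current-weather.json
}

get_field temperature   # → 18.2
```

For anything beyond flat key/value output, a proper JSON parser such as jq would be the safer choice.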

The next step was to do something useful with that data, so I wrote a fairly rough Python script to upload it to Wunderground. I’ve also made wh1080 output its file to a directory which is accessible via a web server, to make it useful to scripts I write for other projects. Having the data stream directly to the server also means I can relocate the previously USB-tied control station back into the house. Double win!

So in a nutshell the process to get weather data directly from a Maplin weather station:

  • Install rtl-sdr as per the project’s instructions (don’t forget to blacklist the DVB driver)
  • Install ook-decoder as per the project’s instructions
  • Run ookd and wh1080 (don’t forget to run them with & at the end to get back to the console); this outputs to /tmp/current-weather.json by default
  • Create a Weather Underground account and create a personal weather station ID
  • Use my Python script with the ID generated above and your password to upload the data


Maplin weather station

Jim Easterbrook’s pywws

Jim Studt’s ook-decoder (and wh1080)

My script to publish JSON data to Wunderground

Ensure systemd services restart on failure

I wrote a post a while ago covering the use of Monit to monitor running services, and the use case I covered was ensuring those services restarted on failure. While that’s a useful feature of Monit, it now seems a little redundant, as systemd has a built-in restart feature.

It’s the same use case, where MySQL (or MariaDB in this case) is being killed by the OOM killer thanks to Apache’s memory usage.

I first copied the original systemd unit file for MariaDB from /usr/lib/systemd/system/mariadb.service to /etc/systemd/system/mariadb.service

Then, under the [Service] section in the file, I added the following 2 lines:

Restart=on-failure
RestartSec=5s
After saving the file we need to reload the daemon configuration so systemd is aware of the new file:

systemctl daemon-reload

Then restart the service to enable the changes.

systemctl restart mariadb

You can test this configuration by sending a kill signal to the process, e.g.:

 ps -ef|grep maria
mysql    22701 22542  0 06:52 ?        00:00:01 /usr/libexec/mysqld --basedir=/usr --datadir=/var/lib/mysql --plugin-dir=/usr/lib64/mysql/plugin --log-error=/var/log/mariadb/mariadb.log --pid-file=/var/run/mariadb/ --socket=/var/lib/mysql/mysql.sock
kill 22701
watch "ps -ef|grep maria"

You should see the process restart.
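You can also confirm the policy is in effect without killing anything. `systemctl show` prints key=value pairs, so a small helper can pull the value out (the parsing itself is plain shell and works offline):

```shell
# Strip the "Restart=" prefix from a `systemctl show -p Restart` line
get_restart_policy() {
  printf '%s\n' "${1#Restart=}"
}

# On the live system:
#   get_restart_policy "$(systemctl show mariadb -p Restart)"   # → on-failure
```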

Hardening SSH with OTP for 2 factor authentication

Something I’ve been meaning to do for a while is look into the possibility of using 2 factor authentication, or 2FA, with SSH connections. This would add a much needed level of security to servers I host out in the wild.

Here’s how I did it:

The Google Authenticator mobile app used to be an open source project; it isn’t any more, but the project has been kindly forked and looked after by Red Hat under the guise of the FreeOTP project. The first step is to download the app, which is available for Android and iOS; there is even a Pebble project in the works.

Google Play:


Next we need to configure PAM, the component in Linux which ties authentication together. It allows us to add modules for various authentication sources to applications which require authentication; in this case we need a module compatible with FreeOTP to provide authentication for SSH.

We’ll be using pam_oath for this; the OATH Toolkit is designed for building one-time password based systems.

yum install pam_oath oathtool gen-oath-safe

This gives us the tools needed to link in to pam, and also generate the initial keys to share between the devices.

Next we need to edit /etc/pam.d/sshd to recognise this module, by adding the following line to the top:

auth sufficient pam_oath.so usersfile=/etc/liboath/users.oath window=10 digits=6

Notice we specify a users file; this is where the users who use OTP will have their details stored. Once this is saved we need to restart sshd:

service sshd restart

or

systemctl restart sshd

So that’s SSH configured. From now on, when a user logs into the system via SSH, they will be prompted for a one-time password.

Next up, we need to generate the keys to be shared between the target host (SSH) and the client generating the OTP (the Android or iOS app):

gen-oath-safe jon hotp

Replace jon with your username. hotp denotes the type of key to be generated: hotp is a counter-based key and totp a time-based one; the choice is yours here.
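To see what the app and pam_oath are actually agreeing on: HOTP (RFC 4226) is just an HMAC-SHA1 of a big-endian counter, dynamically truncated to 6 digits. A shell sketch using openssl, shown here with the RFC’s own test key (“12345678901234567890” in hex), not a key you should ever use for real:

```shell
# Compute an RFC 4226 HOTP code: hotp <hex key> <counter>
hotp() {
  key_hex=$1 counter=$2
  # 8-byte big-endian counter, emitted as raw bytes via octal escapes
  cnt_hex=$(printf '%016x' "$counter")
  bin=""
  for i in 0 2 4 6 8 10 12 14; do
    bin="$bin$(printf '\\%03o' "$((16#${cnt_hex:$i:2}))")"
  done
  # HMAC-SHA1 over the counter bytes with the shared key
  hmac=$(printf "$bin" | openssl dgst -sha1 -mac HMAC -macopt hexkey:"$key_hex" \
         | awk '{print $NF}')
  # Dynamic truncation: low nibble of the last byte picks a 4-byte window
  offset=$((16#${hmac:39:1}))
  dt=$(( (16#${hmac:$((offset * 2)):8}) & 0x7fffffff ))
  printf '%06d\n' $(( dt % 1000000 ))
}

hotp 3132333435363738393031323334353637383930 0   # → 755224 (RFC test vector)
```

Both the app and the server run this same calculation; the counter (or, for totp, the current time window) is what keeps them in step.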

This command will generate a number of codes: HEX, Base32, QR and Yubikey. The ones we are interested in are the HEX and the QR:


From the app select the QR scanning option on the top tool bar:


Scan the generated QR code which will then store the key in the application.

The final step is to add the HEX code to the users file we referenced earlier in the sshd PAM config. Drop the following line into /etc/liboath/users.oath (making sure you use your generated key and username):

HOTP jon -  da50cc2e1ee6726c847c5b960a62751e9bbea3a9

Once that file is saved we can go ahead and login via ssh with the specified user. You will now be prompted for a One-time password which can be generated by pressing on the entry within FreeOTP.

Note: if SSH keys are set up then these will be preferred over OTP. I’m sure a modification of the PAM config would allow for both, but I haven’t spent any more time on this yet.




Etckeeper – config version control

A valuable tool I have been using for many years is etckeeper. It works by essentially turning your /etc directory into a git repository.

This is a fantastically useful set of tools, as any configuration change can be logged and also reverted quite easily. Install and setup is exceptionally easy too!

Packages are available for most distributions; in my scenario (Fedora, CentOS, RHEL) it was:

yum install etckeeper

Once the package is installed, an initialisation must be performed:

etckeeper init

This essentially runs a “git init” in the /etc directory, setting it up ready for use.

That’s all there is to the installation.

Using it is a matter of committing changes when they are made. My workflow generally consists of running a check to see whether all previous changes were committed, making the change, then committing the change.

etckeeper unclean

will check the /etc directory for uncommitted changes. If any exist, they can be committed in the same way as any new changes:

etckeeper commit

Running this command will present the familiar commit log screen in your favourite editor, as it is essentially running a git commit from within the /etc directory. Once the commit log is saved, the changes are stored within the version control system. A cron job is also in place to ensure a daily commit takes place, in case any commits have been missed.
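The unclean/commit cycle is just git underneath, which you can convince yourself of in a scratch repository. This is a simulation of what etckeeper does, not etckeeper itself, and assumes git is installed (which etckeeper requires anyway):

```shell
# Scratch repo standing in for /etc
repo=$(mktemp -d)
cd "$repo" || exit 1
git init -q .
git config user.email you@example.com
git config user.name "You"

# Initial "config" state, committed
echo "ServerName example.com" > httpd.conf
git add -A && git commit -qm "initial state"

# Make a change: the tree is now "unclean",
# which is what `etckeeper unclean` detects
echo "KeepAlive On" >> httpd.conf
git status --porcelain        # non-empty output

# Commit it, as `etckeeper commit` would
git add -A && git commit -qm "enable KeepAlive"
git status --porcelain        # empty: everything is under version control
```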

Now this is cool and extremely useful, but extending the git element to push to a remote repository gives your /etc that extra bit of resilience. A hook script is already present at /etc/etckeeper/commit.d/99push, which recognises whether a remote repository is configured and pushes to it on commit. Adding a remote repository is fairly simple; in my case I push to a GitLab (think self-hosted GitHub) server which I run.

First up, a repository needs to be created to push to. I won’t go into detail here as there are hundreds if not thousands of Git tutorials out there. My GitLab has a repository created for each server, with the SSH public key of each server stored to enable access.

cd /etc
git remote add origin git@gitlab01:etckeeper/server01.git
git push -u origin master

will set the remote repository and populate it.

The last element to configure is the etckeeper config file, changing:

PUSH_REMOTE="origin"

(or whatever remote you choose to use)
And that’s it! You now have an amazingly simple piece of software which could potentially save your Apache server, your Dovecot server or maybe even your job!


SSH known hosts verification failure one liner


Those who regularly build and rebuild machines or virtual machines on a DHCP network will probably be faced with this quite often. It happens because the fingerprint recorded for a previous host differs from that of a new host which has acquired the same IP address.

Someone could be eavesdropping on you right now (man-in-the-middle attack)!
It is also possible that a host key has just been changed.
The fingerprint for the ECDSA key sent by the remote host is
Please contact your system administrator.
Add correct host key in /root/.ssh/known_hosts to get rid of this message.
Offending ECDSA key in /root/.ssh/known_hosts:66
ECDSA host key for has changed and you have requested strict checking.
Host key verification failed.

There is an option to have SSH ignore these when connecting, however I find cleaning out the old line before connecting far quicker, and I do this with a sed one-liner.

The line number we are interested in within the known_hosts file can be found at the end of this line of the error:

Offending ECDSA key in /root/.ssh/known_hosts:66

66 in this case, so we can get sed to simply delete that line using:

sed -i '66d' ~/.ssh/known_hosts

An SSH session can now be opened without the host key verification failure.
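The same cleanup can be scripted so the line number never has to be read by eye. This is just a sketch: `offending_line` and `clean_known_hosts` are made-up helper names, and the in-place `sed -i` flag assumes GNU sed:

```shell
#!/bin/sh
# Sketch: pull the offending line number out of ssh's error message and
# delete that line. Helper names are hypothetical; sed -i assumes GNU sed.
offending_line() {
    # Reads ssh's error text on stdin, prints the number after "known_hosts:"
    sed -n 's/.*known_hosts:\([0-9][0-9]*\).*/\1/p' | head -n 1
}

clean_known_hosts() {
    # $1 = host to probe; try once, then delete whatever line ssh complains about
    line=$(ssh -o BatchMode=yes "$1" true 2>&1 | offending_line)
    [ -n "$line" ] && sed -i "${line}d" "$HOME/.ssh/known_hosts"
}
```

Usage would just be `clean_known_hosts server01` before connecting normally.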

Hope this helps someone.

Getting Fedora 21 on the Raspberry Pi 2

The recently released Raspberry Pi 2 uses a newer version of the ARM architecture: its ARM Cortex-A7 implements ARMv7, whereas the ARM11 in the previous models implements ARMv6. The great thing about this is that the majority of Linux distros already provide an image for this architecture. More importantly, Fedora already has images.

There is a slight caveat to the above statement, however: the images won’t just work on the Pi 2 as-is. The process to fix that isn’t difficult either, just a few steps:

  1. Download the image you require; for this we’ll go with the Fedora 21 minimal image –
  2. Flash the image to an SD card: xzcat Fedora-Minimal-armhfp-21-5-sda.raw.xz | dd of=/dev/mmcblk0 bs=1M
  3. Make sure the card is unmounted
  4. fdisk the card:
    1. remove partition 1
    2. add a new partition where the old partition 1 was, with type B (FAT32)
    3. write and exit
  5. mkfs.vfat /dev/mmcblk0p1
  6. Clone the Pi firmware repository – git clone
  7. Mount the card again
    1. mkdir /mnt/sdcard
    2. mount /dev/mmcblk0p3 /mnt/sdcard
    3. mount /dev/mmcblk0p1 /mnt/sdcard/boot
  8. Copy the contents of the boot directory from the repository you just cloned to the new boot directory and the kernel modules to the lib/modules directory on the main root partition
    1. cp -r firmware/boot/* /mnt/sdcard/boot/
    2. cp -r firmware/modules/3.18.7-v7+/* /mnt/sdcard/lib/modules/
  9. Edit the fstab file to reflect the new UUID of the partition and change from being an ext to a vfat type
    1. blkid /dev/mmcblk0p1 – this will give the UUID of the partition
    2. vi /mnt/sdcard/etc/fstab and edit the line which contains /boot to contain the above info
  10. Create a /mnt/sdcard/boot/cmdline.txt file containing the following:

    dwc_otg.lpm_enable=0 console=ttyAMA0,115200 console=tty1 root=/dev/mmcblk0p3 rootfstype=ext4 elevator=deadline rootwait

  11. Create a /mnt/sdcard/boot/config.txt file containing the following:

    #uncomment to overclock the arm. 700 MHz is the default.
    arm_freq=700
    # NOOBS Auto-generated Settings:
  12. save and close any open files on the sd card then unmount and ensure all writes are complete
    1. umount /mnt/sdcard/boot
    2. umount /mnt/sdcard
    3. sync
  13. You should now be able to remove the SD card from your PC and boot it in your new shiny Raspberry Pi 2
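Step 9’s fstab edit can also be scripted rather than done by hand in vi. A sketch, assuming the /boot entry is the only line mentioning /boot and that GNU sed’s -i flag is available; `fix_boot_line` is a made-up helper name:

```shell
#!/bin/sh
# Sketch of step 9: point the fstab /boot line at the new vfat partition.
# fix_boot_line is a hypothetical helper; on the real card the UUID comes
# from `blkid -o value -s UUID /dev/mmcblk0p1`.
fix_boot_line() {
    # $1 = path to fstab, $2 = UUID of the new vfat boot partition
    sed -i "s|^[^#]*[[:space:]]/boot[[:space:]].*|UUID=$2 /boot vfat defaults 0 0|" "$1"
}
```

For example: `fix_boot_line /mnt/sdcard/etc/fstab "$(blkid -o value -s UUID /dev/mmcblk0p1)"`.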

I’m sure it won’t be long before dedicated images are available, but for now this seems to work for me. I haven’t tried anything more than the minimal install, so with the other images your mileage may vary.

Note: Please remember this will only work on the newer Raspberry Pi 2.


Extra steps suggested by Tim Bosse

14. Install rpi-update.
Install binutils and tar.

Download and set up rpi-update.

# curl -L --output /usr/bin/rpi-update && sudo chmod +x /usr/bin/rpi-update

15. Run rpi-update as root.

I find this is important to run any time you get kernel updates from Fedora repos.

I have a wireless USB dongle that I use.

16. Install NetworkManager-wifi and NetworkManager-tui (because I find nmcli not so much fun).

I’ve created an image based on steps 1-13. It’s fairly rough and ready, so YMMV.