Today’s visit brought us to the dramatic coastal features of Scremerston, just south of Berwick-upon-Tweed. This is a tough location for assisted-requirements students, as the high, steep cliff face can be challenging for anyone to negotiate. The closest practical access is down a gravelled lane (which could be driven up at a push), stopping before the gate to a muddy field full of cows. From there, the beach site (which should be visited on a falling tide) is reached down a grassy cliff face 300m or so from the gate. From the beach access point, the geological study site stretches out along the sea front, with features jutting from the cliff base into the sea.
To make a video and audio link from the end of the gravel lane to the long beach site, we estimated that a long run of ethernet cable, about 60-70m, would make it down the cliff onto the beach. At the near end of the beach we mounted a Nanostation on a stand and pointed it along the beach, plugged into the bottom end of the ethernet cable. As near to the clifftop as the length of ethernet cable permitted, we sited our battery-powered ethernet switch in a dry-bag, our netbook running the Asterisk server, and our shiny new Thrane & Thrane satellite terminal. While we waited for the tide to expose the rest of the beach, we tested the BGAN satellite terminal to see what its uplink characteristics were, and how it would co-operate with the rest of the equipment. Since the other equipment had been set up with fixed IP addresses, and to look for a gateway at 192.168.1.1, we fixed the BGAN terminal’s ethernet interface to that address. We also forwarded some network ports on the BGAN terminal to see whether we could access services in the field from the rest of the internet.
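A fixed-IP plan like this is worth sanity-checking before heading into the field, since one mistyped address means a walk back up the cliff. A minimal sketch of such a check: only the 192.168.1.1 gateway address comes from our setup, and the individual host addresses below are hypothetical examples.

```python
import ipaddress

# Gateway address the field equipment is configured to use (from our setup);
# the host addresses below are hypothetical stand-ins for the real kit.
GATEWAY = ipaddress.ip_address("192.168.1.1")
SUBNET = ipaddress.ip_network("192.168.1.0/24")

hosts = {
    "bgan-terminal": "192.168.1.1",    # BGAN ethernet interface fixed to the gateway address
    "switch-netbook": "192.168.1.10",
    "beach-nanostation": "192.168.1.20",
    "clifftop-nanostation": "192.168.1.21",
}

def check_plan(hosts, subnet, gateway):
    """Return a list of problems with a fixed-IP addressing plan."""
    problems = []
    seen = {}
    for name, addr in hosts.items():
        ip = ipaddress.ip_address(addr)
        if ip not in subnet:
            problems.append(f"{name}: {addr} is outside {subnet}")
        if addr in seen:
            problems.append(f"{name}: {addr} already assigned to {seen[addr]}")
        seen.setdefault(addr, name)
    if str(gateway) not in hosts.values():
        problems.append(f"no host owns the gateway address {gateway}")
    return problems

print(check_plan(hosts, SUBNET, GATEWAY))  # → [] when the plan is consistent
```

An empty list means every host sits in the gateway’s subnet, no address is assigned twice, and something actually owns the gateway address.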
Setting up the satellite terminal to connect to the satellite is pretty simple considering how far away the satellites are, with visual and audio indicators of how well you’re doing. Since we have a ‘pay-as-you-go’ contract with Inmarsat, we get a dynamically allocated public IP address, which turns out to be a bit of a pain: Inmarsat deliberately make it impossible to offer services from the field to the internet in this way, while the more expensive ‘pay-monthly’ contract offers a static IP option specifically to allow it. Having to push data from the field to the internet is not ideal, but we tested it anyway using iperf and got some quite respectable results: around 200-300 kbit/s, with a lot of latency but surprisingly little jitter. We pushed some picture files from one of the netbooks up to a remote FTP server, just to prove that upstream ports weren’t blocked.
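At the 200-300 kbit/s we measured, it helps to know roughly how long a picture upload will take before committing to sending it over the satellite link. A back-of-envelope helper; the file size and 10% overhead figure here are illustrative assumptions, not measurements from the day.

```python
def upload_time_seconds(size_bytes: int, bitrate_bits_per_s: float,
                        overhead: float = 0.1) -> float:
    """Estimate wall-clock time to push a file over a link,
    allowing for protocol overhead (assumed 10% by default)."""
    return size_bytes * 8 * (1 + overhead) / bitrate_bits_per_s

# A hypothetical 1.5 MB still image over the ~250 kbit/s BGAN uplink:
t = upload_time_seconds(1_500_000, 250_000)
print(round(t))  # → 53 (seconds), i.e. close to a minute per picture
```

That near-minute-per-picture figure is why pushing a handful of stills is workable over BGAN but streaming video upstream is not.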
We then set up a pair of Nanostations on stands, one attached to the switch on the clifftop and one at the end of the gravel path near the parked car. A further Nanostation was set up at the far end of the beach, connecting wirelessly to the Nanostation at the near end of the beach. It’s important to set these two pairs of routers to different radio channels: we used channel 1 for the clifftop pair and channel 6 for the beach pair. A netbook was connected to the Nanostation at the end of the path by ethernet cable, and another netbook associated as a wireless client of the far beach Nanostation. With connectivity established between the two extreme ends of our network (the beach, or sherpa, end and the gravel path, or student, end), we used the Nanostations’ AirOS ‘stations’ tool to check the signal quality of the two Nanostation-to-Nanostation radio links.
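The channel-separation rule above follows from 2.4 GHz channel centres being 5 MHz apart while an 802.11b/g signal occupies roughly 20 MHz, so channels need to be at least 5 apart (as 1 and 6 are) to stay clear of each other. A small sketch of that check:

```python
CHANNEL_SPACING_MHZ = 5   # 2.4 GHz channel centres are 5 MHz apart
SIGNAL_WIDTH_MHZ = 20     # an 802.11b/g signal occupies roughly 20 MHz

def channels_overlap(ch_a: int, ch_b: int) -> bool:
    """True if two 2.4 GHz Wi-Fi channels overlap in spectrum."""
    centre_gap = abs(ch_a - ch_b) * CHANNEL_SPACING_MHZ
    return centre_gap < SIGNAL_WIDTH_MHZ

print(channels_overlap(1, 6))   # → False: 25 MHz apart, clear of each other
print(channels_overlap(1, 4))   # → True: only 15 MHz apart, would interfere
```

This is why 1, 6 and 11 are the usual non-overlapping trio: a third radio hop could take channel 11 without disturbing either existing pair.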
Now we were ready to show what we had really come for: two-way voice, video and still pictures to a student from a remote site. Using pre-configured user IDs to log the Twinkle VoIP client software into Asterisk, we made a ‘phone call’ from the student-end netbook to the sherpa on the beach. The sherpa and student were out of walkie-talkie range of one another at this stage; the only way to make a walkie-talkie link was to have the person at the switch location relay voice messages from end to end. The Twinkle/Asterisk combination made a loud and clear voice connection, even allowing both ends to chatter at the same time. Call quality with the lightweight stereo headphone/boom-mic headsets was hugely superior to the walkie-talkies, although in high winds we found wind noise to be a problem, and careful mic positioning (close to the mouth but slightly under the bottom lip) helps when it’s noisy or windy.
Next we started the video link using the MJPG-streamer software and the eee-mjpgstreamer script with the built-in cameras on the Asus 901 netbooks. This was easier to co-ordinate once a voice link had already been established, and clear 320×240-pixel colour video was soon going from the beach to the student. The Asus camera being mounted on the screen side of the lid proved an irritation: it’s not possible to aim the camera accurately while monitoring what it’s pointed at, because the monitor and camera then face the same way, towards the study site rather than the sherpa. With a voice and video link, the ‘student’ was able to direct the investigating ‘sherpa’ on the beach, so we then started to send higher-resolution picture files from the still camera to the student laptop’s LAMPP server suite. Our Ricoh weatherproof cameras can send picture files to an FTP server over a WiFi connection within seconds of their being taken; in fact, once the camera is set up to do this, the photographer need only press a button when prompted.
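MJPG-streamer delivers its video as M-JPEG over HTTP: a multipart stream in which each frame is a complete JPEG separated by a boundary marker. A minimal sketch of pulling frames out of such a stream; the boundary string and the two-frame in-memory example below are stand-ins for a live camera feed, not mjpg-streamer’s actual output.

```python
import io

BOUNDARY = b"--frameboundary"  # hypothetical; mjpg-streamer uses its own boundary string

def iter_mjpeg_frames(stream, boundary=BOUNDARY):
    """Yield raw JPEG byte blobs from a multipart M-JPEG stream by
    scanning for the JPEG start/end-of-image markers."""
    buf = b""
    while True:
        chunk = stream.read(4096)
        if not chunk:
            break
        buf += chunk
        while True:
            start = buf.find(b"\xff\xd8")   # JPEG start-of-image marker
            end = buf.find(b"\xff\xd9")     # JPEG end-of-image marker
            if start == -1 or end == -1 or end < start:
                break
            yield buf[start:end + 2]
            buf = buf[end + 2:]

# Synthetic two-frame stream standing in for the HTTP response body:
fake = io.BytesIO(
    BOUNDARY + b"\r\nContent-Type: image/jpeg\r\n\r\n\xff\xd8frame-one\xff\xd9\r\n"
    + BOUNDARY + b"\r\nContent-Type: image/jpeg\r\n\r\n\xff\xd8frame-two\xff\xd9\r\n"
)
frames = list(iter_mjpeg_frames(fake))
print(len(frames))  # → 2
```

Because each frame is a self-contained JPEG, a dropped frame only costs that one image; this is what makes M-JPEG over HTTP so tolerant of the patchy links we were working over.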
The last feature to be trialled at Scremerston was a different SIP/VoIP phone client: Ekiga, which is the one supplied with Ubuntu Netbook Remix. Ekiga v3.2.0 is perhaps not quite as solid or featureful as Twinkle as a voice phone, but it has the advantages of being able to make video phone calls, and all its menus fit nicely in the small screen area of the Asus 901s. Ekiga will use any Video4Linux camera device as a source and has adjustments for picture size, brightness/contrast/saturation, a choice of codecs, and a slider trading frame rate against picture quality, which determines what happens to the stream if bandwidth is restricted: should frames be dropped to retain picture integrity, or should incomplete frames be displayed at the highest frame rate? A maximum bitrate can also be set, which the software will aim to use; an estimate of the available bandwidth should inform this setting.
The local display for Ekiga can show the remote video, the local video, or both as picture-in-picture. Using video bitrates too high for the available bandwidth can cause Ekiga to crash, which leaves the student and sherpa out of contact with each other (unless there is a walkie-talkie link between them), and that can be annoying and time-consuming. In our experience at Scremerston, it’s worth agreeing a procedure for VoIP disconnections beforehand. We were fortunate to have a fairly consistent walkie-talkie link, even if the messages had to be relayed, and this saved time, but our VoIP disconnection routine became solid enough that we hardly needed the walkie-talkies by the end of the day. If the picture froze at either end, the person seeing the frozen picture would keep repeating that fact while steps were taken to improve the available bandwidth (normally by a wifi client moving to an area of better signal). If a voice link became inaudible at either end then, similarly, the person affected would keep talking, encouraging the other end to talk if they could hear, and try to establish an improved connection.
If, following a short period of silence (a minute or so), there was still no voice connection, both ends would stop their VoIP client software, restart it, and log back in to the Asterisk VoIP server. The Asterisk software shows when a client has registered, so a fresh call could safely be initiated once both clients were re-registered. Asterisk proved remarkably stable throughout, tolerating repeated logins and connection drops with only the rarest of restarts. Running Asterisk with the -cvvv options (which open a reporting console that accepts diagnostic commands and reports events at the third level of verbosity) keeps the user informed about relatively minor program events.
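The routine above (a minute or so of silence, then both ends restart and re-register) can be captured as a tiny state tracker. A sketch with hypothetical names, driven here by simulated timestamps rather than real audio detection:

```python
SILENCE_LIMIT_S = 60  # roughly the 'minute or so' allowed before restarting

class VoipWatchdog:
    """Track when audio was last heard and decide whether the VoIP
    client should be stopped, restarted and re-registered.
    (Illustrative sketch; not part of Twinkle or Asterisk.)"""

    def __init__(self, now: float):
        self.last_heard = now

    def heard_audio(self, now: float):
        """Record that intelligible audio just arrived."""
        self.last_heard = now

    def should_restart(self, now: float) -> bool:
        """True once silence has lasted at least the agreed limit."""
        return now - self.last_heard >= SILENCE_LIMIT_S

w = VoipWatchdog(now=0.0)
w.heard_audio(now=30.0)              # voice still coming through
print(w.should_restart(now=85.0))    # → False: only 55 s of silence so far
print(w.should_restart(now=95.0))    # → True: over a minute, restart and re-register
```

The value of agreeing the rule beforehand is exactly that both ends apply the same threshold, so they restart and re-register at roughly the same time instead of one end calling into a client that is still down.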
All in all, our day at Scremerston showed that a multi-hop wireless network can accommodate a wireless client running continuous voice and video links, using either VoIP video with Ekiga or M-JPEG video over HTTP with mjpg-streamer, and at the same time send higher-resolution still images to a local web server straight from the camera using FTP. Using a satellite BGAN terminal connected to the same network, those still images can be pushed to a remote web server for viewing from the internet. Channel-separated WiFi links in the same fixed-IP network, together with battery-powered ethernet switches in the field, were shown to work very well. External USB-connected webcams proved more practical than those built into the netbook screens. We extended our networking abilities and had some important practice as a team, and today’s tests demonstrate that ERA can deliver a good end-user experience at extended ranges, well out of sight of the unassisted student.