Rather complicated sensor network

I was working on a project recently, and it was the first one involved enough to make the sensor networking complicated. In the end, I think the communication was the bottleneck in overall performance, and I'm wondering how more experienced people would have solved the problem. This is a long read, but I think it's pretty interesting, so please stick with it. The problem was to design an autonomous blimp capable of navigating an obstacle course and dropping ping pong balls into brown box targets. Here goes:

Sensors

4D Systems uCAM-TTL camera module - UART interface
HMC6352 Digital Compass - I2C interface
MaxBotix Sonar EZ4 - 1-pin analog interface
Actuators

2x L293D motor drivers (connected to simple hobby motors) - These were used to drive 6 motors bidirectionally, and they required PWM inputs to vary the speed. Three of our motors (the ones controlling up/down movement) were always doing the same thing, so they shared 2 PWM outputs from our controllers. The other 3 motors, which controlled lateral movement, each needed individual control (for omni-directional movement), so those required another 6 PWM outputs. A sketch of the per-motor wiring follows this list.
Servo motor - PWM interface
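
To make the PWM arithmetic concrete, here is a minimal sketch of one motor channel, assuming the L293D enable pins were tied high and both direction inputs were driven with PWM (which is consistent with the 2-PWM-per-motor count above). Pin numbers are hypothetical.

```cpp
// One L293D channel, driven with two PWM lines (enable tied HIGH).
// Pin numbers are hypothetical; both must be PWM-capable.
const int IN1_PIN = 9;   // driver input 1A
const int IN2_PIN = 10;  // driver input 2A

void setup() {
  pinMode(IN1_PIN, OUTPUT);
  pinMode(IN2_PIN, OUTPUT);
}

// speed: -255..255, sign selects direction
void setMotor(int speed) {
  if (speed >= 0) {
    analogWrite(IN2_PIN, 0);
    analogWrite(IN1_PIN, speed);   // forward
  } else {
    analogWrite(IN1_PIN, 0);
    analogWrite(IN2_PIN, -speed);  // reverse
  }
}

void loop() {
  setMotor(200);   // forward at ~78% duty
  delay(2000);
  setMotor(-200);  // reverse
  delay(2000);
}
```
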
Controllers

For reasons that will become clear later, we ended up using 2x ATmega328Ps. We used an Arduino Uno to program them (we didn't have access to an ISP), but we fabbed a custom PCB so we didn't have to fly Arduino boards, since those would just add unnecessary weight to the blimp. As for why we chose the ATmega328P: I was very familiar with the Arduino environment, and I think that made the code development much quicker and easier.

Communication & Processing

2x XBee Basic
2x ATmega328P
Desktop computer running C++ w/ OpenCV
So as you can tell from the camera module, most of our project relied on computer vision. The blimp could only carry so much weight, and we didn't feel comfortable implementing computer vision on a microcontroller, so we ended up using the XBees to relay the image data back to a desktop computer. On the server side we received the image data and used OpenCV to process the image and extract what we needed from it. The server side also needed to know the height (from the sonar) and the compass heading.
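
As a rough sketch of that server side, the receive-and-process step might look like the following, with a file read standing in for bytes arriving over the XBee serial link, and with purely illustrative HSV thresholds for the brown targets (not the real tuning):

```cpp
// Server-side sketch: decode a received JPEG frame and look for
// brown-ish blobs. File I/O stands in for the XBee serial link.
#include <opencv2/opencv.hpp>
#include <fstream>
#include <vector>

int main() {
  // Stand-in for a complete frame received over the radio.
  std::ifstream in("frame.jpg", std::ios::binary);
  std::vector<uchar> jpeg((std::istreambuf_iterator<char>(in)),
                          std::istreambuf_iterator<char>());

  cv::Mat img = cv::imdecode(jpeg, cv::IMREAD_COLOR);
  if (img.empty()) return 1;  // corrupt/incomplete frame: re-request it

  // Threshold in HSV for brown-ish regions (ranges are illustrative).
  cv::Mat hsv, mask;
  cv::cvtColor(img, hsv, cv::COLOR_BGR2HSV);
  cv::inRange(hsv, cv::Scalar(10, 80, 40), cv::Scalar(25, 255, 200), mask);

  // Pick out candidate targets; steering decisions would go here.
  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                   cv::CHAIN_APPROX_SIMPLE);
  return 0;
}
```
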

The first wrinkle was that we were not able to have the camera controlled by a microcontroller, for a couple of reasons. The main issue was that the micro's internal memory couldn't handle storing an entire frame (the ATmega328P has only 2 KB of SRAM). There might have been ways around this through clever coding, but for the purposes of this question let's pretend it was impossible. So to solve the problem, we had the server side send camera commands through the XBee transceiver, and the on-board XBee receiver had its output wired straight to the camera's input.
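
For reference, the uCAM's wire protocol is built around fixed six-byte commands (0xAA, a command ID, then four parameter bytes), which is part of why it was feasible to just hang the camera off the XBee output. The byte values below are from the uCAM manual as I recall them, so treat them as assumptions and verify against the datasheet:

```cpp
// uCAM six-byte command format: 0xAA, command ID, 4 parameter bytes.
// Values recalled from the uCAM manual -- verify before use.
#include <cstdint>

const uint8_t CMD_SYNC[6]    = {0xAA, 0x0D, 0x00, 0x00, 0x00, 0x00};
const uint8_t CMD_GET_PIC[6] = {0xAA, 0x04, 0x01, 0x00, 0x00, 0x00};
// Replies: 0xAA 0x0E ... is an ACK, 0xAA 0x0F ... is a NAK.
```
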

The next wrinkle was that there are not enough PWM outputs on a single ATmega328P to control all the motors, because the I2C interface uses one of the PWM pins (damn them…). That is why we decided to use a second one. The code actually lent itself perfectly to parallel processing anyway, because the height control was completely independent of the lateral movement control (so two micros were probably better than one attached to a PWM controller). U1 was responsible for 2 PWM outputs (up/down) and reading the sonar; a sketch of its loop follows. U2 was responsible for reading the compass, controlling 6 PWM outputs (the lateral motors), and also reading the sonar. U2 was also responsible for receiving commands from the server through the XBee.
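
U1's half of the split might look something like this: read the MaxBotix analog output, compare it with the hover setpoint, and drive the two up/down PWM lines. Pin numbers, the sonar scaling, and the crude proportional control are all assumptions, not the original firmware.

```cpp
// U1 sketch: hold a target height from the analog sonar reading.
const int SONAR_PIN = A0;  // MaxBotix EZ4 analog output
const int UP_PWM    = 5;   // PWM to the lift motors, "up" direction
const int DOWN_PWM  = 6;   // PWM, "down" direction

int height2HoverAt = 50;   // inches; updated when U2 relays SET_HEIGHT

void setup() {
  pinMode(UP_PWM, OUTPUT);
  pinMode(DOWN_PWM, OUTPUT);
}

void loop() {
  // The EZ4's analog output is about (Vcc/512) per inch, so with a
  // 5 V supply and the 10-bit ADC each count is roughly half an inch
  // (assumption -- check the datasheet for your supply voltage).
  int inches = analogRead(SONAR_PIN) / 2;
  int error  = height2HoverAt - inches;

  // Crude proportional control: push up or down, never both at once.
  int effort = constrain(abs(error) * 8, 0, 255);
  if (error > 0) {
    analogWrite(DOWN_PWM, 0);
    analogWrite(UP_PWM, effort);
  } else {
    analogWrite(UP_PWM, 0);
    analogWrite(DOWN_PWM, effort);
  }
  delay(50);
}
```
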

That led to our first communication problem. The XBee DOUT line was connected to both the microcontroller and the camera. Of course we designed a protocol so that the micro would ignore camera commands and the camera would ignore micro commands, so that part was fine. However, the camera, when ignoring our micro commands, would send NAK data back on its output line. Since the command was meant for the micro, we needed some way to turn off the camera's output to the XBee. To solve this, we had the micro control two FETs, one between the camera and the XBee (the first FET) and one between U2 and the XBee (the second FET). When the camera was sending data back to the server, the first FET was on and the second FET was off. Unfortunately there appeared to be some crosstalk with this method, and sometimes when the server was trying to receive height data, for example, it would read a NAK from the XBee.
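
In software terms, the gating looked something like the sketch below: U2 mutes the camera's FET and enables its own before transmitting, then hands the line back. Pin numbers and baud rate are hypothetical, and the crosstalk we saw suggests the switching wasn't this clean in practice.

```cpp
// Sketch of U2 gating the shared XBee line with two FETs.
const int CAM_FET_PIN = 7;  // passes camera TX -> XBee when HIGH
const int U2_FET_PIN  = 8;  // passes U2 TX -> XBee when HIGH

void releaseXbee() {        // default state: camera owns the line
  digitalWrite(U2_FET_PIN, LOW);
  digitalWrite(CAM_FET_PIN, HIGH);
}

void claimXbee() {          // call before U2 transmits
  digitalWrite(CAM_FET_PIN, LOW);  // mute the camera's NAKs
  digitalWrite(U2_FET_PIN, HIGH);
}

void setup() {
  Serial.begin(57600);      // XBee baud rate is an assumption
  pinMode(CAM_FET_PIN, OUTPUT);
  pinMode(U2_FET_PIN, OUTPUT);
  releaseXbee();
}

void sendToServer(const uint8_t *msg, size_t len) {
  claimXbee();
  Serial.write(msg, len);
  Serial.flush();           // wait for the UART to finish shifting out
  releaseXbee();
}

void loop() {
  // ...command handling would call sendToServer() as needed...
}
```
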

So to give you an idea of how this worked, here are a few examples:

Server requests a picture - PIC_REQUEST goes through the XBee and arrives at U2 and the camera. U2 ignores it and the camera sends back image data.
Server has just finished processing a picture and sends motor data to tell the blimp to turn right - MOTOR_ANGLE(70) goes through the XBee and arrives at U2 and the camera. U2 recognizes it as a micro command and turns off the camera's FET (but perhaps the camera already responded with a NAK? who knows…). U2 then responds to the command by changing the motor PWM outputs, and turns the camera's FET back on afterwards (this was the default state, since image data was the most important).
Server realizes we've reached a point in the obstacle course where the default hover height needs to be 90 inches instead of 50. SET_HEIGHT goes through the XBee and the same thing happens as in example 2. U2 recognizes the SET_HEIGHT command and triggers an interrupt on U1. U1 comes out of its height-control loop and waits to receive serial data from U2. That's right, more serial data. At this point U2's FET is on (and the camera's FET is off), so the server receives the height that U2 is also sending to U1; that was for verification purposes. U1 then resets its internal height2HoverAt variable, U2 turns its own FET off, and the camera's FET goes back on. A sketch of U1's side of this handoff follows these examples.
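
U1's side of that SET_HEIGHT handoff could be sketched like this, with the interrupt handler only setting a flag and the main loop doing the serial read. The pin (INT0 on pin 2), baud rate, and one-byte payload are assumptions about the real protocol.

```cpp
// U1 sketch: receive a new hover height from U2 after an interrupt.
volatile bool heightPending = false;
int height2HoverAt = 50;    // inches

void onU2Interrupt() {      // U2 pulses our INT0 line
  heightPending = true;     // just flag it; read serial in loop()
}

void setup() {
  Serial.begin(9600);       // UART link from U2 (baud assumed)
  attachInterrupt(digitalPinToInterrupt(2), onU2Interrupt, RISING);
}

void loop() {
  if (heightPending && Serial.available()) {
    height2HoverAt = Serial.read();  // new setpoint in inches
    heightPending = false;
  }
  // ...height-control loop continues with the new setpoint...
}
```
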
I definitely left out a good amount of information, but I think that's enough to understand some of the complications. In the end, our problems all came down to synchronizing everything. Sometimes there would be data left over in buffers, but only 3 bytes of it (all our commands were 6-byte sequences). Sometimes we would lose the connection with our camera and have to resync it, as in the sketch below.
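
The camera resync we did from the server side is, as I recall from the uCAM manual, a matter of hammering the SYNC command until the camera ACKs. A sketch, with hypothetical serial helpers standing in for whatever port wrapper you use, and byte values to be verified against the datasheet:

```cpp
// Server-side resync sketch for the uCAM.
#include <cstdint>
#include <cstddef>

bool serialWrite(const uint8_t *buf, size_t len);  // hypothetical helper
int  serialReadByte(int timeoutMs);                // hypothetical, -1 on timeout

bool resyncCamera() {
  const uint8_t SYNC[6] = {0xAA, 0x0D, 0x00, 0x00, 0x00, 0x00};
  for (int attempt = 0; attempt < 60; ++attempt) {
    serialWrite(SYNC, sizeof SYNC);
    // Expect an ACK (0xAA 0x0E ...) back from the camera.
    if (serialReadByte(50) == 0xAA && serialReadByte(50) == 0x0E) {
      // The camera then sends its own SYNC, which the host should
      // ACK before use; omitted here for brevity.
      return true;
    }
  }
  return false;  // give up; flag the camera link as down
}
```
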

So my question is: what techniques would you guys suggest that would have made the communication between all those components more reliable/robust/simpler/better?

For example, I know one would have been to add a delay circuit between the on-board XBee output and the camera, so that the micro had a chance to turn off the camera's talk line before the camera responded to micro commands with NAKs. Any other ideas like that?

Thanks and I’m sure this will require many edits so stay tuned.

You would probably be best to provide at least a basic schematic (using Fritzing would be good, as this is a Fritzing forum :slight_smile: ) of the circuits (or at least of what I assume are serial ports being multiplexed somehow by the FETs). That said, my first recommendation would be to dump the 328s. A Pi Zero would be a good bet (though it may still be hard to obtain; I see they are still out of stock on Adafruit), or another 32-bit ARM CPU with plenty of RAM. That should let you do the vision processing on the micro and eliminate the radio link. Your PWM woes are easy to solve (even on a 328), if not with a full driver board then at least with a driver chip.

I believe that I2C should be able to run on pretty much any pin (SPI, not so much) as it is slow. I'm not familiar with XBee transceivers, but radio in general (unless it has internal error correction) is subject to dropouts, and you will thus lose characters to transmission errors. Your code needs to deal with that, for instance by retransmitting lost packets if the radio doesn't do so internally; see the sketch below. The real issue is likely to be two UARTs both trying to send down the same path (which sounds like what should be happening here, modulo what the FETs do, hence the request for a diagram). UARTs usually expect to be the only thing sending, and will tend to write over each other and cause transmission errors even without the radio.
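
To make that concrete, one simple hardening step is to wrap each 6-byte command in a framed packet with a length and checksum, drop anything that fails the check, and have the sender retransmit until it sees an acknowledgement. Something like this (format invented for illustration):

```cpp
// Minimal framing sketch: [0x7E][len][payload...][checksum].
#include <cstdint>
#include <cstddef>

const uint8_t FRAME_HDR = 0x7E;

// Simple additive checksum over the payload.
uint8_t checksum(const uint8_t *p, size_t n) {
  uint8_t s = 0;
  while (n--) s += *p++;
  return ~s;
}

// Build a frame into out[]; returns the frame size (len + 3 bytes).
size_t buildFrame(const uint8_t *payload, uint8_t len, uint8_t *out) {
  out[0] = FRAME_HDR;
  out[1] = len;
  for (uint8_t i = 0; i < len; ++i) out[2 + i] = payload[i];
  out[2 + len] = checksum(payload, len);
  return 3 + len;
}
```

The receiver discards bytes until it sees 0x7E, reads the length, and verifies the checksum before acting; the sender retries a few times if no acknowledgement comes back. Note that the XBee's API mode already provides framing very much like this (0x7E start byte, length, checksum) plus acknowledged delivery at the radio level, so it may be worth enabling that rather than rolling your own.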

Peter Van Epp