Summer 2025 internship: Full Sensor Development Report - By Gaudeor Rudmin
For the summer of 2025, I was invited by Joe Betit of Earth Systems Management to an internship with the goal of developing a sensor capable of identifying seafloor sediments in shallow coastal waters. The commercial incentive behind this initiative is to allow oyster farmers to identify the sand beds where they plant their oyster spawn, without having to manually wade around to find suitable beds, which is their current approach. Earth Systems Management develops autonomous sensor platforms, which currently include a sonar mapping system. While sonar can be used for underwater material classification, its map resolution is limited by the long wavelength of the sound waves it uses: longer wavelengths produce lower resolution images, while shorter wavelengths produce higher resolution images. Green lidar, the sensor type selected for my internship's initiative, has a wavelength several orders of magnitude shorter than sonar's, so the theory was that we would be able to detect fine details in the transition from muddy silt to lighter colored sand.

The sensor we used was a small, cheap, USB-powered laser ranging device from Chengdu JRT Meter Technology Co., Ltd. The first goal was to establish a way to read the lidar measurements. Sample code was provided with the documentation. With the hexadecimal serial codes from the documentation and the help of AI, I developed a script that could instruct the device to take a single measurement and then read back the distance and the “signal quality,” which seemed to be a proxy for the return frequencies seen by the sensor. The signal quality was the basis for our material classification, so to test the theory that different materials could be identified by the device, we ran a few tests in air (as opposed to underwater, since the device did not yet have a waterproof enclosure) by placing paper and plastic in front of the device. At the same distance, the signal quality showed a significant difference between material types, which established that the theory was sound in that regard.
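The single-measurement script followed roughly the shape sketched below, assuming pyserial on the Raspberry Pi. The command frame, baud rate, response length, and byte offsets shown here are placeholders standing in for the hexadecimal codes in the JRT documentation, not the actual protocol values.

    # Minimal single-measurement read, assuming pyserial. The command frame,
    # baud rate, response length, and byte offsets are placeholders; the real
    # values come from the JRT documentation.
    import serial

    PORT = "/dev/ttyUSB0"                      # assumed device path on the Pi
    SINGLE_SHOT_CMD = bytes.fromhex("AA0100")  # placeholder command frame

    def read_single_measurement(port: str = PORT) -> tuple[float, int]:
        """Trigger one measurement and return (distance_m, signal_quality)."""
        with serial.Serial(port, baudrate=115200, timeout=2) as ser:  # assumed baud rate
            ser.write(SINGLE_SHOT_CMD)
            frame = ser.read(13)                        # placeholder frame length
            if len(frame) < 13:
                raise RuntimeError("incomplete response from lidar")
            distance_mm = int.from_bytes(frame[6:10], "big")   # placeholder offsets
            quality = int.from_bytes(frame[10:12], "big")
            return distance_mm / 1000.0, quality

    if __name__ == "__main__":
        dist, quality = read_single_measurement()
        print(f"distance: {dist:.3f} m, signal quality: {quality}")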
In clear air, the distance and signal quality together would have been sufficient to identify the material the laser struck. Underwater, the problem is more complicated due to particulate matter suspended in the water. The most direct way to measure the ability of the laser to penetrate the water is with a Secchi disk: a black and white patterned disk is lowered into the water until it is no longer visible, and the depth at which it disappears is recorded as the Secchi depth. Another way to characterize the ability of light to pass through the water is turbidity, which measures the haziness of the water. Turbidity is easy to measure electronically while Secchi depth is not. Unfortunately, the conversion from turbidity to Secchi depth is not straightforward, for several reasons. Ambient light can easily affect how deep into the water the disk can be seen (which is why attempts are made to standardize the time of day and atmospheric conditions under which a Secchi depth measurement is made). Suspended particle size and reflectivity are also key factors. Conversion models between Secchi depth and turbidity are therefore often experimentally determined and only apply to the local water body for which they were created.
Turbidity meters are often quite expensive, lab-grade instruments. However, a lower quality turbidity meter can be found in a washing machine, and such modules are readily available for Arduino for around $10. These simply output a varying voltage level, as a fraction of the input voltage; higher voltages indicate lower turbidity. The next step in the project was to develop an interface to such a turbidity sensor so that the distance and signal quality of the lidar sensor could be qualified with the light absorption properties of the water body.
We were using a Raspberry Pi Zero 2 W to read the lidar, and to start with we attempted to attach the turbidity sensor directly to the Pi using soldered connections to an analog-to-digital converter (ADC). I made a case from 2-inch PVC pipe and duct tape, and sealed the sensor into it. Then I tried to read the sensor, but reading the voltage level didn't work, possibly due to faulty connections, but more likely because I had installed the cord between the two chips of the turbidity sensor backwards, as I discovered later. I also attached the wires to the Pi backwards, and thought I had fried the turbidity sensor.
I was familiar with using an Arduino to read a sensor signal and send it to a Pi, so as a simplification step, I decided to get cheap Arduino Nano boards and use one to read the turbidity sensor. I also ordered two new turbidity sensors. I attached the turbidity sensor to the Arduino Nano's analog pin, and it was easy to read the voltage level. I made the Arduino automatically send the data in JSON format over the USB serial interface. While breadboard-testing the turbidity sensor, I discovered that I had indeed installed the cord backwards the first time. Unfortunately, the first sensor had already been sealed into its case and was practically inaccessible. This time I was able to read a varying voltage when an obstacle was placed between the prongs of the turbidity sensor. I noticed that the reading also varied with the light conditions.
Having ruined the first sensor, ultimately because its case was inaccessible, and knowing that the sensor had to be kept in the dark to avoid light contamination, I went to the hardware store and designed an opaque, repairable, and accessible case. I made the case in two sections: a waterproof container made from PVC pipe and a cleanout, with a threaded cap at one end and the sensor at the other, and another part which fitted over the sensor, blocking light but allowing water to flow through. I painted these parts flat black to block light.

The sensor pretty much just worked, in that it could indicate varying turbidity with varying voltage levels. It sent the voltage level over USB in JSON format.
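On the Pi side, reading that output is just a matter of parsing JSON lines from the serial port. A minimal sketch, assuming pyserial and a line-delimited payload such as {"voltage": 3.87} (the field name is illustrative, not necessarily the one the Arduino actually sends):

    # Read one turbidity voltage value from the Arduino's JSON serial stream.
    import json
    import serial

    def read_turbidity_voltage(port: str = "/dev/ttyACM0", baud: int = 9600) -> float:
        """Block until one valid JSON line arrives and return the reported voltage."""
        with serial.Serial(port, baud, timeout=2) as ser:
            while True:
                line = ser.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                try:
                    packet = json.loads(line)
                except json.JSONDecodeError:
                    continue                    # skip partial lines seen at startup
                if "voltage" in packet:
                    return float(packet["voltage"])

    if __name__ == "__main__":
        print(f"turbidity sensor voltage: {read_turbidity_voltage():.3f} V")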
With the turbidity sensor in hand, I used AI to write a few simple, testable Python scripts that logged the turbidity data along with the lidar distance and signal quality. The scripts ran on the Raspberry Pi, which had the Arduino and the lidar connected and separately powered from a USB power bank. I started out simply taking a single measurement on command, but eventually moved to a script that automatically collected 40 measurements in a row and saved them to a CSV file to allow statistical analysis; a sketch of that logger follows below. Then we began field testing.
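The batch logger looked roughly like this, reusing the hypothetical read_single_measurement() and read_turbidity_voltage() helpers from the earlier sketches:

    # Take 40 paired lidar/turbidity readings and append them to a CSV file
    # for later statistics. Column names and the pause length are assumptions.
    import csv
    import time
    from datetime import datetime

    N_SAMPLES = 40      # measurements per logging session

    def log_session(path: str = "field_log.csv", n: int = N_SAMPLES) -> None:
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            if f.tell() == 0:   # new file: write a header row
                writer.writerow(["timestamp", "distance_m", "signal_quality", "voltage"])
            for _ in range(n):
                dist, quality = read_single_measurement()
                volts = read_turbidity_voltage()
                writer.writerow([datetime.now().isoformat(), dist, quality, volts])
                time.sleep(0.5)                 # assumed pause between samples

    if __name__ == "__main__":
        log_session()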
To support field testing, we collaborated with my other employer, Seaduce LLC, where we had access to a scrapyard. I welded together a steel frame with a sliding carriage. The idea was to find an area with a transition between sand and silt, set the frame over the transition, and record the position of the sensor carriage relative to the underwater material as corresponding measurements were taken.
On our first field test, my uncle Joe Rudmin, a lab tech at JMU, joined Tania, Joe Betit, and my sister Noah Rudmin to support the mission. In the field we ran into several obstacles to our plan. The first problem was that we were unable to find a cleanly demarcated underwater transition between silt and sand. Instead, there were large areas of dark silty sand, some areas mixed with sandy silt, and other areas with rocks and dirty sand. The boundaries between these areas were not clear. The frame ended up only providing a base on which to set the sensor. We tested the sensor in a few different areas. The next issue was that the lidar simply did not seem to be working. In the lab, we had reliably gotten reasonable distance measurements, but now in the water it was often sending invalid data and occasionally sending valid but unreasonable data. I was confused and wondered if I had broken the sensor by accidentally getting it wet, but when we brought it back on land it worked as well as before. Upon further testing, we discovered that under shallow water it did occasionally return a valid and reasonable distance, which was encouraging. We ended up not getting any useful data that day, other than the understanding that the idea worked in theory, but in reality there were environmental restrictions that had to be identified before characterization experiments could begin.
Our next step was to attempt to recreate similar, but controlled, environments at the workshop. We obtained three large, opaque Sterilite bins. My sister and I went back to Harborton and obtained samples of sand, silt, oyster shells, and seagrass, which represented some of the surfaces we wished to identify. We also collected two bins' worth of seawater. Joe Betit obtained some “cleaned” play sand. We set up the three bins with: tap water and clean play sand, seawater and natural sand, and seawater with natural silt.
I began testing with the seawater-and-natural-sand bin. By placing the sensor frame over the bin, I was able to submerge the lidar sensor and point it at the material underneath. My first experiments varied the light level: I started out in shade, and the distance measurements were, as a rule, reasonable. When I moved the setup into full sunlight, many unreasonable and invalid measurements were recorded. Placing a dark shade directly over the bin resulted in even higher reliability. I also varied the angle: in full sunlight, placing the sensor at an angle resulted in completely invalid data, while shading the sensor resulted in more reliable measurements, even at an angle. I also recorded sets of 40 measurements, in the shade, in the other two bins, but the main takeaway was that the sensor required the absence of direct sunlight. This guided us to our next two field tests: collecting data on a cloudy day and at night. To support these missions, we attached the sensor unit to an old PVC frame hull prototype so that it could float at different heights above the seafloor. Our goal for these sessions was simply to determine the conditions under which valid data could be collected.

On the cloudy day test, the first results were discouraging. We floated the hull out alongside a dock over sandy seafloor, and we were continually getting invalid and nonsensical distance data. We tried moving the sensor into the shade of the dock to eliminate the direct sunlight that was seeping through the clouds, but that changed nothing. Our first conclusion was that the natural lighting conditions, even on cloudy days, were not suitable for the sensor setup. Before we left, however, we brought the sensor closer and closer to the shore. When it was almost touching the bottom, we finally began getting reasonable distance data. We measured the water depth with a measuring stick and compared it with the reported distance, and it was very accurate! This indicated that the problem was partly a laser-power issue.
On the night test, we did the same thing, but found that the lidar, while it was reporting valid data, was reporting the distance from the sensor to the edge of the case. The dark water was classified as opaque by the sensor at night. The takeaway from that test was that the lidar sensor’s lens should be directly immersed in the water. That could improve the depth range of the sensor in daylight as well.
At this point, the summer was running out, and the lidar sensor case required a major redesign before continued development. Therefore, I began focusing on characterizing the turbidity sensor.
As mentioned above, Secchi depth was the quantity most useful for characterizing the lidar signal, but turbidity is easier to measure electronically. Thus a turbidity-to-Secchi-depth conversion model was necessary. We also did not have a good way to quantify the turbidity sensor's output in NTU (nephelometric turbidity units) in our lab, so we resorted to developing a sensor-specific voltage-to-Secchi-depth conversion model. We were aware that the conversion between turbidity and Secchi depth is a local problem, and that models are developed locally, but we hoped that a controlled, easy-to-perform experiment with on-hand materials would suffice to give a rough idea of the Secchi depth from voltage. This hope did not play out. Still, it was a useful exercise in developing the kind of conversion model that should work in the real world.
First, I made a Secchi disk by double-printing it on paper, laminating the disk, and attaching it to a rod. At around 3 p.m., I filled a trash can with tap water. Then I performed the following procedure repeatedly: I added a handful of play sand with enough fine dust to increase the turbidity significantly. I lowered the Secchi disk into the water until it was no longer visible and made a mark, then lowered it further, raised it until it was visible again, and made another mark; the Secchi depth was recorded as the average of the two depths. Most pairs of marks were about a quarter inch apart. Next, I submerged the turbidity sensor and waited for the reading to stabilize. Since there was slight noise in the turbidity reading, I averaged 50 voltage measurements.
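The averaging steps amount to a few lines of Python; a small sketch, again assuming the hypothetical read_turbidity_voltage() helper, with illustrative depth values:

    def averaged_voltage(n: int = 50) -> float:
        """Average n voltage readings to smooth out sensor noise."""
        samples = [read_turbidity_voltage() for _ in range(n)]
        return sum(samples) / len(samples)

    # One calibration point: Secchi depth is the midpoint of the two marks (inches).
    disappear_in, reappear_in = 14.25, 14.0      # hypothetical hand measurements
    secchi_depth_in = (disappear_in + reappear_in) / 2
    print(secchi_depth_in, averaged_voltage())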
After taking 19 measurements in this way with slowly increasing turbidity, we plotted the Secchi depth vs. voltage distribution and used Excel's curve fit to establish a fourth-degree polynomial model. We wanted to let the Arduino report the estimated Secchi depth in its JSON output; however, we found that the numerical precision of the Arduino Nano was insufficient to evaluate the polynomial within the data range. To solve this, I used AI to program a binned linear interpolation model, which linearly interpolated between the averages of bins of data. This was possible to calculate directly on the Pi. While the model made by the AI fit the data well enough, it did not follow a repeatable process of binning, averaging, and interpolating that could be applied to any dataset, so I later rewrote the important parts of the script by hand so that it could be applied to any dataset. The second version of the script also automatically graphed the model and wrote out an Arduino function that implemented the model.
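The core of the rewritten binning script follows roughly this shape, assuming NumPy on the Pi; the bin count and variable names here are illustrative, not the script's own:

    # Binned linear interpolation: sort calibration points by voltage, average
    # them into equal-count bins, and interpolate between the bin means.
    import numpy as np

    def build_binned_model(voltages, secchi_depths, n_bins: int = 6):
        """Return (bin_voltage_means, bin_depth_means) as the model breakpoints."""
        order = np.argsort(voltages)
        v = np.asarray(voltages, dtype=float)[order]
        d = np.asarray(secchi_depths, dtype=float)[order]
        bins = np.array_split(np.arange(len(v)), n_bins)   # equal-count bins
        v_means = np.array([v[idx].mean() for idx in bins])
        d_means = np.array([d[idx].mean() for idx in bins])
        return v_means, d_means

    def estimate_secchi(voltage: float, v_means, d_means) -> float:
        """Linearly interpolate a Secchi depth estimate from a voltage reading."""
        return float(np.interp(voltage, v_means, d_means))

The breakpoint tables can then be emitted as constant arrays inside a generated Arduino function, since linear interpolation between them avoids the large intermediate powers that made the polynomial impractical on the Nano.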
Unfortunately, when I tested the model in the bay at Harborton to see whether it could predict the Secchi depth with any reasonable accuracy, it was way off. Both the fourth-degree polynomial and the linear interpolation model predicted a depth of about 7 inches, but the actual Secchi depth was around 2.6 ft. We believe this is because the models were created with sand dust in fresh water, while the water in the bay is brackish and its particles are of a different size and composition, and therefore block light differently. A model for Harborton will have to be developed more slowly over time, taking into account the environmental and seasonal conditions of the recorded data. Very likely, regional models will have to be developed for each water body in which we wish to use the sensor, although perhaps similar environments will be sufficiently characterized by a single model. That remains to be seen.
Future development of this set of sensors will need to include creating a case for the lidar that submerges its lens in the water. Then, after establishing the conditions under which the lidar can take useful data, the correlation between the underwater surface and the distance, signal quality, and water turbidity can be characterized. For the turbidity sensor, the next step is to integrate a temperature sensor, since turbidity readings vary with temperature. Secchi depth models can then be developed for Harborton by measuring the Secchi depth daily and recording the voltage and temperature. Additionally, the turbidity-temperature-voltage relationship can be characterized in NTU using a lab-grade turbidity meter. Other potential applications of the turbidity sensor include remote sensors that monitor the water quality of streams and tributaries and wirelessly transmit their data to a collection server. While further work is required to bring these sensors to practical utility, significant progress has been made and the problem space has been significantly narrowed over this summer internship.