Optical-Based Collision-Avoidance Tech

Optical-based collision-avoidance systems have evolved, gained widespread use and are improving safety at sea.
Optical-based collision-avoidance
Optical-based collision-avoidance tech is an offshoot of automotive-based, advanced driver-assistance systems. Julien Champolion – polaRYSE

Imagine ripping along at 25 to 30 knots in the dark, in a big seaway, singlehanded aboard a 60-foot offshore racing sailboat in the nonstop around-the-world Vendée Globe race. Land and help are hundreds of miles away. Sleep is one of your most valuable currencies, but commercial vessels, fishing boats and whales also transit these waters. Trusting the big-ocean theory while you get some shut-eye can be risky business.

Optical-based collision-avoidance systems are a solution to this problem. One example is Sea.AI (née Oscar), which was developed in 2018 to help keep these kinds of sailors safe. Flash-forward seven years, and this type of technology is protecting boaters of all stripes, with numerous brands on the market and companies competing to advance the systems in various ways.

Optical-based collision-avoidance tech is an offshoot of automotive-based, advanced driver-assistance systems. This technology is quickly becoming an invaluable safety net, alongside radar and the automatic identification system, aboard well-equipped yachts. Elements of this technology are also critical for enabling assisted and autonomous docking and navigation systems. Contemporary systems alert captains of potential collision threats, with AI’s evolutionary curve suggesting more to come. Much like a car’s ADAS, this tech could soon also be standard kit aboard boats.

Most optical-based collision-avoidance systems have one or more cameras, an AI-enabled black-box processor and a display. Systems can include a daylight camera with low-light capabilities or a thermal-imaging camera, or both. The processor typically contains a library of annotated images that depict, for example, a vessel at sunset, a buoy in waves or a partially submerged container. The screen, which can be dedicated glass or a networked multifunction display, presents visual and audible alarms and real-time video imagery of any camera-captured targets.

Sea.AI camera
Sea.AI uses machine vision technology to prevent at-sea collisions. Marin Le Roux – polaRYSE

The camera’s video is fed through the processor, which uses AI computer vision and machine learning to essentially “see” through the camera. Drawing on its annotated imagery database, the processor either compares the real-time video feed with stored images or applies what it has learned from them to identify nonwater objects in the camera’s field of view: a sailboat in the fog, for example.
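One way to picture that matching step is a nearest-neighbor lookup against annotated examples. This is only a toy sketch: the labels, the three-number “feature vectors” and the threshold below are invented stand-ins, not any vendor’s model, which would be a trained system over millions of images.

```python
import math

# Hypothetical feature vectors standing in for a real system's
# annotated imagery database of millions of images.
DATABASE = {
    "sailboat": [0.9, 0.1, 0.3],
    "buoy": [0.2, 0.8, 0.1],
    "container": [0.4, 0.3, 0.9],
}

def classify(features, threshold=0.5):
    """Label a detected object by its nearest annotated example, or 'unknown'."""
    best_label, best_dist = None, float("inf")
    for label, ref in DATABASE.items():
        dist = math.dist(features, ref)  # Euclidean distance in feature space
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else "unknown"
```

A feature vector close to the “sailboat” entry comes back labeled sailboat; one far from every entry comes back unknown, which is the case a real system must handle carefully to keep false alarms down.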

“Our database contains more than 20 million objects in different scenarios, like sea states, weather conditions, geographic locations,” says Christian Rankl, Sea.AI’s chief technical officer. “It’s key to have a database with a wide range of objects and scenarios to build a highly reliable collision-avoidance system.”

Once the system has identified an object, it tracks it and calculates the real-time distance and bearing to the object, as well as a safe course (depicted on the display) around it.
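The range-and-bearing half of that calculation can be sketched in a few lines. This is an illustrative flat-Earth approximation, not any vendor’s implementation, and it assumes the tracker has already geolocated the target.

```python
import math

def range_and_bearing(own_lat, own_lon, tgt_lat, tgt_lon):
    """Approximate range (nautical miles) and true bearing (degrees) to a target.

    A flat-Earth approximation is reasonable at camera-detection ranges of a
    few miles: one degree of latitude is about 60 nm, and a degree of
    longitude shrinks with the cosine of the latitude.
    """
    d_north = (tgt_lat - own_lat) * 60.0
    d_east = (tgt_lon - own_lon) * 60.0 * math.cos(math.radians(own_lat))
    rng = math.hypot(d_north, d_east)
    brg = math.degrees(math.atan2(d_east, d_north)) % 360.0
    return rng, brg
```

A target one degree of longitude due east of a boat on the equator works out to about 60 nm at a bearing of 090, which is a quick sanity check on the arithmetic.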

The math isn’t trivial, says Sangwon Shin, vice president of recreational marine for Avikus, a subsidiary of HD Hyundai that specializes in autonomous navigation: “The hardest part about creating a collision-avoidance system is calculating the distance.” Factors include the boat’s pitch and roll, plus the marine environment’s diverse conditions. A boat’s distance from an object and its velocity also factor into calculating an avoidance path.
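A standard way to fold velocity into the threat picture is a closest-point-of-approach (CPA) calculation. The sketch below is illustrative only; the relative position and velocity it takes as inputs are exactly the hard-won distance estimates Shin describes.

```python
import math

def time_and_dist_at_cpa(rel_pos, rel_vel):
    """Time (hours) and separation (nm) at the closest point of approach.

    `rel_pos` is the target's position relative to own boat in nm (east, north);
    `rel_vel` is its relative velocity in knots. In a real system both would
    come from the tracker's distance and motion estimates.
    """
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # Minimize |p + v*t|; clamp to t >= 0 so a receding target's CPA is "now".
    t = 0.0 if v2 == 0.0 else max(0.0, -(px * vx + py * vy) / v2)
    return t, math.hypot(px + vx * t, py + vy * t)
```

A target 1 nm dead ahead with a 10-knot closing speed yields a CPA of zero separation in 0.1 hours, the kind of result that would trigger an avoidance route; a receding target’s CPA is its current position.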

This all unfurls almost instantaneously with Avikus’ Neuboat Navi system. “It takes about 20 to 30 milliseconds,” Shin says about the time frame required to identify an object. The system, which uses an electro-optical camera and a lidar sensor to measure distance, recalculates this 10 times per second to ensure accuracy. “Sending the alarm to the boaters takes about 100 to 200 milliseconds,” Shin adds.
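In outline, a fixed-cadence loop like the 10-times-per-second recalculation Shin describes might look like this. The `detect` and `alert` callables are hypothetical stand-ins, not Avikus APIs.

```python
import time

def run_cycles(detect, alert, cycles, period_s=0.1):
    """Poll a detector at a fixed cadence, raising an alert on any detection.

    `detect` and `alert` are stand-ins for a real system's camera-inference
    call and alarm pipeline.
    """
    next_tick = time.monotonic()
    for _ in range(cycles):
        obj = detect()          # e.g. 20-30 ms of inference in a real system
        if obj is not None:
            alert(obj)          # the alarm path adds its own latency
        next_tick += period_s   # schedule against absolute ticks to avoid drift
        sleep = next_tick - time.monotonic()
        if sleep > 0:
            time.sleep(sleep)
```

Scheduling against absolute tick times, rather than sleeping a fixed interval after each pass, keeps the cadence steady even when individual detection passes take varying amounts of time.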

Sea Machines’ AI-ris system
Sea Machines’ AI-ris system uses a camera to detect, track, identify and geolocate marine targets. Courtesy Sea Machines

Other systems offer similarly lightning-fast processing. Phil Bourque, Sea Machines’ vice president of global sales, says his company’s AI-ris system has latency of less than 0.25 seconds at full 4K resolution. “So, it does a lot of thinking very quickly.”

But speed is only one necessary component of these systems. They also have to minimize false alarms. Rankl says Sea.AI continuously refines its AI model by analyzing scenarios where it performed poorly. “It’s crucial for the AI to accurately distinguish real threats from benign objects.”

Beyond hardware, software and AI models, the sensor payload is another area where evolution is occurring.

“While optical and thermal sensors are highly effective in detecting various floating objects, they, like all sensors, have limitations,” Rankl says, noting that these limitations could be addressed by integrating radar, AIS, lidar and sonar. “Our research department is actively evaluating the value these sensors can provide to our customers and how they can further enhance their safety at sea.”

Bourque agrees, noting that Sea Machines is working to integrate AIS and radar into AI-ris. “We certainly see the demand for the fusion of computer vision, radar and AIS,” he says.

Another important integration involves displayed cartography and data overlays. Anyone who cruises with radar and AIS is familiar with how multifunction displays can overlay AIS targets and radar data atop vector cartography. To that end, Sea.AI recently partnered with TimeZero to display targets detected by Sea.AI’s Sentry system atop TimeZero’s TZ Professional navigation software. “We are actively working toward integrating our machine vision with other platforms as well,” Rankl says.

Sea.AI isn’t alone in this thinking. Avikus’ Neuboat Navi presents camera-detected targets in its real-time head-up display, and Sea Machines’ SM300 autonomous command and control system displays camera-detected targets atop cartography.

The trick, of course, will be getting optically detected targets onto mainstream multifunction displays, but multiple sources say this is already in the works.

Optical-based collision-avoidance
Optical-based collision-avoidance systems are typically trained to identify all nonwater objects. Yann Riou – polaRYSE/Oscar

Accurately assessing the future of optical-based collision-avoidance systems is a tougher ask.

Bourque says the next five years should see these systems mature and progress—much like the ADAS performance curve. He also says today’s refit customers will want this technology to come factory-installed aboard their next yachts, necessitating that designers and builders allocate physical space for these systems.

In addition, Rankl says, optical-based collision-avoidance technology will become a standard feature on boats, akin to radar and AIS. He sees low-Earth-orbit satellites such as Starlink playing a big role with their fast, global connectivity.

“This will enable the development of large vision models specialized for maritime use,” he says. Rankl also predicts that the rise of AI spatial intelligence, which allows AI models to understand and interact with geographic information, will let collision-avoidance systems better predict the movements of detected targets based on their positions and trajectories.

“Over the next five to 10 years, we expect multimodal systems that integrate data from all available boat sensors—cameras, radars, AIS, etc.—into a unified AI acting as a 24/7 co-skipper,” Rankl says.

Shin agrees but is more bullish about the time frame, which he puts at three to five years. “This technology will be developed in a way that combines multiple sensors and provides more accurate information,” he says. In five to 10 years, he adds, a single piece of hardware will provide “all the necessary data for collision avoidance.” As far as autonomous docking and navigation, Shin says: “We do not aim only to give situational awareness and provide suggested collision-avoidance routing. Our ultimate goal is to provide [an] autonomous system for boats, which is only possible with accurate distance calculation.”

Sea Machines is also integrating its optical-based collision-avoidance system with autopilot and engine controls to enable autonomous decision-making. Sea.AI is exploring options and applications for its technology.

As with all technologies, optical-based collision-avoidance systems aren’t without their high and low tides. On the positive side, these stand-alone systems add significant safety margins and don’t rely on signals transmitted from other vessels. Conversely, all technologies add cost and complexity, and false alarms can trigger unnecessary stress.

While today’s optical-based collision-avoidance systems offer a sea-change advancement over trusting the big-ocean theory, it will be fascinating to see what future directions the technology takes. Either way, there’s no question that what began as specialized equipment for racing sailors is already having a massive impact on the wider boating world.

Evading Other Emergencies

In addition to spotting potential collision targets, optical-based detection systems can be used to locate and track a crewmember who has fallen overboard. Since these systems don’t rely on incoming AIS signals or radar returns, they can be key for detecting, identifying and tracking possible piracy threats.

Nautical Nightmare

A crewmember overboard is one of every captain’s worst fears, but the same camera systems that can help avoid collisions can be used to locate crewmembers in distress.