Sea Machines – Yachting
https://www.yachtingmagazine.com

Optical-Based Collision-Avoidance Tech
https://www.yachtingmagazine.com/electronics/optical-based-collision-avoidance-tech/ | Wed, 18 Jun 2025
Optical-based collision-avoidance systems have evolved, gained widespread use and are improving safety at sea.

Optical-based collision-avoidance
Optical-based collision-avoidance tech is an offshoot of automotive-based, advanced driver-assistance systems. Julien Champolion – polaRYSE

Imagine ripping along at 25 to 30 knots in the dark, in a big seaway, singlehanded aboard a 60-foot offshore racing sailboat in the nonstop around-the-world Vendée Globe race. Land and help are hundreds of miles away. Sleep is one of your most valuable currencies, but commercial vessels, fishing boats and whales also transit these waters. Trusting the big-ocean theory while you get some shut-eye can be risky business.

Optical-based collision-avoidance systems are a solution to this problem. One example is Sea.AI (née Oscar), which was developed in 2018 to help keep these kinds of sailors safe. Flash-forward seven years, and this type of technology is protecting boaters of all stripes, with numerous brands on the market and companies competing to advance the systems in various ways.

Optical-based collision-avoidance tech is an offshoot of automotive-based advanced driver-assistance systems. This technology is quickly becoming an invaluable safety net, alongside radar and the automatic identification system, aboard well-equipped yachts. Elements of this technology are also critical for enabling assisted and autonomous docking and navigation systems. Contemporary systems alert captains to potential collision threats, with AI’s evolutionary curve suggesting more to come. Much like a car’s ADAS, this tech could soon also be standard kit aboard boats.

Most optical-based collision-avoidance systems have one or more cameras, an AI-enabled black-box processor and a display. Systems can include a daylight camera with low-light capabilities or a thermal-imaging camera, or both. The processor typically contains a library of annotated images that depict, for example, a vessel at sunset, a buoy in waves or a partially submerged container. The screen, which can be dedicated glass or a networked multifunction display, presents visual and audible alarms and real-time video imagery of any camera-captured targets.

Sea.AI camera
Sea.AI uses machine vision technology to prevent at-sea collisions. Marin Le Roux – polaRYSE

The camera’s video is fed through the processor, which applies AI computer vision and machine learning; in effect, the processor “sees” through the camera. The processor then compares the real-time feed with its imagery database (or, more precisely, applies what it has learned from that annotated imagery) to identify nonwater objects in the camera’s field of view: a sailboat in the fog, for example.

“Our database contains more than 20 million objects in different scenarios, like sea states, weather conditions, geographic locations,” says Christian Rankl, Sea.AI’s chief technical officer. “It’s key to have a database with a wide range of objects and scenarios to build a highly reliable collision-avoidance system.”

Once the system has identified an object, it tracks it and calculates the real-time distance and bearing to the object, as well as a safe course (depicted on the display) around it.
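To make that last step concrete, here is a minimal Python sketch of how a suggested course around a detection might be derived from its bearing and range. It is purely illustrative: the function name, the 100-meter clearance figure and the assumption of a stationary object are ours, not Sea.AI’s or any other vendor’s, and a real system would also weigh target motion and the collision regulations.

```python
import math

def avoidance_course(heading_deg, target_bearing_deg, target_range_m,
                     clearance_m=100.0):
    """Suggest a heading that clears a detected, stationary object.

    heading_deg        own course (degrees true)
    target_bearing_deg absolute bearing to the object (degrees true)
    target_range_m     estimated range to the object (meters)
    clearance_m        desired lateral miss distance (meters), assumed value
    """
    # Relative bearing in (-180, 180]: negative means the object is to port.
    rel = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0

    # Lateral offset of the object from the current track.
    lateral_m = target_range_m * math.sin(math.radians(rel))
    if abs(rel) >= 90.0 or abs(lateral_m) >= clearance_m:
        return heading_deg  # abaft the beam, or already passing clear

    # Turn away from the object just enough to open the required clearance.
    needed_deg = math.degrees(math.asin(min(clearance_m / target_range_m, 1.0)))
    turn_deg = needed_deg - abs(rel)
    direction = -1.0 if rel >= 0 else 1.0  # object to starboard -> alter to port, and vice versa
    return (heading_deg + direction * turn_deg) % 360.0

# Example: object about 40 m off track, 400 m away, fine on the starboard bow.
print(round(avoidance_course(0.0, 5.7, 400.0), 1))  # -> 351.2
```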

The math isn’t trivial, says Sangwon Shin, vice president of recreational marine for Avikus, a subsidiary of HD Hyundai that specializes in autonomous navigation: “The hardest part about creating a collision-avoidance system is calculating the distance.” Factors include the boat’s pitch and roll, plus the marine environment’s diverse conditions. A boat’s distance from an object and its velocity also factor into calculating an avoidance path.

This all unfolds almost instantaneously with Avikus’ Neuboat Navi system. “It takes about 20 to 30 milliseconds,” Shin says about the time frame required to identify an object. The system, which uses an electro-optical camera and a lidar sensor to measure distance, recalculates this 10 times per second to ensure accuracy. “Sending the alarm to the boaters takes about 100 to 200 milliseconds,” Shin adds.
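One way to see why distance is the hard part is the classic single-camera geometry: if the camera’s height above the water is known, the range to an object sitting at the waterline follows from how far below the horizon it appears in the frame, and any uncorrected pitch or roll shifts that angle and skews the answer. The sketch below is a simplified, hypothetical illustration of that idea with made-up numbers; it is not how Avikus (which adds a lidar sensor) or any other vendor actually computes range.

```python
import math

def waterline_range_m(camera_height_m, target_row_px, frame_height_px,
                      vertical_fov_deg, camera_down_tilt_deg=0.0):
    """Rough monocular range to an object at the waterline.

    camera_down_tilt_deg is how far the optical axis points below horizontal
    at this instant: fixed mounting tilt plus the boat's current bow-down
    pitch. Left uncorrected, pitch and roll swing this angle, which is why
    distance is the hardest number to get right.
    """
    deg_per_px = vertical_fov_deg / frame_height_px
    # Depression of the target below true horizontal.
    depression_deg = ((target_row_px - frame_height_px / 2) * deg_per_px
                      + camera_down_tilt_deg)
    if depression_deg <= 0:
        return float("inf")  # on or above the horizon: not resolvable this way
    return camera_height_m / math.tan(math.radians(depression_deg))

# Camera 8 m above the water, 2,160-row frame, 60-degree vertical field of view,
# optical axis 1.5 degrees below horizontal at this instant: roughly 196 m.
print(round(waterline_range_m(8.0, 1110, 2160, 60.0, camera_down_tilt_deg=1.5)))
```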

Sea Machines’ AI-ris system
Sea Machines’ AI-ris system uses a camera to detect, track, identify and geolocate marine targets. Courtesy Sea Machines

Other systems offer similarly lightning-fast processing. Phil Bourque, Sea Machines’ vice president of global sales, says his company’s AI-ris system has a latency of less than 0.25 seconds at full 4K resolution. “So, it does a lot of thinking very quickly.”

But speed is only one necessary component of these systems. They also have to minimize false alarms. Rankl says Sea.AI continuously refines its AI model by analyzing scenarios where it performed poorly. “It’s crucial for the AI to accurately distinguish real threats from benign objects.”

Beyond hardware, software and AI models, the sensor payload is another area where these systems are evolving.

“While optical and thermal sensors are highly effective in detecting various floating objects, they, like all sensors, have limitations,” Rankl says, noting that these limitations could be addressed by integrating radar, AIS, lidar and sonar. “Our research department is actively evaluating the value these sensors can provide to our customers and how they can further enhance their safety at sea.”

Bourque agrees, noting that Sea Machines is working to integrate AIS and radar into AI-ris. “We certainly see the demand for the fusion of computer vision, radar and AIS,” he says.

Another important integration involves displayed cartography and data overlays. Anyone who cruises with radar and AIS is familiar with how multifunction displays can overlay AIS targets and radar data atop vector cartography. To that end, Sea.AI recently partnered with TimeZero to display targets detected by Sea.AI’s Sentry system atop TimeZero’s TZ Professional navigation software. “We are actively working toward integrating our machine vision with other platforms as well,” Rankl says.
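Overlaying an optically detected target on a chart amounts to turning the system’s bearing-and-range estimate, together with the boat’s own GPS position, into a latitude and longitude that the plotter can draw. The sketch below uses the standard great-circle destination-point formula; it illustrates the geometry only and says nothing about how Sea.AI, TimeZero or anyone else implements the integration.

```python
import math

EARTH_RADIUS_M = 6371000.0

def geolocate(own_lat, own_lon, bearing_deg, range_m):
    """Project a detected target onto chart coordinates (lat/lon in degrees)."""
    lat1 = math.radians(own_lat)
    lon1 = math.radians(own_lon)
    brg = math.radians(bearing_deg)
    d = range_m / EARTH_RADIUS_M  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# A target 800 m away bearing 045 degrees true from a boat off St. Augustine.
print(geolocate(29.905, -81.280, 45.0, 800.0))
```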

Sea.AI isn’t alone in this thinking. Avikus’ Neuboat Navi presents camera-detected targets in its real-time head-up display, and Sea Machines’ SM300 autonomous command and control system displays camera-detected targets atop cartography.

The trick, of course, will be getting optically detected targets onto mainstream multifunction displays, but multiple sources say this is already in the works.

Optical-based collision-avoidance
Optical-based collision-avoidance systems are typically trained to identify all nonwater objects. Yann Riou – polaRYSE/Oscar

Accurately assessing the future of optical-based collision-avoidance systems is a tougher ask.

Bourque says the next five years should see these systems mature and progress—much like the ADAS performance curve. He also says today’s refit customers will want this technology to come factory-installed aboard their next yachts, necessitating that designers and builders allocate physical space for these systems.

In addition, Rankl says, optical-based collision-avoidance technology will become a standard feature on boats, akin to radar and AIS. He sees low-Earth-orbit satellites such as Starlink playing a big role with their fast, global connectivity.

“This will enable the development of large vision models specialized for maritime use,” he says. Rankl also predicts that the rise of AI spatial intelligence, which allows AI models to understand and interact with geographic information, will let collision-avoidance systems better predict the movements of detected targets based on their positions and trajectories.

“Over the next five to 10 years, we expect multimodal systems that integrate data from all available boat sensors—cameras, radars, AIS, etc.—into a unified AI acting as a 24/7 co-skipper,” Rankl says.

Shin agrees but is more bullish about the time frame, which he puts at three to five years. “This technology will be developed in a way that combines multiple sensors and provides more accurate information,” he says. In five to 10 years, he adds, a single piece of hardware will provide “all the necessary data for collision avoidance.” As far as autonomous docking and navigation, Shin says: “We do not aim only to give situational awareness and provide suggested collision-avoidance routing. Our ultimate goal is to provide [an] autonomous system for boats, which is only possible with accurate distance calculation.”

Sea Machines is also integrating its optical-based collision-avoidance system with autopilot and engine controls to enable autonomous decision-making. Sea.AI is exploring options and applications for its technology.

As with all technologies, optical-based collision-avoidance systems aren’t without their high and low tides. On the positive side, these stand-alone systems add significant safety margins and don’t rely on signals transmitted from other vessels. Conversely, all technologies add cost and complexity, and false alarms can trigger unnecessary stress.

While today’s optical-based collision-avoidance systems offer a sea-change advancement over trusting the big-ocean theory, it will be fascinating to see what directions the technology takes next. Either way, there’s no question that a technology that began as specialized equipment for racing sailors is already having a massive impact on the wider boating world.

Evading Other Emergencies

In addition to spotting potential collision targets, optical-based detection systems can be used to locate and track a crewmember who has fallen overboard. Since these systems don’t rely on incoming AIS signals or radar returns, they can be key for detecting, identifying and tracking possible piracy threats.

Nautical Nightmare

A crewmember overboard is one of every captain’s worst fears, but the same camera systems that can help avoid collisions can be used to locate crewmembers in distress.

Introducing AI-ris from Sea Machines
https://www.yachtingmagazine.com/electronics/sea-machines-ai-ris/ | Wed, 27 Nov 2024
This technology leverages the learning power of artificial intelligence to enhance situational awareness for boaters.

Sea Machines AI-ris
AI-ris uses computer vision, a custom machine-learning model and fast processors to provide collision-avoidance alerts. unsplash/redcharlie

Some lessons must be learned the hard way. Take the Atlantis, an 80-foot express cruiser that was motoring 3 miles off the coast of St. Augustine, Florida, during a Memorial Day weekend voyage this past May. The yacht was reportedly operating under mostly clear skies when it struck an object, likely a large metal marker denoting a submerged dredge pipe. At 11:37 a.m., the US Coast Guard in Jacksonville received an emergency notification via VHF channel 16 that Atlantis was sinking. The Coast Guard dispatched a boat and contacted the St. Johns County Fire Rescue Division, which rescued two mariners.

While it’s unclear how Atlantis’ crew failed to spot the marker, better watchkeeping and collision-avoidance technology likely could have prevented this accident.

Watchkeeping often involves long periods of monotony punctuated by occasional moments of stress. AI-enabled technologies can ease this burden. Sea Machines’ AI-ris (pronounced “eye-ris”) Computer Vision Sensor alerts recreational mariners about dangerous targets. Although it doesn’t autonomously evade danger, the system processes targets in milliseconds, supports fast cruising speeds and enhances situational awareness.

AI-ris ($27,900) is an optical-based system that uses AI technology called computer vision to detect, classify, geolocate and track multiple targets. The system accomplishes this via a custom machine-learning model that Sea Machines trained on millions of images. “AI-ris is designed to enhance situational awareness for all vessels under power that are 33 feet or longer,” says James Miller, Sea Machines’ AI-ris product manager.

Sea Machines AI-ris
The daylight camera must be mounted at least 25 feet above the waterline to deliver its full range of 5 nautical miles. Courtesy Sea Machines

The system has a forward-looking daylight camera and a rugged black-box processor. It also includes a dedicated touchscreen user interface, though boaters can substitute a compatible Furuno or Raymarine multifunction display or a generic touchscreen display. AI-ris requires NMEA 2000 connectivity to access the vessel’s GPS/GNSS sensor, and it can be spec’d with a Sea Machines thermal-imaging camera.

The daylight camera must be mounted at least 25 feet above the waterline to deliver its full range of 5 nautical miles. AI-ris can simultaneously classify and track 50 targets. In terms of target size, AI-ris can detect a 13-foot object at 0.25 nautical miles, a 49- to 59-foot object at 1 nautical mile, and a 246- to 295-foot object at 5 nautical miles. AI-ris reportedly achieves 99 to 100 percent accuracy when it can place 20 pixels on a target.

Given that this target classification wizardry resides in a machine-learning model, the optical-based system requires imagery. The daylight camera captures 30 frames per second, has a 90-degree field of view (horizontal and vertical), and uses a low-light mode to capture moon- and starlight. The 10-megapixel sensor yields 4K onscreen imagery that’s shared with the processor via an Ethernet cable.
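Those detection ranges line up neatly with the 20-pixel accuracy threshold and the camera specs. Assuming roughly 3,840 horizontal pixels spread across the 90-degree field of view (an assumption about the sensor layout, not a published figure), a back-of-the-envelope check puts each of the quoted object sizes at roughly 20 pixels at its quoted range:

```python
import math

def pixels_on_target(object_size_m, range_m, fov_deg=90.0, frame_px=3840):
    """Approximate pixel width of a target of a given size at a given range."""
    angular_size_deg = math.degrees(2 * math.atan(object_size_m / (2 * range_m)))
    return angular_size_deg * (frame_px / fov_deg)

NM = 1852.0  # meters per nautical mile
print(round(pixels_on_target(4.0, 0.25 * NM)))   # ~13-ft object at 0.25 nm -> ~21 px
print(round(pixels_on_target(16.5, 1.0 * NM)))   # ~54-ft object at 1 nm   -> ~22 px
print(round(pixels_on_target(82.0, 5.0 * NM)))   # ~269-ft object at 5 nm  -> ~22 px
```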

Processed imagery is then streamed onto the user’s screen. Users can take screen grabs and capture video, a feature that Miller says can be useful in documenting incidents.

“The custom machine-learning model was trained on over 25 million images of vessels and objects [taken] from a variety of vessels operating globally in different sea and lighting conditions,” Miller says.

These images have yielded more than 35 million examples of marine targets—but the system doesn’t work like a search engine. “Rather than looking up a vessel or object within a database, the computer-vision model recognizes important objects in view by its understanding of how these objects appear and behave,” he says.

AI-ris does this very quickly. Miller says it will detect, classify and track multiple targets in less than 250 milliseconds. Depending on environmental factors, the number of targets and the distance to the object, he adds, “This can occur in significantly less time.”

Sea Machines AI-ris
AI-ris employs a ruggedized black-box PC networked to a camera, the N2K network and a multifunction display. Courtesy Sea Machines

On the user-interface side, AI-ris creates a 2D augmented-reality display on its networked screen. Targets are graphically boxed, color-coded and placed into four classification buckets. Yellow indicates powerboats, blue denotes sailboats, and green represents marine mammals. White refers to miscellaneous objects, including aids to navigation, kayaks, swimmers and logs. Miller says Sea Machines is adding eight additional classification buckets soon.

Alternatively, AI-ris can display a radar-style target-range viewer that depicts the vessel in the center, with outward-extending range bands. Targets appear as color-coded triangles, providing classification along with visual range and bearing information.

Users can set guard zones (think radar) in both modes. Once a target is detected, classified and tracked within a guard zone, AI-ris provides visual and auditory warnings. These begin with a banner at the top of the screen; optional auditory alerts then sound twice, 20 seconds apart, and escalating auditory alarms repeat every 20 seconds until they are manually cleared.
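In outline, that escalation behaves like the simplified loop below. The 20-second interval and the two initial alerts come from the description above; the callback names, and the choice to stop alarming once the target leaves the guard zone, are our own assumptions rather than details of Sea Machines’ implementation.

```python
import time

def run_guard_zone_alerts(target_in_zone, alarm_cleared, show_banner, play_tone):
    """Simplified escalation loop for one guard-zone alert.

    target_in_zone() -> bool   is the tracked target still inside the zone?
    alarm_cleared()  -> bool   has the operator manually cleared the alarm?
    show_banner(msg), play_tone(kind) are hypothetical UI callbacks.
    """
    show_banner("Target in guard zone")        # passive visual alert first
    tones_played = 0
    while target_in_zone() and not alarm_cleared():
        if tones_played < 2:
            play_tone("alert")                 # optional alerts: twice, 20 s apart
        else:
            play_tone("alarm")                 # then escalate every 20 s until cleared
        tones_played += 1
        time.sleep(20)
```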

“Customer feedback emphasized that the system shouldn’t become something that an operator has to constantly attend to; rather, [it’s] something that supports safe navigation,” Miller says. “For this reason, we have concentrated on a fine balance between passive and active notifications.”

While AI-ris has interesting capabilities, it has limitations like all technologies. For example, its 25-foot mounting-height requirement is a big ask for smaller yachts. The camera’s 90-degree field of view leaves a 270-degree blind spot unless the vessel also carries an automatic identification system receiver or radar. The system isn’t compatible with third-party cameras, and it can’t draw information from the vessel’s vector cartography to verify the position of, say, aids to navigation. As of this writing, the system also can’t autonomously command the autopilot to avoid collisions.

That said, AI-ris does provide unflagging situational awareness within its field of view. It supports vessel speeds up to 45 knots, and technology is on the way that will provide the same daylight-camera functionality on the company-supplied thermal-imaging cameras. Sea Machines is planning a release that will allow a yacht with AI-ris, a Sea Machines SM360 advanced autopilot system, and a Rolls-Royce power and bridge system to dodge dangerous targets autonomously.

For now, if the goal is to enjoy less-memorable Memorial Day weekends than the crew on Atlantis experienced, the Sea Machines AI-ris system provides a tireless eye on the horizon.

Lifelong Learners

AI-ris is based on a model that has already been trained on more than 25 million images, but more data equals increased safety. Sea Machines collects new imagery using a fleet of test boats and working with customers who share voyage data. The company then releases yearly updates that expand the model’s identification capabilities.

Sea Machines Tech Keeping Humans in the Loop
https://www.yachtingmagazine.com/electronics/sea-machines-human-loop/ | Thu, 26 Jan 2023
Sea Machines technology is working toward new levels of vessel automation and collision avoidance.

Sea Machines’ SM200
The SM200’s user interface communicates wirelessly with its networked onboard cabinet using dynamic frequency-hopping radio. Courtesy Sea Machines

In 1889, American journalist Nellie Bly circumnavigated the world in 72 days by train, steamship, rickshaw, horse and donkey, besting the timeline that author Jules Verne laid out in his fictional challenge.

In 2021, Nellie Bly—a modern tug built by Damen Shipyards in the Netherlands with a Sea Machines SM300 autonomous command-and-control system—completed a 1,000-nautical-mile circumnavigation of Denmark in 13 days, sailing from Hamburg, Germany.

The tug’s journey also had an element of reality overtaking the imagination, given that Nellie Bly’s two commanding officers were in Boston. And while the tug carried two professional mariners on board, the SM300 called the shots for more than 95 percent of the journey. This included 14 port calls and 30-plus self-directed collision-avoidance and traffic-separation course changes.

It seems that if humans can dream it, we also can do it.

Such is the thinking at Sea Machines, a company that Michael Gordon Johnson founded in Boston in 2017. He had spent years in a leadership role at a marine-salvage company, where he realized that the marine environment is an increasingly complex space but that human minds have limited attention spans and information bandwidth to handle it.

“You start to see the [accidents] that can be avoided with new types of technology,” he says, adding that the marine environment is “the most challenging domain on our planet, with the dynamic liquid environment and the forces it puts on vessels.”

The team at Sea Machines is working to address that challenge with autonomous and remotely commanded technologies for the maritime community that, if widely adopted, could help make boating safer for everyone. Johnson’s vision includes autonomous technologies with some level of human oversight—what he terms “human on the loop”—to reduce risk.

“The goal is shifting 99 percent of onboard perception and navigation effort to our system, making the overall voyage safer and easier for those on board,” Johnson says.

Sea Machines raised $1.4 million in seed money; to date, the venture-backed startup has raised more than $30 million. Investors include Toyota Ventures, Brunswick Corp. and military ship-building company Huntington Ingalls. Contracted clients include shipping giant A.P. Moller-Maersk and the US Department of Transportation.

Sea Machines’ SM200
Sea Machines’ SM200 gives users full control over their helm from up to 0.54 nautical miles away. Courtesy Sea Machines

Sea Machines’ first commercially available products debuted in 2019. The SM200 wireless remote-helm-control system is a ruggedized, portable setup that lets a person remotely control a vessel from a range of up to 0.54 nautical miles. It uses industrial-grade components, including a Siemens-based programmable logic controller (PLC) for steering, propulsion and payload (such as water pumps on a fireboat). The SM200 has analog and digital inputs and outputs that let it network with a vessel’s onboard systems and instrumentation.

The SM200’s user interface communicates wirelessly with its networked onboard cabinet (a black-box computer) using dynamic frequency-hopping radio. Operators control the vessel via a wearable belt pack. This allows onboard operators to better position themselves for maneuvering in view-obstructed scenarios and lets remote operators handle vessels in environments that are hazardous to human health.

If the SM200 suffers a communications loss, the system enforces a “time-out leash” and stops the vessel.
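A “time-out leash” is, in essence, a communications watchdog, a standard pattern in remote-control systems. Here is a minimal sketch of the idea; the two-second timeout and the class and callback names are illustrative assumptions, not details of the SM200.

```python
import time

class TimeoutLeash:
    """Stop the vessel if valid remote-control frames stop arriving."""

    def __init__(self, stop_vessel, timeout_s=2.0):
        self._stop_vessel = stop_vessel    # callback that zeroes thrust / holds station
        self._timeout_s = timeout_s        # assumed value, not the SM200's
        self._last_heard = time.monotonic()
        self._tripped = False

    def frame_received(self):
        """Call whenever a valid command frame arrives over the radio link."""
        self._last_heard = time.monotonic()
        self._tripped = False

    def poll(self):
        """Call periodically from the control loop."""
        if not self._tripped and time.monotonic() - self._last_heard > self._timeout_s:
            self._tripped = True
            self._stop_vessel()
```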

The SM300 autonomous command-and-control system used aboard Nellie Bly in 2021 is even more advanced than its predecessor. It has sensors, computers and satellite communications to give remote operators full control over a vessel’s engines, steering, transmission and auxiliary systems. As seen aboard Nellie Bly, it can also call most of its own shots.

The system accomplishes both these things via an onboard, self-contained cabinet and a ruggedized Windows-based laptop that serves as a remote user interface.

“It’s called the 300 because there are three main processors,” Johnson says. “There’s the low-level control, which is PLC-based; there’s the autonomy computer, which makes decisions on path planning and collision avoidance; and the third processor is the user interface.”

The SM300 can interface with a variety of third-party vessel sensors. These include AIS, depth transducers, multi-antenna GPS/GNSS, and compatible ARPA-enabled radars. “We have a number of protections in our system to keep the vessel and the people on board safe, from collision avoidance to low-depth [scenarios],” Johnson says.

While Sea Machines marketed and sold the SM200 and SM300 directly, the Boston-based technology firm plans to sell its second-generation products through third-party OEMs, including engine manufacturers. It is already working with companies such as Rolls-Royce and HamiltonJet.

The first of these second-generation products—the SM360—is a Linux-based system that Johnson describes as an ultra-smart autopilot capable of autonomously managing a vessel’s steering, propulsion and collision avoidance. HamiltonJet launched this system as JetSense.

“It’s pilot assist,” Johnson says, comparing the marine-focused technology to driver-assist automotive features. “It’s controlling steering, propulsion—perceiving the domain and doing collision avoidance as necessary.”

The SM360 also includes Sea Machines’ computer vision, dubbed AI-ris (for artificial intelligence recognition and identification system). AI-ris is an image-based form of AI that detects, tracks, geolocates and classifies objects it “sees” into 10 different types (think buoys, ships or whales). It incorporates a daylight camera and compares the real-time image feed with a database of stored images. The system then uses this information—plus data derived from AIS, electronic charts, radar, GNSS and other sensors—to make navigation and collision-avoidance determinations.

Should the system detect something unexpected, or if a situation exceeds certain thresholds, it notifies its human operator.
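In outline, that hand-off to the human on the loop is a set of threshold checks. The sketch below is hypothetical, with made-up thresholds, labels and data shapes rather than anything drawn from Sea Machines’ software:

```python
def review_detection(detection, cpa_m, min_cpa_m=500.0, min_confidence=0.6):
    """Decide whether the autonomy layer proceeds or defers to the operator.

    detection: dict with 'label' and 'confidence' keys (hypothetical shape)
    cpa_m:     predicted closest point of approach to the detection, in meters
    """
    if detection["confidence"] < min_confidence:
        return "notify_operator: low-confidence detection"
    if detection["label"] == "unknown":
        return "notify_operator: unclassified object in the vessel's path"
    if cpa_m < min_cpa_m:
        return "replan_and_notify: collision-avoidance maneuver required"
    return "continue: no action needed"

print(review_detection({"label": "buoy", "confidence": 0.92}, cpa_m=350.0))
```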

These are nascent days for Sea Machines, but the future appears bright. Provided that these systems have steady communications with their human on the loop, and willing adopters, the technology could help flatten collision curves.

Weather Beater

In addition to autonomous operations, Sea Machines’ Linux-based SM360 includes an API (application programming interface). It lets third-party developers create weather-routing software, which the SM360 can then use for route planning. This technology could open the door to greener passages and less pain at the pump for mariners.
