Optical-Based Collision-Avoidance Tech
https://www.yachtingmagazine.com/electronics/optical-based-collision-avoidance-tech/ | Wed, 18 Jun 2025
Optical-based collision-avoidance systems have evolved and gained widespread use, and are improving safety at sea.

Optical-based collision-avoidance tech is an offshoot of automotive-based, advanced driver-assistance systems. Julien Champolion – polaRYSE

Imagine ripping along at 25 to 30 knots in the dark, in a big seaway, singlehanded aboard a 60-foot offshore racing sailboat in the nonstop around-the-world Vendée Globe race. Land and help are hundreds of miles away. Sleep is one of your most valuable currencies, but commercial vessels, fishing boats and whales also transit these waters. Trusting the big-ocean theory while you get some shut-eye can be risky business.

Optical-based collision-avoidance systems are a solution to this problem. One example is Sea.AI (née Oscar), which was developed in 2018 to help keep these kinds of sailors safe. Flash-forward seven years, and this type of technology is protecting boaters of all stripes, with numerous brands on the market and companies competing to advance the systems in various ways.

Optical-based collision-avoidance tech is an offshoot of automotive-based, advanced driver-assistance systems. This technology is quickly becoming an invaluable safety net, alongside radar and the automatic identification system, aboard well-equipped yachts. Elements of this technology are also critical for enabling assisted and autonomous docking and navigation systems. Contemporary systems alert captains of potential collision threats, with AI’s evolutionary curve suggesting more to come. Much like a car’s ADAS, this tech could soon also be standard kit aboard boats.

Most optical-based collision-avoidance systems have one or more cameras, an AI-enabled black-box processor and a display. Systems can include a daylight camera with low-light capabilities or a thermal-imaging camera, or both. The processor typically contains a library of annotated images that depict, for example, a vessel at sunset, a buoy in waves or a partially submerged container. The screen, which can be dedicated glass or a networked multifunction display, presents visual and audible alarms and real-time video imagery of any camera-captured targets.

Sea.AI uses machine vision technology to prevent at-sea collisions. Marin Le Roux – polaRYSE

The camera’s video is fed through the processor using AI computer vision and machine learning. It essentially lets the processor “see” through the camera. The processor then compares the camera’s real-time video feed with its imagery database, or it uses its knowledge of how to identify targets based on its annotated imagery database to identify nonwater objects in the camera’s field of view—a sailboat in the fog, for example.
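
To make that pipeline concrete, here is a minimal sketch of the frame-by-frame loop such a system might run. The detector, its label names and the confidence threshold are illustrative stand-ins, not Sea.AI's actual model or API.

```python
# A minimal sketch of the frame-by-frame loop described above. The detector,
# its label names and the confidence threshold are illustrative placeholders,
# not Sea.AI's actual model or API.
import cv2  # OpenCV, used here only to read frames from a camera stream

CLASSES_OF_INTEREST = {"sailboat", "motorboat", "buoy", "container", "kayak"}

def detect_nonwater_objects(frame, detector, min_confidence=0.5):
    """Run a trained model on one frame and keep likely hazards."""
    detections = detector(frame)  # hypothetical: yields objects with .label and .confidence
    return [d for d in detections
            if d.label in CLASSES_OF_INTEREST and d.confidence >= min_confidence]

def watch(camera_url, detector):
    cap = cv2.VideoCapture(camera_url)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        for target in detect_nonwater_objects(frame, detector):
            print(f"Possible target: {target.label} ({target.confidence:.0%})")
    cap.release()
```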

“Our database contains more than 20 million objects in different scenarios, like sea states, weather conditions, geographic locations,” says Christian Rankl, Sea.AI’s chief technical officer. “It’s key to have a database with a wide range of objects and scenarios to build a highly reliable collision-avoidance system.”

Once the system has identified an object, it tracks it and calculates the real-time distance and bearing to the object, as well as a safe course (depicted on the display) around it.
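
A stripped-down sketch of that step might look like the following. The flat-earth geometry and the fixed 20-degree avoidance margin are simplifying assumptions for illustration, not any vendor's routing logic or a substitute for the collision regulations.

```python
# Illustrative geometry only: range and bearing to an identified target, plus a
# crude offset course around it. The flat-earth math and the 20-degree margin
# are simplifications, not any vendor's routing logic (or the COLREGs).
import math

EARTH_RADIUS_M = 6_371_000

def range_and_bearing(own_lat, own_lon, tgt_lat, tgt_lon):
    """Approximate range (meters) and true bearing (degrees) at short distances."""
    north = math.radians(tgt_lat - own_lat) * EARTH_RADIUS_M
    east = (math.radians(tgt_lon - own_lon)
            * EARTH_RADIUS_M * math.cos(math.radians(own_lat)))
    return math.hypot(north, east), math.degrees(math.atan2(east, north)) % 360

def suggested_course(current_heading, bearing_to_target, margin_deg=20):
    """Nudge the heading away from the target by a fixed margin."""
    relative = (bearing_to_target - current_heading) % 360
    turn = -margin_deg if relative < 180 else margin_deg  # target to starboard: turn port
    return (current_heading + turn) % 360
```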

The math isn’t trivial, says Sangwon Shin, vice president of recreational marine for Avikus, a subsidiary of HD Hyundai that specializes in autonomous navigation: “The hardest part about creating a collision-avoidance system is calculating the distance.” Factors include the boat’s pitch and roll, plus the marine environment’s diverse conditions. A boat’s distance from an object and its velocity also factor into calculating an avoidance path.
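
One common camera-only approach to ranging, sketched below under a flat-water assumption, shows why pitch matters so much: distance follows from the camera's mounting height and the angle down from the horizon to the target's waterline, so even a single degree of uncorrected pitch skews the answer badly. The numbers are purely illustrative.

```python
# Why pitch matters: over roughly flat water, the distance to an object's
# waterline follows from camera height and the look-down angle from the horizon.
# Any uncorrected pitch shifts that angle and skews the estimate.
import math

def range_from_depression(camera_height_m, depression_deg, pitch_deg=0.0):
    corrected = math.radians(depression_deg - pitch_deg)  # subtract the bow-up pitch
    if corrected <= 0:
        return float("inf")  # at or above the horizon: effectively unresolvable
    return camera_height_m / math.tan(corrected)

print(round(range_from_depression(5.0, 3.0), 1))                 # ~95.4 m
print(round(range_from_depression(5.0, 3.0, pitch_deg=1.0), 1))  # ~143.2 m after 1 degree of pitch
```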

This all unfurls almost instantaneously with Avikus’ Neuboat Navi system. “It takes about 20 to 30 milliseconds,” Shin says about the time frame required to identify an object. The system, which uses an electro-optical camera and a lidar sensor to measure distance, recalculates this 10 times per second to ensure accuracy. “Sending the alarm to the boaters takes about 100 to 200 milliseconds,” Shin adds.

Sea Machines’ AI-ris system uses a camera to detect, track, identify and geolocate marine targets. Courtesy Sea Machines

Other systems also offer processing times that are lightning-fast. Phil Bourque, Sea Machines’ vice president of global sales, says his company’s AI-ris system has latency of less than 0.25 seconds at full 4K resolution. “So, it does a lot of thinking very quickly.”

But speed is only one necessary component of these systems. They also have to minimize false alarms. Rankl says Sea.AI continuously refines its AI model by analyzing scenarios where it performed poorly. “It’s crucial for the AI to accurately distinguish real threats from benign objects.”

Sensor payload is another area where evolution is occurring, beyond hardware, software and AI models.

“While optical and thermal sensors are highly effective in detecting various floating objects, they, like all sensors, have limitations,” Rankl says, noting that these limitations could be addressed by integrating radar, AIS, lidar and sonar. “Our research department is actively evaluating the value these sensors can provide to our customers and how they can further enhance their safety at sea.”

Bourque agrees, noting that Sea Machines is working to integrate AIS and radar into AI-ris. “We certainly see the demand for the fusion of computer vision, radar and AIS,” he says.

Another important integration involves displayed cartography and data overlays. Anyone who cruises with radar and AIS is familiar with how multifunction displays can overlay AIS targets and radar data atop vector cartography. To that end, Sea.AI recently partnered with TimeZero to display targets detected by Sea.AI’s Sentry system atop TimeZero’s TZ Professional navigation software. “We are actively working toward integrating our machine vision with other platforms as well,” Rankl says.

Sea.AI isn’t alone in this thinking. Avikus’ Neuboat Navi presents camera-detected targets in its real-time head-up display, and Sea Machines’ SM300 autonomous command and control system displays camera-detected targets atop cartography.

The trick, of course, will be getting optically detected targets onto mainstream multifunction displays, but multiple sources say this is already in the works.

Optical-based collision-avoidance systems are typically trained to identify all nonwater objects. Yann Riou – polaRYSE/Oscar

Accurately assessing the future of optical-based collision-avoidance systems is a tougher ask.

Bourque says the next five years should see these systems mature and progress—much like the ADAS performance curve. He also says today’s refit customers will want this technology to come factory-installed aboard their next yachts, necessitating that designers and builders allocate physical space for these systems.

In addition, Rankl says, optical-based collision-avoidance technology will become a standard feature on boats, akin to radar and AIS. He sees low-Earth-orbit satellites such as Starlink playing a big role with their fast, global connectivity.

“This will enable the development of large vision models specialized for maritime use,” he says. Rankl also predicts that the rise of AI spatial intelligence, which allows AI models to understand and interact with geographic information, will let collision-avoidance systems better predict the movements of detected targets based on their positions and trajectories.

“Over the next five to 10 years, we expect multimodal systems that integrate data from all available boat sensors—cameras, radars, AIS, etc.—into a unified AI acting as a 24/7 co-skipper,” Rankl says.

Shin agrees but is more bullish about the time frame, which he puts at three to five years. “This technology will be developed in a way that combines multiple sensors and provides more accurate information,” he says. In five to 10 years, he adds, a single piece of hardware will provide “all the necessary data for collision avoidance.” As far as autonomous docking and navigation, Shin says: “We do not aim only to give situational awareness and provide suggested collision-avoidance routing. Our ultimate goal is to provide [an] autonomous system for boats, which is only possible with accurate distance calculation.”

Sea Machines is also integrating its optical-based collision-avoidance system with autopilot and engine controls to enable autonomous decision-making. Sea.AI is exploring options and applications for its technology.

As with all technologies, optical-based collision-avoidance systems aren’t without their high and low tides. On the positive side, these stand-alone systems add significant safety margins and don’t rely on signals transmitted from other vessels. Conversely, all technologies add cost and complexity, and false alarms can trigger unnecessary stress.

While today’s optical-based collision-avoidance systems offer a sea-change advancement over trusting the big-ocean theory, it will be fascinating to see what future directions the technology takes. Either way, there’s no question that technology which began as specialized equipment for racing sailors is already having a massive impact on the wider boating world.

Evading Other Emergencies

In addition to spotting potential collision targets, optical-based detection systems can be used to locate and track a crewmember who has fallen overboard. Since these systems don’t rely on incoming AIS signals or radar returns, they can be key for detecting, identifying and tracking possible piracy threats.

Nautical Nightmare

A crewmember overboard is one of every captain’s worst fears, but the same camera systems that can help avoid collisions can be used to locate crewmembers in distress.

Future-Proofing Multifunction Displays
https://www.yachtingmagazine.com/electronics/multifunction-displays-planned-relevance/ | Thu, 11 Jul 2024
Modern multifunction displays are feature rich and can be long-lasting, creating consumer upsides that didn’t exist previously.

As displays have gotten bigger and better, their user interfaces have gotten smoother and more intuitive. Courtesy Raymarine

For years, I eagerly anticipated Apple’s fall event and news of the latest iPhone release. Back then, my purchasing latency was limited to locating the website’s “buy” button, as my incumbent phone was often struggling to keep pace with new apps and software updates. Then, starting around 2015 (the iPhone 6S), I was able to start squeezing extra years out of my phones. This trend accelerated, and as of today, I still rely on my iPhone 11 Pro from 2019. To be fair, I always buy the top-end model with maximum storage, but four and a half years on, I haven’t crashed (at least not hard) into this phone’s silicon ceiling.

Multifunction displays perform different tasks than smartphones, but most marine-electronics manufacturers build MFDs with off-the-shelf componentry and, sometimes, software from the mobile-device market. This sourcing gives manufacturers options for high-resolution touchscreen displays, processors, connectivity and operating-system architecture, and it means that today’s MFDs can have longer working lives.

How we got here, however, requires a small rewind. After all, MFDs circa 2010 were different animals than today’s big, powerful displays.

“Back then, most displays were 4 to 7 inches,” says Dave Dunn, Garmin’s senior director of marine and RV sales. “A big display was 9 to 10 inches, and a 12-inch display was enormous.”

These MFDs were controlled via tactile buttons and knobs, or early touchscreen or hybrid-touch interfaces. They only tackled marine-facing applications such as chart-plotting.

Today’s MFDs excel at traditional marine tasks, but they also boast bigger glass, full video integration, touchscreen interfaces, high-speed data networks, and four- or six-core processors, opening the door to expanded job descriptions.

“Processing power has indeed increased over time, bringing with it the ability to drive higher-resolution screens,” says Steve Thomas, Simrad’s product director for digital systems. “[This] also lends itself to better integration by providing the responsiveness consumers expect.”

It also enables MFDs to perform nontraditional tasks, including streaming video from daylight and thermal-imaging cameras, tackling onboard security, controlling digital switching and, sometimes, providing entertainment. Today’s flagship MFDs also sport larger high-resolution displays, multisignal connectivity (with ANT, Bluetooth, Ethernet and Wi-Fi), embedded sonar modules, GPS or GNSS receivers, data backbones, and NMEA 2000 and HTML5 compatibility.

“NMEA 2000 protocol provides the basis of communication and is the linchpin connecting everything together for the MFD to display and control,” says Eric Kunz, Furuno’s senior product manager. Kunz adds that HTML5 compatibility allows MFDs to display and control third-party equipment via web-browser windows, sans any heavy lifting from the MFD.

Technology moves in step changes, and MFDs, brand depending, have experienced two major evolutions since 2010.

“The first was the transition from a completely closed-software architecture to something open source,” says Jim McGowan, Raymarine’s Americas marketing manager, referring to the company’s shift from a walled-garden operating system to Linux and then Android.

Others, including Simrad and Furuno, took similar steps. Garmin remains a holdout.

“We use Android, but not for marine,” Dunn says. “Will we eventually go to Android? Maybe.”

The second evolution involved hardware, with all MFD manufacturers now using mobile-device componentry.

“Suddenly, the requirements for shock resistance, heat resistance, water resistance, bright visibility and fast processing became available on a wide scale,” McGowan says. “Instead of us having to source expensive industrial or semicustom hardware that was proven but old, suddenly our system architects had multiple options to choose from that were all state of the art.”

Sourcing components became easier, yielding better MFDs, but it placed a higher premium on software. Case in point: Raymarine has released more than 30 updates, including new features, for its 2017-era Axiom MFDs.

Likewise, there’s the importance of supporting hardware as it ages. “We don’t like to leave customers behind,” Dunn says, noting that Garmin supports products for five years after they’re discontinued.

This opens the door to the fine art of good enough. Given that modern MFDs are robust, the same display—like my iPhone—can last for years, provided that its sensor network remains static. While this works for buy-and-hold customers, new sensors can dangle carrots.

For example, Furuno and Garmin unveiled Doppler-enabled radars in 2016. While older MFDs could often display radar imagery from these sensors, some customers had to refit their displays to access the best features. One can imagine automation and AI presenting similar incentives.

“AI will combine multiple facets of different sensors to create a more sophisticated and enhanced navigation experience,” Kunz says. “Look for MFDs to take a larger and larger part in overall vessel control and automation.”

Avikus, for instance, is developing its NeuBoat autonomous navigation system with Raymarine. As for Garmin, Dunn says: “There’s nothing coming in the near future, but there’s some cool stuff coming with lidar and cameras.” He’s referring to the light-detection and ranging sensors that help enable automotive driver-assist features and autonomous driving.

Future hardware and capabilities aside, all experts agree on the importance of regularly updating a vessel’s MFD to keep the operating system current and to access the latest software features. While updates are free, all four companies have adopted subscription models for cartography.

“In some ways, the marine-electronics business model is changing in the same way it is happening in the consumer-electronics industry,” Kunz says. “This will most likely lead to more of a subscription-based model for certain aspects of the market.”

While subscription models make sense for a dynamic medium like cartography, it’s harder to envision this business practice extending throughout the sensor ecosystem.

“We don’t want to get to the point where people have to pay for software updates,” Dunn says, pointing to BMW’s belly-flopped attempt to charge customers fees to use their existing heated steering wheels.

New hardware, however, is a different story. “More than anything, we’re a sensor company,” McGowan says of Raymarine. “We keep offering new and improved sensors.”

Given the adoption rates of Doppler-enabled radar, there’s little question that the recreational marine market stands ready to embrace step-change sensors, so long as they come bundled with newfound capabilities—say, auto-docking or autonomous navigation.

As for my ancient iPhone, I’m again counting the days until Apple’s fall event. I just hope my next iPhone will last as long as today’s flagship MFDs.  

UI Options

Recent years have seen most manufacturers adopt touchscreen-only user interfaces for their flagship multifunction displays. This technology creates user-friendly interfaces in most conditions, but some users prefer tactile buttons when the weather sours. All manufacturers build optional external keypads or hard-button remote controls.

AI-Assisted Piloting Is Coming
https://www.yachtingmagazine.com/electronics/ai-assisted-piloting-is-coming/ | Fri, 24 May 2024
Avikus and Raymarine see artificial intelligence and sensor networks making boating easier and safer.

AI-assisted yachting is rapidly evolving, offering a future of computer-assisted docking and navigation. Eric Powell

In 2018, I watched my buddy Allan engage the Mad Max autopilot mode on his Tesla Model S, cuing the car to switch lanes aggressively on Interstate 95. While the experience was unnerving for a human passenger, the car leveraged cameras, sensors and artificial intelligence to maneuver safely.

Months later, I rode on a Boston Whaler 330 Outrage fitted with Mercury Marine’s Advanced Pilot Assist and Raymarine’s DockSense systems. As we approached the boat’s slip, the preproduction system used cameras, AI and the outboard engines to maintain a 3-foot safety buffer.

At the 2022 Fort Lauderdale International Boat Show, I saw these ideas meld in Avikus’ prototype NeuBoat autonomous operations system. The boat, with a human-in-the-loop operator, navigated itself out of its slip, up a river and around a lake before reversing course and docking itself.

Ready or not, autonomous technology is coming. This is likely good news for novice boaters—and for boaters who hate docking—because some of the marine industry’s smartest minds have been combining sensors and AI to smooth out boating’s rough corners. One example is NeuBoat (neuron plus boat), which Avikus is developing in partnership with Raymarine.

While experts say the sensors and software already exist to enable fully autonomous docking and navigation, Avikus and Raymarine foresee a road map to autonomy that earns trust with boaters while buying time for engine manufacturers to integrate the technology, and for agencies and organizations to create regulations.

“We’re intentionally paralleling the automotive market,” says Jamie Cox, Raymarine’s senior global product manager. “But I think we will beat automotive.”

Others agree. Sangwon Shin, Avikus’ director of strategic planning and business development, says: “In our view, the boating environment is less complicated than the car environment. So, we expect a little bit faster adoption rate.”

Avikus and Raymarine’s NeuBoat employs a sensor network that includes daylight cameras and light-detection-and-ranging instruments. Eric Powell

For boaters who are ready to start now, Avikus and Raymarine are releasing NeuBoat Dock this year. The assisted-docking system includes at least six self-calibrating, 360-degree cameras; a Raymarine multifunction display; an Avikus object-recognition unit; camera control boxes; and Avikus’ AI to provide bird’s-eye views and distance guides. (Garmin’s Surround View camera system provides similar capabilities.)

NeuBoat Dock is a level-one autonomous navigation system, which means it serves as a virtual assistant to human operators who remain in control. Level-two systems provide partial driving automation but still require a human operator. Level-three systems have conditional driving automation, requiring some human oversight, while level four has zero expectations of driver involvement. Level five is full driving automation.
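
Restated as a simple lookup for reference (a convenience for this article, not an industry-standard data structure), the ladder looks like this:

```python
# The autonomy ladder described above, restated as a simple lookup.
AUTONOMY_LEVELS = {
    1: "Virtual assistant; the human operator stays in full control (NeuBoat Dock)",
    2: "Partial driving automation; a human operator is still required",
    3: "Conditional driving automation; some human oversight remains",
    4: "Zero expectation of driver involvement",
    5: "Full driving automation",
}

def describe(level: int) -> str:
    return AUTONOMY_LEVELS.get(level, "Unknown level")
```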

Avikus, which is a spin-off of HD Hyundai, began developing NeuBoat in 2019. The resulting level-three-plus black-box prototype, which I got aboard in 2022, used the global navigation satellite system and vector cartography to establish position. The local device didn’t require internet connectivity. Instead, it employed daylight cameras and lidar (light detection and ranging) sensors to detect objects, measure distances, and scan and map berths. It also used Avikus’ AI to detect and classify nearby objects and vessels, assist with route planning, and suggest navigable courses.

This latter information was presented as screen views showing vector cartography with recommended courses, head-up displays and live camera views with augmented-reality data tags.

While impressive, the prototype didn’t use radar or the automatic identification system, so its range of object detection was limited to lidar’s 400-foot-range capacity. This range worked at our 6-knot speed, giving us 39 seconds of reaction time, but it wouldn’t work at 25 knots, only allowing for nine seconds.
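
The arithmetic behind those reaction times is straightforward; the sketch below simply converts boat speed to feet per second and divides it into the detection range.

```python
# The reaction-time arithmetic behind the 400-foot lidar limit noted above.
KNOTS_TO_FT_PER_S = 1.68781  # 1 knot is about 1.69 feet per second

def seconds_to_react(detection_range_ft, speed_knots):
    return detection_range_ft / (speed_knots * KNOTS_TO_FT_PER_S)

print(round(seconds_to_react(400, 6)))   # ~39 seconds at 6 knots
print(round(seconds_to_react(400, 25)))  # ~9 seconds at 25 knots
```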

Enter Raymarine, which integrated its own radar technology with Avikus’ AI. This combination extended NeuBoat’s detection range from 400 feet to 1.5 nautical miles. Shin says Avikus plans to integrate radar, sonar and infrared cameras within five years.


While extra range is important for recreational users, it’s critical for letting Avikus develop autonomous systems on large ships. “We use the same technology and the same algorithms for commercial and recreational, but the hardware specs are different,” Shin says.

In addition to radar expertise, Raymarine has amassed experience using computer vision from its DockSense and ClearCruise AR products. The latter places augmented-reality tags atop a video feed. Computer vision is a branch of AI that lets computers recognize, categorize and identify objects and people in digital images or video feeds; as such, it is critical to autonomous operations.

Looking ahead, Shin says, commercial ships and recreational vessels will first use autonomous navigation with human-in-the-loop operators, followed by autonomous operations. This isn’t a hypothetical; in 2022, Avikus’ commercial version of NeuBoat autonomously guided an LNG tanker across an ocean with human-in-the-loop oversight.

“The technology is there today,” Cox says. “We need to make sure that people are ready to use the technology responsibly and that regulations are there.”

When asked what milestones need to be met for autonomous operations aboard recreational yachts, Cox and Shin made clear they aren’t talking about distant horizons. “None are 10 years out,” Cox says, adding that by mid-2024, Avikus and Raymarine expect to have achieved sensor fusion, where the system can combine data from the vessel’s AIS, cameras, GNSS, lidar and radar. “In two years, on the control side, boats will be docking and driving themselves.”
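
As a rough illustration of what sensor fusion means in this context, consider matching camera detections to AIS targets by bearing so one vessel isn't reported twice, while keeping camera-only contacts in the traffic picture. The data shapes and the 3-degree tolerance below are assumptions, not the NeuBoat design.

```python
# A toy picture of sensor fusion: pair camera detections with AIS targets by
# bearing so one vessel isn't reported twice, and keep camera-only contacts
# (say, an unlit skiff with no transponder) as their own tracks. Data shapes
# and the 3-degree tolerance are assumptions, not the NeuBoat design.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Contact:
    source: str               # "ais", "camera" or "fused"
    bearing_deg: float
    range_m: Optional[float]  # AIS reports position; a camera estimate may be missing
    label: str

def fuse(ais, camera, bearing_tol=3.0):
    fused, unmatched = [], list(camera)
    for a in ais:
        match = next((c for c in unmatched
                      if abs((c.bearing_deg - a.bearing_deg + 180) % 360 - 180) <= bearing_tol),
                     None)
        if match:
            unmatched.remove(match)
            fused.append(Contact("fused", a.bearing_deg, a.range_m, match.label))
        else:
            fused.append(a)
    return fused + unmatched  # camera-only contacts stay in the traffic picture
```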

Shin agrees: “In five years, we’re expecting lots of the boating community to accept the possibility of autonomous navigation or partial assistance on their boat.”

Before this can happen, however, Cox and Shin point to two technical complexities: networking with autopilots and engines. As with radars, Raymarine has decades of experience manufacturing autopilots, so engine interfaces could prove to be the sticky wicket. “Engine manufacturers need to become more progressive,” Shin says. “They are the powerful guys.” Cox says the goal is to integrate NeuBoat with every major engine manufacturer.

Automotive-style bird’s-eye cameras are an important tool for assisted or autonomous docking systems. Eric Powell

Cox and Shin also point to a need for regulations to govern autonomous vessels. This is already happening; in 2022, the American Bureau of Shipping published a white paper that detailed 10 points—from maintaining propulsion to maintaining communications—intended to create a structure for autonomous-vessel design and operations. The US Coast Guard also published guidelines on testing remote- and autonomously controlled vessels.

Convincing experienced boaters that autonomous technology is the path forward could be a hard sell for some, but this is where Avikus and Raymarine plan to parallel the automotive world. Most contemporary cars have adaptive cruise control, making these types of assistance features feel familiar. Many boaters also own cars with an autopilot feature.

But driving to work is different than taking the boat out for a spin. Here, Cox says NeuBoat isn’t going to take away boating’s joys. Instead, the idea is to reduce stress. For example, Cox describes allowing the boat to navigate autonomously to the fishing grounds or home from a cruise.

Cox also says autopilots have served boaters for decades, and that autonomous navigation is an extension of this capability, combined with the ability to avoid collisions autonomously.

For newer boaters, autonomous technology is an easier proposition. “I’m a new boater, and I get nervous a lot,” Shin says. “We target new boaters. We want more people to enjoy boating.”

Then there is boating’s greatest equalizer. “People don’t like docking,” Cox says. “We’re never going to stop you from driving your boat, but it might be nice, if you’re coming into a dock and are getting stressed out, to switch it on.”

The wait won’t be long, either. While Avikus is paralleling the automotive sector, Cox and Shin expect NeuBoat technology to navigate and dock recreational vessels sooner than cars. “People will be surprised with how quickly we will get to market,” Cox says.

Having experienced Tesla’s Mad Max mode and Avikus’ level-three-plus sea trials, I can say that far less adrenaline is involved watching a demonstration boat dock itself than when I pawed for a nonexistent passenger-side brake pedal in my buddy Allan’s Tesla.

Better Optics

While NeuBoat Dock uses six 360-degree cameras, they only work for daytime operations. The obvious move is to add thermal-imaging cameras, and Raymarine’s parent company, Teledyne, owns FLIR. Thermal-imaging cameras would add cost, but Cox says these sophisticated optical sensors could be included aboard higher-end NeuBoat installations.

Hands Free: Avikus’s Autonomous Navigation System
https://www.yachtingmagazine.com/electronics/avikus-autonomous-navigation-system/ | Wed, 14 Jun 2023
Autonomous-navigation technology from Avikus could change the future of yachting.

During our demo ride, the Avikus autonomous-navigation system successfully negotiated a busy Florida waterway. Courtesy Avikus

It’s one thing to experience computer-assisted docking, but it’s different to ride aboard a vessel that’s autonomously negotiating the nautical road. I learned this during an on-water demo of NeuBoat technology from Avikus at the recent Fort Lauderdale International Boat Show. Our human pilot guided the demo boat out of its slip and into the Stranahan River. Minutes later, his hands left the helm, not to return. While it initially felt strange to place so much trust in silicon and sensors, trepidation morphed into amazement as we transited under the Southeast 17th Street bridge and into Lake Mabel. We passed hundreds of millions of dollars’ worth of gleaming fiberglass, aluminum and steel waterlines, but NeuBoat plotted a safe course through all of it and then back to, and into, the slip.

Like it or not, artificial intelligence is here, and it will only become a bigger, more integrated part of our world in the future. Hyundai Heavy Industries, the world’s largest shipbuilder and the parent company of Avikus, is already using autonomous capabilities to navigate ships across oceans, albeit with human oversight. While the NeuBoat isn’t the only autonomous-vessel technology afloat, it’s the only solution created by the commercial-marine sector with the parallel intention of innovating for the recreational-marine market. Avikus plans to offer two AI-assisted products, which can each be spec’d with two levels of operational capability.

The first is NeuBoat Navigation, which should be released in the second half of this year. It is designed to help human operators who are directly controlling their vessels make more situationally aware decisions. The basic option delivers navigational assistance through augmented reality to help skippers make smart choices, while the more advanced option provides navigation assistance along with camera- and sensor-collected informational assistance while docking.

The second product is NeuBoat Navigation and Docking Control, which Avikus plans to release in 2024. It’s a step up. Its basic option will include AI-controlled route planning, navigation and collision avoidance, with humans providing oversight. Its advanced option adds AI-controlled docking.

While NeuBoat’s operating system and AI are scalable, the two systems (and their options) have different hardware requirements. NeuBoat Navigation installations have a black-box AI recognition processor and NeuBoat’s graphical user interface (GUI), which can be displayed on a compatible multifunction display or an Android-based tablet or smartphone. For a sensor, there is a forward-looking daylight camera and a forward-looking light-detection-and-ranging (lidar) sensor. If applicable, NeuBoat can incorporate cameras for thermal imaging. Additional cameras and sensors can be added.

NeuBoat Navigation and Docking Control-equipped yachts will employ the same AI-recognition processor and GUI, but will also have an autonomous-control processor and an engine-interface module. Steering is via the vessel’s networked autopilot. The system will also use five to 10 daylight cameras and at least two lidar sensors for 360-degree situational awareness.

Both systems require either two networked global navigation satellite system receivers or a networked satellite compass to determine the vessel’s position, heading and rate-of-turn information. The cameras collect imagery that’s sent to the system’s AI-recognition processor, where it’s compared, in near real time, with a growing image database that includes at least 2.5 million images. The system then uses a form of AI called computer vision to sort objects into one of eight buckets: motorboats, vessels, sailing yachts, rowboats, channel markers, buoys, structures or “other.”
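
One way to picture that classification step is as a mapping from whatever raw labels a detector produces into the eight buckets; the raw label names below are invented for illustration, not Avikus’ actual taxonomy.

```python
# The eight target "buckets" above, sketched as a label-mapping step that might
# sit after a raw detector. The raw label names are invented for illustration.
NEUBOAT_BUCKETS = ("motorboat", "vessel", "sailing yacht", "rowboat",
                   "channel marker", "buoy", "structure", "other")

RAW_TO_BUCKET = {            # illustrative, not Avikus' actual taxonomy
    "speedboat": "motorboat",
    "cargo ship": "vessel",
    "sloop": "sailing yacht",
    "kayak": "rowboat",
    "daymark": "channel marker",
    "mooring ball": "buoy",
    "dock": "structure",
}

def bucket_for(raw_label):
    return RAW_TO_BUCKET.get(raw_label, "other")
```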

“We determine obstacles in the image input from the front [camera] in real time with AI based on the pre-learned image data,” says Lim Dohyeong, CEO of Avikus.

The video stream is also sent to a helm display, where it can be seen by human operators. For navigation, the system can apply augmented-reality-style information tags above camera-captured targets, advising on their range and target type.

As mentioned, both systems employ a lidar sensor or sensors to determine precise distances. NeuBoat Navigation and Docking Control, however, also uses lidar to create distance maps between the sensor and objects in its surrounding environment (for example, docks and pilings) during autonomous docking.

“It scans the surrounding environment in real time whenever it enters or leaves the port,” Dohyeong says. The system basically creates a map from scratch each time—even if the dock is one that the yacht’s owners often frequent—to account for dynamic variables.
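
Conceptually, that fresh map can be as simple as an occupancy grid built from each lidar return on every approach; the grid size, cell resolution and data format below are arbitrary choices for illustration, not Avikus’ implementation.

```python
# The "map from scratch on every approach" idea, reduced to dropping each lidar
# return into a coarse occupancy grid centered on the boat. Grid size and cell
# resolution are arbitrary; a real docking system does far more than this.
import math

def build_occupancy_grid(lidar_points, cell_m=0.5, half_extent_m=30.0):
    """lidar_points: iterable of (range_m, bearing_deg) returns from the sensor."""
    cells = int(2 * half_extent_m / cell_m)
    grid = [[0] * cells for _ in range(cells)]
    for rng, brg in lidar_points:
        x = rng * math.sin(math.radians(brg))  # meters to starboard of the sensor
        y = rng * math.cos(math.radians(brg))  # meters ahead of the sensor
        col = int((x + half_extent_m) / cell_m)
        row = int((y + half_extent_m) / cell_m)
        if 0 <= row < cells and 0 <= col < cells:
            grid[row][col] = 1                 # something solid: a piling, dock edge, hull
    return grid
```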

In addition to video feeds, part of NeuBoat’s GUI includes electronic cartography and a chartplotter-like page view. The information serves as a database that the system uses for auto-routing and autonomous navigation. (For my demo ride, the system used an official electronic navigation chart; however, Avikus plans to integrate with third-party vector-cartography products, including C-Map and Navionics.) This presentation also allows a human operator to quickly understand the yacht’s position relative to landmasses, channels, navigational marks and other vessels.

As with other self-learning, AI-based systems, the more time NeuBoat spends navigating and capturing video imagery, the better it should perform. NeuBoat will be supported by biannual software updates.

While NeuBoat products have yet to be released, Avikus is working with several marine-electronics manufacturers on projects that will allow NeuBoat to incorporate third-party equipment, including AIS and radar, for collision-avoidance work. As an example, Avikus has signed a memorandum of understanding with Raymarine to collaborate on integrating NeuBoat technology with Raymarine’s product portfolio and to explore the future of autonomous recreational vessels overall.

As of this writing, Avikus plans to let owners purchase NeuBoat Navigation as OEM equipment aboard a new build or in the aftermarket during a refit. Dohyeong says NeuBoat Navigation can even be added as a DIY project. Owners interested in NeuBoat Navigation and Docking Control systems will have to wait until they’re offered aboard new builds, but Dohyeong says aftermarket upgrades could become available for that system too, as its auto-calibration technology advances.

While adoption rates and boaters’ willingness to entrust potentially consequential operations to artificial intelligence remain an open-ended question, our test boat successfully navigated and self-docked. The gleaming nearby waterlines, I’m happy to report, remained unaffected by our autonomous passage.

Heavy Metal

In addition to NeuBoat, Avikus builds HiNAS 2.0 (that’s Hyundai intelligent Navigation Assistant System) for commercial mariners. In 2022, HiNAS 2.0 helped navigate a carrier full of liquefied natural gas across an ocean with human oversight. The ship’s fuel efficiency increased by about 7 percent. Greenhouse-gas emissions dropped by about 5 percent.

Raymarine Is Working on an Autonomous Boat
https://www.yachtingmagazine.com/yachts/raymarine-hd-hundai-avikus-neuboat/ | Tue, 22 Nov 2022
The company signed a memorandum of understanding with Avikus to develop the idea.

The Avikus Neuboat will have Raymarine’s navigational products. Courtesy Avikus

Avikus, HD Hyundai’s autonomous navigation in-house startup, has signed a memorandum of understanding with Raymarine to cooperate on what’s being billed as the world’s first autonomous leisure boat, the Avikus NeuBoat.

The boat will have Raymarine’s navigational products on board, and the two companies say they will explore “the future of autonomous leisure boating,” hinting at other projects to come.

“Raymarine brings world-leading expertise in marine electronics and navigational equipment for leisure boats to the project, while Avikus is known for the most advanced autonomous leisure-boat technology available, coupled with vast experience from the world’s largest order book of autonomous commercial marine navigation systems,” according to a press release.

Here’s what Raymarine’s general manager, Gregoire Outters, says about the deal: “Raymarine strives to provide the most innovative, user-friendly and reliable electronics to make boating accessible and safe for everyone. With Avikus’ proven solution in autonomous commercial marine, the signing of this MOU will pave the way for our engineers to work closely together, to deliver this exciting new technology to our leisure boat customers.”

Take the next step: go to raymarine.com
