A tweet from Elon Musk indicating that Tesla may allow some owners who are testing its "Full Self-Driving" system to disable an alert that reminds them to keep their hands on the steering wheel has drawn attention from U.S. safety regulators.

The National Highway Traffic Safety Administration says it asked Tesla for more information about the tweet. Last week, the agency said the issue is now part of a broader investigation into at least 14 Teslas that have crashed into emergency vehicles while using the Autopilot driver assist system.

Since 2021, Tesla has been beta-testing "Full Self-Driving" with owners who haven't been trained on the system but are actively monitored by the company. Earlier this year, Tesla said 160,000 owners, roughly 15% of Teslas now on U.S. roads, were participating. A wider distribution of the software was to be rolled out late in 2022.

Despite the name, Tesla still says on its website that the cars can't drive themselves. Teslas using "Full Self-Driving" can navigate roads themselves in many cases, but experts say the system can make mistakes. "We're not saying it's quite ready to have no one behind the wheel," CEO Musk said in October.

On New Year's Eve, one of Musk's most ardent fans posted on Twitter that drivers with more than 10,000 miles of "Full Self-Driving" testing should have the option to turn off the "steering wheel nag," an alert that tells drivers to keep their hands on the wheel.

Musk replied: "Agreed, update coming in Jan."

It's not clear from the tweets exactly what Tesla will do. But disabling a driver monitoring system on any vehicle that automates speed and steering would pose a danger to other drivers on the road, said Jake Fisher, senior director of auto testing for Consumer Reports.

"Using FSD beta, you're kind of part of an experiment," Fisher said. "The problem is the other road users adjacent to you haven't signed up to be part of that experiment."


Tesla didn't respond to a message seeking comment about the tweet or its driver monitoring.

Auto safety advocates and government investigators have long criticized Tesla's monitoring system as inadequate. Three years ago the National Transportation Safety Board cited poor monitoring as a contributing factor in a 2018 fatal Tesla crash in California. The board recommended a better system, but said Tesla has not responded.

Tesla's system measures torque on the steering wheel to try to confirm that drivers are paying attention. Many Teslas have cameras that monitor a driver's gaze. But Fisher says those cameras aren't infrared like those of some competitors' driver assistance systems, so they can't see at night or if a driver is wearing sunglasses.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argued that Tesla is contradicting itself in a way that could confuse drivers. "They're trying to make customers happy by letting them take their hands off the wheel, even while the (owners) manual says 'don't do that.' "

Indeed, Tesla's website says Autopilot and the more sophisticated "Full Self-Driving" systems are intended for use by a "fully attentive driver who has their hands on the wheel and is prepared to take over at any moment." It says the systems are not fully autonomous.

NHTSA has noted in documents that numerous Tesla crashes have occurred in which drivers had their hands on the wheel but still weren't paying attention. The agency has said that Autopilot is being used in areas where its capabilities are limited and that many drivers aren't taking action to avoid crashes despite warnings from the vehicle.

Tesla's partially automated systems have been under investigation by NHTSA since June of 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. The separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles started in August 2021.

Including the Florida crash, NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of being used. Nineteen people have died in those crashes.

Consumer Reports has tested Tesla's monitoring system, which changes often with online software updates. Initially, the system didn't warn a driver with hands off the wheel for three minutes. Recently, though, the warnings have come in as little as 15 seconds. Fisher said he isn't sure, though, how long a driver's hands could be off the wheel before the system would slow the car down or shut off completely.

In shutting off the "steering wheel nag," Fisher said, Tesla could be switching to the camera to monitor drivers, but that's unclear.


Despite implying in the names that Autopilot and "Full Self-Driving" can drive themselves, Fisher said, it's clear that Tesla expects owners to still be drivers. But the NTSB says human drivers can end up dropping their guard and relying too much on the systems while looking elsewhere or doing other tasks.

Those who use "Full Self-Driving," Fisher said, are likely to be more vigilant in taking control because the system makes mistakes.

"I wouldn't dream of taking my hands off the wheel using that system, just because it can do things unexpectedly," he said.

Koopman said he doesn't see a tremendous safety risk from disabling the steering wheel nag, because the Tesla monitoring system is so flawed that disabling it doesn't necessarily make Teslas any more dangerous.

NHTSA, he said, has enough evidence to take action to force Tesla to install a better monitoring system.

The agency says it doesn't comment on open investigations.