Questions abound about the eventual role autonomous trucks will play in the freight-hauling market. Concerns about cost, safety and productivity are often raised.
Before autonomous trucks can become something more than an experiment, however, a host of other questions must be addressed. In the case of an accident, for example, who is liable? Is it the company that owns the truck? The manufacturer who built it? The firm that designed the computerized system that “drives” the vehicle?
Will the answers to these questions vary from region to region?
It wouldn’t be the first time laws and regulations have differed from state to state. Before the Surface Transportation Assistance Act (STAA) was enacted in the 1980s, states had differing laws about truck dimensions and weight.
This played havoc with truckers on cross-country trips, since a truck might meet the legal requirements in every state traveled … except one. The STAA standardized the rules on a national network of roads, simplifying interstate transportation.
No such standardization presently exists for autonomous trucks.
Some states prohibit autonomous trucks, while others allow them. Of the states that permit them, some require a human driver to be present; others allow autonomous trucks on the roads with no human aboard at all.
Other laws govern liability insurance and other factors of ownership. Additionally, there is opposition to autonomous vehicles from various “safety” groups, labor unions and politicians.
Enter the SELF DRIVE Act of 2026.
In an attempt to standardize the rules across the U.S., the Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution (SELF DRIVE) Act of 2026 has been introduced in the U.S. House of Representatives. The official title, “To amend title 49, United States Code, regarding the authority of the National Highway Traffic Safety Administration over vehicles with automated driving systems to provide safety measures for such vehicles, and for other purposes,” is a bit convoluted for news stories.
It’s not the first time such a bill has been introduced.
Representative Robert Latta (R-Ohio) introduced a similar measure in June 2021. Like the 2026 version, it was immediately referred to committee. Latta introduced the current version as well — H.R. 7390 — on Feb. 5, 2026. On Feb. 10, the House Subcommittee on Commerce, Manufacturing and Trade voted 12-11 to refer it to the full Committee on Energy and Commerce for further discussion.
The bill refers to the system that drives autonomous vehicles as “ADS,” for Automated Driving System, not to be confused with ADAS, for “Advanced Driver Assistance Systems” such as automated braking, lane departure warning and other systems that actually make up parts of an ADS.
One tenet of the bill addresses the safety of ADS.
It reads, “A manufacturer may not manufacture for sale, sell, offer for sale, introduce or deliver for introduction into interstate commerce, or import into the United States any automated driving system or ADS-equipped vehicle unless the manufacturer has developed a safety case for the automated driving system or ADS-equipped vehicle that meets the requirements described in paragraph (3).”
The bill would require manufacturers of ADS-equipped vehicles to complete a safety case and to submit it to the Secretary of the U.S. Department of Transportation “upon request.” This safety case would include evidence that the “design, construction and performance” of the ADS “will not present an unreasonable risk of accidents, death or injury.”
Is self-certification really the answer?
Some objections have been raised to allowing the manufacturers of self-driving vehicles to make their own “safety cases” for those products.
Todd Spencer, president and CEO of the Owner-Operator Independent Driver Association (OOIDA), expressed those concerns and more in a March 9, 2026, letter of opposition to the bill, addressed to the chairman of the House Committee on Energy and Commerce.
“Instead of holding autonomous vehicles to similar standards, H.R. 7390 would permit the operation of driverless 80,000-pound trucks based on the unverified assertions of companies with a vested financial interest in their deployment,” Spencer wrote.
“While companies would be required to develop a ‘safety case’ describing how the vehicle would operate safely, there is no requirement that the federal government verify these plans. In fact, companies would not need to provide these cases to the government before deployment, or possibly even at all,” he continued. “This amounts to self-certification for the use of heavy-duty trucks on our nation’s roads.”
Spencer pointed out that the Federal Motor Carrier Safety Administration’s (FMCSA) reliance on self-certification has already resulted in myriad issues that the agency is currently trying to clean up. Hardly a month goes by without another announcement of a manufacturer’s “self-certified” electronic logging device (ELD) being removed from authorized use.
In addition, FMCSA’s current purge of “self-certified” CDL trainers and facilities that don’t meet requirements has resulted in thousands of educational facilities and instructors being removed from FMCSA’s database.
“The use of ‘self-certification’ has already proven to have serious shortcomings in multiple areas across the trucking industry, and taking this approach with autonomous CMVs would be the most disastrous use yet,” Spencer wrote.
Another issue raised in OOIDA’s letter is cybersecurity.
Noting a 2019 security bulletin that described the ability of hackers to exploit ELD weaknesses due to a lack of proper security practices, Spencer pointed out that the cybersecurity provisions of the SELF DRIVE Act would allow the same vulnerabilities to persist in ADS operation. The bill requires manufacturers to “maintain a written cybersecurity policy,” but it doesn’t define the parameters of such a policy or require that the policy be reviewed by the government or made available to the public.
Cybercriminals — or even hackers employed by a government hostile to the U.S. — could potentially disrupt the operation of autonomous vehicles by disengaging critical safety systems or by disabling the systems entirely. The concern is heightened by the fact that many components of computerized systems are manufactured in countries that may be potential adversaries of the U.S.
In a world where a huge percentage of the public supply of food, clothing and other essentials moves by truck, knocking out a percentage of the fleet with a few keystrokes could cause major economic disruptions and could easily compromise highway safety.
The letter urged the committee to reject the bill and called for “a framework that prioritizes adherence to proven safety requirements, independent validation, and full transparency.”
The key takeaway here is this:
Autonomous trucks will bring a marked transformation of truck transportation, but the legislation governing them is just getting started.
Cliff Abbott is an experienced commercial vehicle driver and owner-operator who still holds a CDL in his home state of Alabama. In nearly 40 years in trucking, he’s been an instructor and trainer and has managed safety and recruiting operations for several carriers. Having never lost his love of the road, Cliff has written a book and hundreds of songs and has been writing for The Trucker for more than a decade.