
End to End AI - FSD Beta V12.2.1 (updated 02.20.24)

Uh huh.
The only "hit piece" here was the pieces of the driver littered all over the roadway in that Tweet photo.

Drivers in the US get in hundreds of crashes every day. Of course Teslas on basic autopilot get in crashes too. Assigning blame to the tech is moronic considering the driver is still responsible for safe operation of the vehicle. Every driver knows this and is made aware of this fact. Any driver can “take over” full control of the vehicle at any moment simply and reliably by either turning the wheel with minimal force or tapping the brake.

You still don’t seem to understand this because you are an obtuse prick with your head up your own ass.
 
Here were the FSD Beta V11.3 Notes from 2.19.23.



Here are the Full Self-Driving Beta V11.3 release notes in text form for improved readability.

Key points: FSD on Highways, improved lane changes, resolves recall concerns, improves AEB, report autopilot mistakes via voice notes.

More details below:

1️⃣ Enabled FSD Beta on highway. This unifies the vision and planning stack on and off-highway and replaces the legacy highway stack, which is over four years old. The legacy highway stack still relies on several single-camera and single-frame networks, and was set up to handle simple lane-specific maneuvers. FSD Beta's multi-camera video networks and next-gen planner, which allow for more complex agent interactions with less reliance on lanes, make way for more intelligent behaviors, smoother control and better decision making.

2️⃣ Added voice drive-notes. After an intervention, you can now send Tesla an anonymous voice message describing your experience to help improve Autopilot.

3️⃣ Expanded Automatic Emergency Braking (AEB) to handle vehicles that cross ego's path. This includes cases where other vehicles run their red light or turn across ego's path, stealing the right-of-way.

Replay of previous collisions of this type suggests that 49% of the events would be mitigated by the new behavior. This improvement is now active in both manual driving and autopilot operation.
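For what it's worth, a replay-based estimate like that 49% figure boils down to re-running logged collision events against the new behavior and counting the fraction it would have mitigated. A toy sketch (the event data and the mitigation check here are invented for illustration, not Tesla's actual pipeline):

```python
# Toy sketch: estimate the share of replayed collision events that a new
# AEB behavior would mitigate. All data and thresholds are made up.

def mitigation_rate(events, would_mitigate):
    """Fraction of logged collision events the new behavior mitigates."""
    if not events:
        return 0.0
    mitigated = sum(1 for e in events if would_mitigate(e))
    return mitigated / len(events)

# Each event records time-to-impact at the moment braking could begin.
events = [{"tti_s": t} for t in (0.4, 0.9, 1.3, 1.8)]

# Assume (purely for illustration) braking mitigates anything with
# at least 1.0 s to impact.
rate = mitigation_rate(events, lambda e: e["tti_s"] >= 1.0)  # -> 0.5
```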

4️⃣ Improved autopilot reaction time to red light runners and stop sign runners by 500ms, by relying more on objects' instantaneous kinematics along with trajectory estimates.
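The point of leaning on instantaneous kinematics is that position and velocity alone can flag a crossing vehicle before a full multi-second trajectory estimate converges. A toy illustration (function name and numbers are my own, not Tesla's code):

```python
# Toy sketch: time until a crossing vehicle reaches ego's path, computed
# from instantaneous kinematics (offset + closing speed) only.

def crossing_time_to_conflict(lateral_offset_m, closing_speed_mps):
    """Seconds until a crossing vehicle reaches ego's path; None if receding."""
    if closing_speed_mps <= 0:
        return None  # not closing on ego's path
    return lateral_offset_m / closing_speed_mps

# A red-light runner 12 m to the side closing at 8 m/s reaches ego's
# path in 1.5 s, so a kinematics-based alert can fire immediately.
ttc = crossing_time_to_conflict(12.0, 8.0)  # -> 1.5
```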

5️⃣ Added a long-range highway lanes network to enable earlier response to blocked lanes and high curvature.

6️⃣ Reduced goal pose prediction error for candidate trajectory neural network by 40% and reduced runtime by 3X. This was achieved by improving the dataset using heavier and more robust offline optimization, increasing the size of this improved dataset by 4X, and implementing a better architecture and feature space.

7️⃣ Improved occupancy network detections by oversampling on 180K challenging videos including rain reflections, road debris, and high curvature.

8️⃣ Improved recall for close-by cut-in cases by 20% by adding 40k autolabeled fleet clips of this scenario to the dataset. Also improved handling of cut-in cases with better modeling of their motion into ego's lane, leveraging the same model for smoother lateral and longitudinal control around cut-in objects.
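"Recall" here just means: of all the real cut-in events, what fraction did the network catch? A minimal sketch with invented numbers (I'm reading the 20% above as a relative improvement):

```python
# Recall = true positives / (true positives + false negatives).
# The counts below are illustrative only.

def recall(true_positives, false_negatives):
    total = true_positives + false_negatives
    return true_positives / total if total else 0.0

before = recall(60, 40)  # 60 of 100 real cut-ins detected -> 0.60
after = recall(72, 28)   # a 20% relative jump: 72 detected -> 0.72
```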

9️⃣ Added a "lane guidance" module and perceptual loss to the Road Edges and Lines network, improving the absolute recall of lines by 6% and the absolute recall of road edges by 7%.

1️⃣0️⃣ Improved overall geometry and stability of lane predictions by updating the "lane guidance" module representation with information relevant to predicting crossing and oncoming lanes.

1️⃣1️⃣ Improved handling through high speed and high curvature scenarios by offsetting towards inner lane lines.

1️⃣2️⃣ Improved lane changes, including earlier detection and handling for simultaneous lane changes, better gap selection when approaching deadlines, better integration between speed-based and nav-based lane change decisions and more differentiation [...]

Some driving on V11.3.3

 
Joe you never answered my question you stupid prick.

Thousands of people have crashed on cruise control. Thousands have died. Why is it allowed?
 
 

According to the NHTSA there are approximately 14,000 crashes and ~99 crash-related deaths PER DAY in the USA, you ****ing moron.


736 crashes and 17 deaths since 2016(?) for Tesla vehicles is unbelievably low.

Tesla is setting the new standard in Safety. Both in actively preventing crashes and limiting harm when drivers do crash.

Teslas are the safest cars on the road.
 
Tesla Model 3 achieved the “Lowest probability of Injury” of any automobile ever tested by the NHTSA.

Tesla Model Y achieved a “Five Star” safety rating in every category AND subcategory tested by the NHTSA.

Tesla Model Y achieved the lowest rollover risk of any SUV ever tested, roughly half that of the AWD Volvo XC60.

AWD Model Y
7.9%

AWD Volvo XC60
14.7%

Not even close!!!!!!!

 

And they'll apparently NEED it with the FSD fails.
 

Autopilot is not FSD.

“Autopilot crashes” include every time another car crashes into a Tesla while Autopilot is engaged. The software can and does take evasive maneuvers, but it cannot get out of every bad situation caused by other drivers or poor driving conditions: too much rain, poor visibility, or the driver setting the speed at or above the limit and too high for conditions. All of these leave less room for error. I’m certain crashes have happened while on Autopilot, which is essentially cruise control that maintains your lane on a highway, holds the speed set by the driver, and slows when coming up on another vehicle in the lane to keep a safe following distance.

If the driver is dumb enough to set the Autopilot speed at 85 in a 70, while it is raining, in packed traffic with other irresponsible drivers weaving in and out of their path, then I’m certain accidents do happen. It would still not be the tech’s fault. THE DRIVER IS RESPONSIBLE FOR SAFE OPERATION. PERIOD.
 
poor driving conditions. Too much rain, poor visibility, the driver setting the speed at or above the speed limit and too high for driving conditions.

....yet, here you are, comparing FSD, used in only the BEST driving conditions, with nationwide averages, from ALL driving conditions... 🙄
 
FSD Beta Version 12 released 02.19.24 to Tesla customers.

This version is end-to-end neural-network artificial intelligence.

Some reports.



Wait, what? From a Tesla software engineer: the AI learned how to do U-turns from watching videos of humans doing U-turns.










Seems to be improving. Each month that goes by the AI gets trained more and more.

March of 9’s marches on.
 


I just got a DM from a follower that got the new V12 update and wanted to remain anonymous.
He said he took a 30 minute drive, including partial on the highway, and said it was notably better than 11.4.9.
There were several spots on his drives where certain maneuvers used to consistently produce mistakes, and now it just completes them effortlessly. His wife was with him and she was very impressed because "there is something palpably different about this software". It’s so smooth, and it does pull over when you reach your destination, which is a pretty cool feature.
He said he did not notice any regressions.🔥
 
u-turns for highway driving?

I thought autopilot was only for highway driving? Isn’t that the defense for the crashes?
 

Negative. U turns on city streets or split lane roads apparently.

The interesting thing is that the neural networks (AI) learned how to do U-turns by watching video clips of Tesla drivers doing U-turns. The decision-making / control module is no longer hard-coded software.
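That's behavior cloning in a nutshell: fit a policy to recorded human driving instead of hand-writing rules. Tesla does this with deep networks over video; the toy sketch below fakes it with a table lookup just to show the shape of the idea (all names and numbers are invented):

```python
# Toy behavior cloning: "learn" a steering policy by averaging what human
# drivers did in each (discretized) state. Real systems use deep nets
# over camera video; this is only the shape of the idea.

from collections import defaultdict

def fit_policy(demonstrations):
    """demonstrations: list of (state, steering_angle) pairs from human drives."""
    sums = defaultdict(lambda: [0.0, 0])
    for state, angle in demonstrations:
        sums[state][0] += angle
        sums[state][1] += 1
    return {s: total / n for s, (total, n) in sums.items()}

# Invented demos: in the "u_turn_entry" state, humans steered hard left.
demos = [("u_turn_entry", -0.8), ("u_turn_entry", -0.9), ("straight", 0.0)]
policy = fit_policy(demos)
# policy["u_turn_entry"] is the imitated steering command, about -0.85.
```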

Someday this software is going to save a lot of lives and you’ll feel wrong for ever doubting it.

 
Biggest beneficiaries of Tesla FSD:

Teenagers.
Elderly.
Poor drivers with spatial awareness deficiencies.
Distracted drivers.

Deaths and serious injuries will plummet.
Insurance rates will plummet.
Drunk driving will become a thing of the past. Everyone on the road benefits.
Freight costs will come down.

That’s a pretty good start.
 

But I thought Tesla claimed it should only be used on the highway in the lawsuits from those who died while using it…
 

I don’t really know what was claimed or wasn’t claimed. I don’t know what you read, or who interpreted or misinterpreted it before it got to you.

Basic Autopilot, Enhanced Autopilot, and (‘FSD Beta’ / Autosteer on City Streets) all used to be separate software stacks. Autopilot was originally intended for highway use. Some roadways blur the line between highway and city driving and could at times be driven using Autopilot, but typically it would simply not engage in a city setting. I’m not sure, as I wasn’t using the software back then. Until “Autosteer on City Streets” came along, there simply wasn’t any city “self driving.” I think that was 2021.

“FSD” or “FSD Beta” was originally designed for city driving. For a period of time, driving on and off highways with FSD engaged would switch from one software stack to the other and back. It was obvious when this occurred.

Around Jan/Feb 2023, the FSD stack took over highway operation as well. Since that code was much newer and more sophisticated, it immediately improved the highway driving experience.

They have been more or less merged on my Model Y since Jan/Feb 2023. (Completely erasing / making obsolete the “Autopilot” software from 2018.)

If people “died on autopilot,” that almost certainly means they were distracted and did not take over when they needed to. In one case I read about, a software engineer ignored 1,000 feet or more of on-screen and audible warnings to take over because he was distractedly playing something like Candy Crush, and plowed into a highway barrier in a construction zone, killing himself instantly. You want to blame Tesla? Fine.

I understand that I am ultimately responsible for safe operation of the vehicle every time I get behind the wheel. Software or no software.

Think about it: commercial airliners have an “autopilot” system. That doesn’t mean pilots can sleep through takeoffs and landings. If something (e.g., bad weather) were to occur, their ass would be liable. No one would blame the plane or the autopilot software, or at least they shouldn’t. Edge cases occur, and that is why people still need to be vigilant.
 
Tesla has been pushing its driver-assist features, including Autopilot and what it calls “Full Self Driving,” which Tesla has insisted make driving safer than cars operated exclusively by humans. But NHTSA has been studying reports of accidents involving Autopilot and its Autosteer function for more than two years.

The recall comes two days after a detailed investigation published by the Washington Post found at least eight serious accidents, including some fatalities, in which the Autopilot feature should not have been engaged in the first place.

Tesla’s owners manuals say: “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver.” But the company has pushed the idea that its driver assist features allow the cars to safely make most driving decisions even away from those roads.

A NHTSA investigation, however, has found numerous accidents over the past several years that suggest that these features do not live up to their names of Autopilot and Full Self Driving.

The safety regulator in its letter to Tesla said “in certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse [of the feature.]” It said that when drivers are not fully engaged and ready to take control of the car “there may be an increased risk of a crash.”

 

The Washington Post is not a reliable source of reporting on Tesla. They have repeatedly and intentionally mischaracterized issues, investigations, and lawsuits because….drumroll please…..

They don’t know what the hell they are talking about.
 
JFC google the key words. Tesla has argued in court that they're not liable for accidents using autopilot on surface streets.
You look more and more like a cult member when you refuse to admit known facts.
 


Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them. In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

— Michael Crichton
 

Ps. NHTSA closed at least one of the Autopilot investigations, with Tesla agreeing to add or alter some alerts, in December 2023. Guess what they didn’t say? That it was unsafe when used properly.


“Tesla has agreed to add more alerts to its Autopilot system to prevent driver misuse as part of a NHTSA “safety recall” of 2 million vehicles that concludes its Autopilot investigation.”

The recall is an over the air software update.
 

This is a pretty ignorant take, but with the larger media constantly misreporting and inaccurately characterizing events, I don’t blame your ignorance on you. You are being misled.
 
Right, and to Tesla “properly” means only on highways.

Tesla's owners manuals say: “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver.”

And yet, all their marketing tweets show it self parking and doing u-turns.

I can't imagine why people are confused.
 

Sigh….have you used it yourself? Have you driven in a Tesla with FSD engaged?

Do you have, you know, an opinion of your own to share about the video links I’m providing to you?

Is any of it interesting?

Something, anything, has to be better than regurgitating old shit after the NHTSA already closed their investigation. Move on for focks sake.
 