Eh, so there's quite a lot of backstory to that story.
First things first, the person being interviewed is John Bernal. He is indeed a former Tesla employee and worked in the FSD development group, specifically in the group that applies metadata tags to collected scene data and detected objects. He got fired because he considered himself a tester (he wasn't) and set up a YouTube channel to do test reviews of the product -- a violation of the terms and conditions of his employment there. https://nypost.com/2022/03/16/tesla-employee-fired-after-driverless-tech-youtube-reviews/
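For the curious, "metadata tagging" here means labeling objects in collected camera footage so it can feed ML training. A minimal sketch of what one annotation record might look like, purely hypothetical and definitely not Tesla's actual internal schema:

```python
# Purely hypothetical sketch of a scene-annotation record -- NOT Tesla's
# internal schema, just the general shape of data-labeling work: tagging
# objects in captured camera frames so they can be used as training data.
from dataclasses import dataclass, field

@dataclass
class ObjectLabel:
    object_class: str  # e.g. "vehicle", "pedestrian", "traffic_cone"
    bbox: tuple[float, float, float, float]  # x, y, width, height in pixels
    occluded: bool = False

@dataclass
class SceneAnnotation:
    frame_id: str
    camera: str  # e.g. "front_main"
    labels: list[ObjectLabel] = field(default_factory=list)

# An annotator's day is producing records like this, frame after frame:
frame = SceneAnnotation(
    frame_id="clip_0042_frame_0137",
    camera="front_main",
    labels=[ObjectLabel("vehicle", (412.0, 220.0, 96.0, 64.0))],
)
```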
Pay attention to his job title, "data annotation specialist," which is not the same as "bona fide FSD tester." Sure, yeah, the guy obviously has NO BONE TO PICK AT ALL with Tesla firing him. Obviously we can all be fully confident he's an unbiased, non-axe-grinding source at this point. And to be blunt, whether any of us think Tesla was morally right to fire him or not, it was apparently spelled out quite clearly in his employment contract that he wasn't allowed to do it.
"But Albuquerque, just because he got fired doesn't mean he isn't right about all the other things! Maybe that even makes him MORE right than you want to admit!" I hear you saying.
Ok, so let's talk about radar elimination: at no point are FSD cars just willy-nilly running into shit for lack of a radar. How many times in the last two years have we all read "oh look at this Tesla crashing into a bunch of shit... OMG I bet FSD is in trouble now!" followed by "local / state / federal investigators show that Autopilot was not enabled during the crash into [firetruck, bottom of cliff, 130mph sustained-acceleration crash into a pole, a shitload of cars at the end of a tunnel]"? And those are all examples cited from memory, from just the last six weeks. Radar is not necessary for driving; you as a human do not have radar, and somehow (I presume) you don't run around bumping into shit.
"Yeah but obviously they're doing slimy things, just listen to his anecdote about Lombard street -- it's super obvious they're cutting corners!" I hear your next retort.
So the biggest statistical outlier of a city street anyone could think of, quite possibly North America's most treacherous street, was being shown on YouTube as "See, FSD can't navigate the worst possible street we can drive on!!" So as a bit of a thumb-your-nose to the situation, Tesla sat down and made FSD drive that street correctly. And so what? Care to guess how Waymo works in the scant few places where it's permitted to operate? Or Blue Cruise? Or that Level 3 Mercedes thing that was proclaimed better than Tesla? All three are pure examples of 100% human-mapped route traversal. So when Waymo and Blue Cruise and Mercedes do it with direct human effort, it's the right way -- but when Tesla does it for a ridiculous exception case like Lombard Street, it's completely the wrong way?
"But see Albuquerque, that's the point! Waymo totally works and FSD doesn't! Mercedes beat Tesla to L3 capability, which obviously means Tesla's method is pure shit and cannot be scaled or trusted."
So here's the trick: have any of you looked at the litany of requirements to enable Mercedes' Level 3 autonomous driving? It only works on pre-mapped highways (no city streets at all), it only works at 40mph and below (how does that even work on a highway?), it cannot drive directly into the sun, it cannot drive in any precipitation or fog, it cannot operate in any road construction zone, it cannot interact with emergency vehicles, and -- the very best part -- Mercedes L3 autonomous driving only works if it has another car to follow. Yeah, so if you have nobody to follow? You're screwed, that's it.
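To put that in concrete terms: an "operational design domain" gate like that is just a big AND of preconditions, and failing any single one means the system won't engage. Here's a toy sketch, with the checks paraphrased from the publicly reported restrictions above -- illustrative only, not Mercedes' actual engagement logic:

```python
# Toy sketch of an ODD (operational design domain) engagement gate.
# Checks paraphrase the publicly reported Drive Pilot restrictions above;
# this is illustrative only, NOT Mercedes' actual engagement logic.
from dataclasses import dataclass

@dataclass
class DrivingContext:
    on_premapped_highway: bool
    speed_mph: float
    sun_glare_ahead: bool
    precipitation_or_fog: bool
    in_construction_zone: bool
    emergency_vehicle_nearby: bool
    lead_vehicle_present: bool

def l3_can_engage(ctx: DrivingContext) -> bool:
    # Every condition must hold simultaneously; fail any one and the
    # human is driving again.
    return (
        ctx.on_premapped_highway
        and ctx.speed_mph <= 40.0
        and not ctx.sun_glare_ahead
        and not ctx.precipitation_or_fog
        and not ctx.in_construction_zone
        and not ctx.emergency_vehicle_nearby
        and ctx.lead_vehicle_present  # the "must have a car to follow" kicker
    )
```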
Surely Waymo is better, right? Except it only works in very specific geofenced areas, and only where 100% human-mapped routes exist; it can't make any turn that isn't a right turn; it doesn't operate in precipitation either; and you can't actually operate it as your own vehicle, so it's only available in "taxi form."
None of this is to say FSD is "ready," just like Waymo isn't "ready" and Mercedes L3 is only "ready" in the most laughable sense. Tesla has a very long way to go (and maybe, honestly, it never gets there) before FSD can be trusted to drive without a human. But that doesn't mean the answer is 100% human-routed behavior like Waymo and Blue Cruise and Mercedes, because there are more miles of road construction than there will ever be humans hired to update and maintain navigation data. They'll never catch up; it's a physical impossibility. A rough back-of-envelope below shows why.
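Every number here is an assumption I made up for illustration, not a sourced figure. The point is the structure: if new construction zones appear faster than humans can re-survey them, the backlog only ever grows:

```python
# Back-of-envelope on map maintenance. EVERY number here is an assumption
# invented for illustration -- none of these are sourced figures.
NEW_WORK_ZONES_PER_DAY = 1_000       # assumed churn nationwide
MAPPERS_ON_STAFF = 200               # assumed headcount
ZONES_RESURVEYED_PER_MAPPER_DAY = 2  # assumed: drive out, capture, QA, publish

capacity = MAPPERS_ON_STAFF * ZONES_RESURVEYED_PER_MAPPER_DAY
backlog_growth = NEW_WORK_ZONES_PER_DAY - capacity
print(f"re-survey capacity: {capacity} zones/day")
print(f"backlog grows by {backlog_growth} zones/day")
# Unless capacity >= churn, the backlog grows without bound, and every
# zone in the backlog is a stretch of road where the map is just wrong.
```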
The only way self-driving cars of any sort actually come to fruition is through machine learning and AI models, whatever those turn out to be. And honestly, even then it's not really fully solvable -- there will always be some driving problem a machine has no rational way to deal with. Want an example? Try pulling into or out of a school parking lot to pick up or drop off your kids. If it's anything like the schools I've been around for decades, even "normal humans" can't figure that shit out -- mostly because "normal humans" are the ones routing traffic and jacking it up for everyone else.
So yeah, it seems like a poorly sourced hit piece to me. But what do I know?