Tesla turns to Texas to test its autonomous “Cybercab”

mozbo

Ars Tribunus Militum
1,867
It's really funny how quickly perceptions have shifted. There's a new restaurant near my home, and I was curious to check it out: I saw a 'pimped out' Cybertruck parked in front. Turns out it was the owner's vehicle.
So, I decided I didn't need to try out the new restaurant...
One can make some exceptionally quick decisions about how a business is run by looking at how the owner feels their money should be spent.
Was at one of my favorite local pizza joints a couple days ago.

Owner had FOX on and was nodding along. Will not be going back there.
 
Upvote
105 (109 / -4)

Dr Gitlin

Ars Legatus Legionis
24,277
Ars Staff
Did I see where the Cybertruck ignites at almost 20X the frequency of the Ford Pinto?
That analysis included the one that guy blew up in Las Vegas, which arguably it shouldn't, so the death rate is actually 4/35k, not 5/35k. So more like 16x the frequency?
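The adjustment above is simple proportional scaling. A back-of-envelope check (the figures are the thread's claims, not verified data, and this assumes the "~20x the Pinto" multiple scales linearly with the death count):

```python
# Thread's cited figures (unverified): ~20x the Ford Pinto's fire-death
# rate, based on 5 deaths across ~35,000 Cybertrucks. Excluding the
# Las Vegas explosion leaves 4 deaths; the multiple scales linearly.
pinto_multiple = 20.0
deaths_cited = 5
deaths_excl_las_vegas = 4

adjusted_multiple = pinto_multiple * deaths_excl_las_vegas / deaths_cited
print(adjusted_multiple)  # 16.0
```

That is, removing one of five incidents knocks the multiple down by a fifth, from 20x to 16x.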

Teslas are easily the deadliest cars on sale today, which makes a mockery of Musk's longtime claims that his EVs are the safest cars on the road.
 
Upvote
115 (119 / -4)

GreyAreaUK

Ars Legatus Legionis
10,314
Subscriptor
Can't it be? I wouldn't take the bet that it isn't.
(AIUI) Legally speaking it can't be, because that would mean Cybercabs would have to have drivers.

Which would mean that Elon Musk has re-invented the taxi. But I'm sure certain posters here will praise him to the heavens for it.

The alternative is that either (a) the passenger has to be willing to step in if need be, or (b) it'll be FSD L2 but they'll claim it's L4 or something. Laws don't seem to matter much currently.


Edit: somehow missed your 'remote driver' bit and yeah, that wouldn't surprise me.
 
Upvote
32 (32 / 0)

Dr Gitlin

Ars Legatus Legionis
24,277
Ars Staff
The other aspect of this is that there is a growing segment of the population intent on vandalizing or otherwise impeding everything Musk. For the Cybercab, for example, a quick puff of black spray paint and it's out of commission until a Tesla rep shows up to remove the paint. If people are already bold enough to vandalize privately owned vehicles, I am sure they will be even bolder with vehicles still owned by Tesla.

Musk made this bed and he will sleep in it.

FWIW: I am in no way condoning this behavior. Just an observation. I believe Musk is well on the way to doing the damage himself.

C'mon, shareholders and TSLA board. It's way past time for a vote.
At least 95% of $TSLA's value is due to Musk and his behavior. No one who owns the stock would vote to devalue it by ditching the golden goose.
 
Upvote
111 (111 / 0)
It's really funny how quickly perceptions have shifted. There's a new restaurant near my home, and I was curious to check it out: I saw a 'pimped out' Cybertruck parked in front. Turns out it was the owner's vehicle.
So, I decided I didn't need to try out the new restaurant...
One can make some exceptionally quick decisions about how a business is run by looking at how the owner feels their money should be spent.
Everyone should load up on some anti-fascist stickers and start marking businesses around town:

https://www.etsy.com/market/fascism_stickers
 
Upvote
32 (36 / -4)

Uragan

Ars Tribunus Angusticlavius
9,797
Love the hit pieces.

I would love some valid counters to my positions below with data:

We're already laying the groundwork for excuses to why Waymo and Cruise (already gone) will eventually fail due to their lack of innovation:

[attached screenshot]
What’s the “lack of innovation” exactly? How come Waymo’s implementation runs literal circles around Tesla?

Surely, the much more narrow permissions will be a critical factor in why these companies fail.
Did you not read that Waymo is in other markets?

Also, clearly disregarding the fact that all Tesla FSD incidents as a whole comprise all versions of the software, which have been iterated and improved with far more regularity than anyone else.
How do you know this? Waymo doesn't broadcast when it iterates its software stack. And Waymo has far fewer disengagement events per mile than Tesla, despite Tesla having god knows how many miles under its belt. Additionally, Musk literally said, on the last quarterly earnings call, that existing Tesla vehicles do not have the hardware to run whatever software will be their ADS software.

Not delineating between ADAS crashes that have been at no fault of the FSD driver also paints an unclear picture.

[attached screenshot]
Highlight 1: There may be some confusion because it doesn't happen with any regularity or validity upon scrutiny. Other users hit Teslas.

Highlight 2: Tesla has no responsibility for these crashes since they are not driving the car. 100% clear that the driver was complicit in using their software with all known risks.
How are they not responsible for their software? If the software freaks out and the driver has zero time to react to said freak out before an accident occurs, how is the driver liable?
 
Upvote
108 (109 / -1)

Dr Gitlin

Ars Legatus Legionis
24,277
Ars Staff
Given FSD's record of crashes, people absolutely will be injured or killed. Musk wants his cybercabs to be like Waymo's (unmanned), but Waymo's are far more sophisticated. Remember that nearly all of the FSD crashes that make the news happened with someone behind the wheel (although generally not paying attention). Talk to Tesla owners who have tried or use FSD and most will undoubtedly tell you the system needs supervision. What happens when there's no one behind the wheel?
The cabs won’t even have a steering wheel.
 
Upvote
81 (83 / -2)
This is near my friend's house. I've lived in the area most of my life. That lane even confuses me from time to time. How the hell does anyone actually think that self-driving is solvable?
You'd really think "avoiding stationary objects right in front of you" would be a solvable problem and a good place to start with the whole "autonomous driving" thing.
 
Upvote
117 (118 / -1)
During the testing phase Tesla will own the Cybercabs and will (presumably) be liable for the accidents where the cab is at fault. This may be a first for them, as FSD testing was done with safety drivers who could be held at fault, and post-sale the owners could be held at fault.

What I don't get is this: once Cybercab testing is over and Tesla tries to sell them, what company would buy a fleet of Cybercabs and accept legal liability for mistakes made by Tesla's software, something the buyer has no control over and cannot opt out of?

The only thing that might possibly work is to have each individual Cybercab owned by a minimally insured LLC; if it crashes and kills someone, the victim gets the minimum liability payout ($30k in Texas). But after one or two crashes, insurance companies will stop insuring them.

I really don't see how this is going to work as a business concept in any manner that would justify the development and rollout costs.

Just to clear things up: the fact that there may be a safety driver, a post-sale owner, or another entity operating the vehicles in driverless mode would not insulate Tesla from potential liability. In any given case Tesla may succeed in shifting part or all of the blame, but that is very fact-specific.
 
Upvote
12 (12 / 0)

alansh42

Ars Praefectus
3,175
Subscriptor++
You'd really think "avoiding stationary objects right in front of you" would be a solvable problem and a good place to start with the whole "autonomous driving" thing.
That actually is a hard problem for self-driving. It's much easier to identify objects moving relative to the background. Objects that are stationary or moving at 90° can blend in and (especially without radar) be misjudged as farther away than they are. Make the system too sensitive and you get a lot of phantom braking.
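The geometry behind this is easy to sketch. Here's a minimal toy illustration (not Tesla's actual perception stack; the speed, width, and frame rate are made-up numbers): a camera-only system largely detects an obstacle dead ahead via its "looming" (growth in apparent angular size), and that growth stays tiny until the obstacle is close.

```python
# Toy model: apparent angular size of an object is roughly
# width / distance (small-angle approximation, in radians).
# A stationary obstacle straight ahead barely "looms" at range,
# then its apparent size balloons in the final moments.

def angular_size(width_m, distance_m):
    return width_m / distance_m

speed = 30.0   # closing speed, m/s (~108 km/h)
width = 2.0    # obstacle width, m
dt = 0.1       # frame interval, s (10 fps)

for d in (100.0, 50.0, 20.0, 10.0):
    theta_now = angular_size(width, d)
    theta_next = angular_size(width, d - speed * dt)
    growth = (theta_next - theta_now) / theta_now
    print(f"{d:5.1f} m: apparent size grows {growth * 100:.1f}% per frame")
```

With these numbers the per-frame growth is about 3% at 100 m but roughly 43% at 10 m, so a threshold sensitive enough to flag the distant obstacle early will also fire on overpasses, shadows, and other false positives: phantom braking.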
 
Upvote
50 (50 / 0)

Bash

Ars Scholae Palatinae
1,305
Subscriptor++
Surely, the much more narrow permissions will be a critical factor in why these companies fail.

FYI the 'narrow permissions' relate more to reporting requirements and safety standards, not something about the driving environment or conditions. I think the only reason why Tesla doesn't want to do autonomy testing in California is they cannot provide documentation showing Tesla-Autonomy is reliable enough to go driverless, and they do not want to publish their eventual failures (and likely get banned by the CAL DOT for being unsafe).
Elon is very obviously of the 'break things until they start to work' mentality, and he doesn't want all his public driving failures documented by the government. I predict there will be a major set of controls or new leadership put on NHTSA before Tesla is going to put a single car on the road. Elon / Tesla cannot handle the negative press that would result from actual engineering safety standards being applied to Tesla's driving data.
 
Upvote
48 (48 / 0)

tucu

Ars Tribunus Angusticlavius
7,629
Upvote
41 (41 / 0)
I was just talking about how the new cyberpunk resistance will be kids on untrackable scooters and bikes vandalizing or spray-painting the cameras, or dropping caltrops around these things, especially when they're just waiting to be called. Musk will then spend all of his time focusing on those kids instead of putting out another car.
 
Upvote
10 (13 / -3)

rbutler

Smack-Fu Master, in training
78
Subscriptor
insured LLC and if it crashes and kills someone, the victim can get the minimum liability payout ($30k in texas). But after one or two crashes, insurance companies will stop insuring them.
You forget that Tesla offers its own insurance to drivers, so it would just self-insure, pay the minimum, and spread the increased cost across all owners who are paying for their policies through Tesla.
 
Upvote
17 (17 / 0)