Any GTAM'ers own an electric vehicle? | Page 191 | GTAMotorcycle.com


Mad Mike

Well-known member
If he is going to put 5000km a week on his Tesla I wouldn't expect it to hold its value well.
That would be quite difficult -- you'd need to do 1,000 km each workday and be able to find fast chargers every 250 km.
 

K20EF8

Well-known member
That would be quite difficult -- you'd need to do 1,000 km each workday and be able to find fast chargers every 250 km.
That's what I figured.
He mentioned spending $600 a week on fuel, which for an average vehicle works out to roughly 5,000-8,000 km. It must be almost all highway too: even at the low end, where economy is worse, 5,000 km in one week equates to about 100 hours at 50 km/h.
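For what it's worth, that back-of-envelope math checks out. A quick sketch, assuming gas at $1.30/L and 6-9 L/100 km consumption (both assumptions, not figures from the thread):

```python
# Sanity check: how far does $600/week of gas get an average vehicle?
# Assumed: gas at $1.30/L, consumption between 6 and 9 L/100 km.
weekly_fuel_cost = 600.0                 # CAD per week
price_per_litre = 1.30                   # CAD/L (assumption)
litres = weekly_fuel_cost / price_per_litre

for consumption in (9.0, 6.0):           # L/100 km, worse to better economy
    km = litres / consumption * 100
    print(f"{consumption:.0f} L/100 km -> {km:,.0f} km/week")

# Even the 5,000 km low end is ~100 hours behind the wheel at 50 km/h.
print(f"Hours at 50 km/h: {5000 / 50:.0f}")
```

At 9 L/100 km that's about 5,100 km/week; at 6 L/100 km, about 7,700 km/week, which brackets the 5,000-8,000 km estimate.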
 

FullMotoJacket

Well-known member
Site Supporter
If he is going to put 5000km a week on his Tesla I wouldn't expect it to hold its value well.
Not to mention, that's going to be 10 full charges (long-range average is 523 km).
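Roughly, yes. A sketch using the 523 km long-range figure quoted above and assuming a 5-workday week:

```python
# Charging math for 5,000 km/week in a ~523 km rated-range Tesla.
weekly_km = 5000
rated_range_km = 523                      # long-range average quoted above

print(f"Full charges/week: {weekly_km / rated_range_km:.1f}")  # ~9.6

daily_km = weekly_km / 5                  # assuming 5 workdays
stops = daily_km / 250                    # a fast-charge stop every ~250 km
print(f"{daily_km:.0f} km/day, ~{stops:.0f} charging stops/day")
```

That's 9.6 full charges a week, so "10 full charges" is right, and real-world range below the rated figure would only push it higher.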
 

LiNK666

Well-known member
What car did you have before that was using $600/week in gas but doing the same mileage as a Tesla?
Ford Escape. Did about 800 km to 1,000 km per week.

I made a mistake in my previous post. I did about $600 per month in fill-ups, not weekly. Sorry!
 

Mad Mike

Well-known member
Ford Escape. Did about 800 km to 1,000 km per week.

I made a mistake in my previous post. I did about $600 per month in fill-ups, not weekly. Sorry!
That makes a big difference!

Still something a bit strange -- at $600/month and $1.30/L, your Escape would be using almost 14.5 L/100 km.
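A sketch of that calculation, assuming $1.30/L and the low end of 800 km/week (about 3,200 km/month):

```python
# Consumption implied by $600/month of gas at $1.30/L over ~3,200 km.
monthly_cost = 600.0
price_per_litre = 1.30                   # CAD/L (assumption)
litres = monthly_cost / price_per_litre  # ~461.5 L/month

monthly_km = 800 * 4                     # low end: 800 km/week
consumption = litres / monthly_km * 100
print(f"Implied consumption: {consumption:.1f} L/100 km")  # ~14.4
```

At the 1,000 km/week end it drops to about 11.5 L/100 km, which is still thirsty but more believable for an older AWD Escape.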
 

Iceman

Well-known member
That makes a big difference!

Still something a bit strange -- at $600/month and $1.30/L, your Escape would be using almost 14.5 L/100 km.
I don't know, I rented an Escape once, a 3.0L AWD. Thing was horrible on fuel.

 

mimico_polak

Well-known member
Site Supporter
Why would people blindly trust Autopilot in their cars? I mean, sure, it's great, but I don't know how comfortable I'd be just letting the car do its thing. I've seen videos of people sleeping during their commute and it's impressive... but I guess I just don't trust the technology enough yet.
 

GreyGhost

Well-known member
Site Supporter
Why would people blindly trust Autopilot in their cars? I mean, sure, it's great, but I don't know how comfortable I'd be just letting the car do its thing. I've seen videos of people sleeping during their commute and it's impressive... but I guess I just don't trust the technology enough yet.
It is very hard for humans to maintain focus when they have nothing to do. Honestly, the only safe way for a human to focus on driving is to ((*&)(*&^ drive. I have no problem with the computer driving, but it needs to be at least as good as a human. Relying on a human to take over from the computer is *&^(*&^ idiotic. Not only is the human not paying 100% attention, they also have to register that something is about to happen, realize the computer has screwed up and isn't going to deal with it in time, and then take corrective action. That is a hell of a long chain to get through before reaching the obstacle.

Elon keeps saying the human is in charge, blah blah blah, but also repeatedly calls it Autopilot, self-driving, etc. WTF. If the human is in charge, call it steering assist or lane-keeping or something that makes it clear it is not capable of dealing with all aspects of driving.
 

r3r3r3

Well-known member
Lord Elon can do nothing wrong.

Comes down to complacency. The computer hasn't crashed the car yet, so it's not going to screw up in the future. Autonomous driving is broken up into levels of increasing sophistication (SAE defines 0 through 5). Level 2 is what most cars have – lane centering and the ability to accelerate/decelerate with traffic. The problem with Teslas is that they behave like Level 3 – the car seems like it can do everything on its own until it suddenly can't.

I was reading an article about this and apparently most automakers (beside Tesla) are hesitant to release level 3 systems. Most manufacturers plan to hold off until level 4 is a reality, where the car can handle nearly every driving scenario (besides extreme snow storms, off road, etc.). The problem with level 3 is that it gives a false sense of security that the car can handle everything. Now you have these situations where drivers are thrown into an emergency situation when a second ago they were half asleep.

It's surprising that the NHTSA allowed such a widespread rollout of the Tesla Autopilot system. What would have happened if there was a road crew working behind those pylons? Who kills them, the computer or the inattentive driver?
 

GreyGhost

Well-known member
Site Supporter
It's surprising that the NHTSA allowed such a widespread rollout of the Tesla Autopilot system. What would have happened if there was a road crew working behind those pylons? Who kills them, the computer or the inattentive driver?
Based on the BS Tesla spews, it was the driver's fault. In reality, the fault rests primarily on Tesla for rolling out a system that, as you accurately put it, provides a false sense of security.
 

mimico_polak

Well-known member
Site Supporter
Based on the BS Tesla spews, it was the driver's fault. In reality, the fault rests primarily on Tesla for rolling out a system that, as you accurately put it, provides a false sense of security.
Tesla can do nothing wrong in the eyes of many, and they will cover themselves with the 'driver is responsible for taking control when needed' BS line.

I like Tesla and want one but wouldn’t trust this aid for a long time.
 

Brian P

Well-known member
Moderator
Site Supporter
Human-factors engineering is not something which is fully understood. There is good reason why other manufacturers are going slow with this. Tesla, and specifically Elon Musk, isn't experienced enough to understand why that's how it has to be. They are still claiming that full self driving is coming soon. If it is, incidents like this shouldn't be happening.
 

GreyGhost

Well-known member
Site Supporter
Human-factors engineering is not something which is fully understood. There is good reason why other manufacturers are going slow with this. Tesla, and specifically Elon Musk, isn't experienced enough to understand why that's how it has to be. They are still claiming that full self driving is coming soon. If it is, incidents like this shouldn't be happening.
The upside to his obscenely dangerous experiment is that it is probably the fastest way to close the loop (collect data, test code, fail, check data, update code, test again, etc.). The fact that people may die in his quest to be the first to achieve real self-driving doesn't seem to bother him.
 

K20EF8

Well-known member
Where is the resident Tesla guru? SunnY S -- is this a Tesla or driver error?
It's ultimately a Tesla error, but the driver has to take some of the blame for:
-Not paying attention
-Being dumb enough to place their safety in the hands of a largely untested and unproven AI/autonomous system.
 

SunnY S

Well-known member
Site Supporter
Where is the resident Tesla guru? SunnY S -- is this a Tesla or driver error?

I haven't seen the video and, frankly, I don't care. Self-driving cars, whether from Tesla or anybody else, don't interest me in the least, hence my lack of comments on the subject.

I was more interested in the manual car discussion in the other thread. THAT.....is my type of driver involvement.
 
