
Peter D Says Google and FCA may be Shackin' Up!


Anthony


http://www.autoextremist.com/current/2016/4/26/google-in-final-negotiations-with-fiat-chrysler-on-an-advanc.html

 

 

 

Google, the all-seeing and all-knowing technical juggernaut, is in the late stages of negotiating an advanced technical partnership with Fiat Chrysler, according to a source with direct knowledge of the ongoing discussions. John Krafcik, CEO of Google's Self-Driving Cars operation, and Sergio Marchionne, CEO of FCA, began talks not long after the Consumer Electronics Show in Las Vegas back in January, and have been in final negotiations over the last three weeks.

 

 

Check out the link above for the rest of the story.


Well, if true, don't expect any "new Plymouths" or a "full line of RWD muscle cars" as some* Mopar fanboys are demanding. Beggars can't be choosy.

 

 

* "Some" not "all" as some will be quick to say.

 

We'll have to see what happens when these cars need "software updates." Will riders have to wait on the shoulder for 20-60 minutes? Will they have to go to dealers and wait? They'll have to build larger waiting rooms! And then there are batteries needing charges, and all the other "fun" that comes with smartphone use.

 

And with Millennials moving into cities, where are they going to park?


 

Peter D seems to be saying FCA will build, distribute, and service the Google car. That's not what Google and Ford are up to. Ford just wants to sit at the table when the relevant laws, navigation rules, and protocols of driverless cars are developed.


 

Ford just wants to sit at the table when the relevant laws, navigation rules, and protocols of driverless cars are developed.

 

Sounds sensible to me. Let's say I bought a driverless car.

The software or hardware malfunctions and I'm involved in an accident where the person in the other car is killed; am I guilty of vehicular manslaughter?


 

Sounds sensible to me. Let's say I bought a driverless car.

The software or hardware malfunctions and I'm involved in an accident where the person in the other car is killed; am I guilty of vehicular manslaughter?

Just hit the undo button a couple of times...


I just don't see the practicality of self-driving cars... You'd get a lot more bang for your buck with improved public mass transit, and it will be 100x safer than these computer-driven death traps.

 

The government can barely keep our dumb roads paved; I don't see where the funding will come from to make them smart.


I just don't see the practicality of self-driving cars... You'd get a lot more bang for your buck with improved public mass transit, and it will be 100x safer than these computer-driven death traps.

 

The government can barely keep our dumb roads paved; I don't see where the funding will come from to make them smart.

 

I tend to agree. Driverless cars are going to exacerbate traffic in most cities because people will change the way they use their cars and trip counts are going to skyrocket, especially with empty cars wasting valuable road space. If there is no law, or at least some sort of social compact, that limits the use of driverless cars to a few socially acceptable uses, it will be chaos.

 

However, I'm all for more automation and tech-assisted driving (meaning there will always be someone in the driver's seat). But overall, for most urban areas, investment in mass transit remains the most efficient use of transportation resources.


The truck company I work for is racing towards self-driving trucks.

 

Data compiled over several years proves that radar-controlled speed and braking, lane-wandering correction, and accident avoidance are superior with computer assistance.

 

Sensors monitor and can correct more efficiently than a driver can.

 

I may have to reboot the systems once or twice per month.

 

The advancement has been phenomenal over the past three years.

 

Within 7 years, products will be on the market, if state laws allow it.
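
Roughly, the correction logic behind those features boils down to something like the sketch below (purely illustrative Python; the function name, thresholds, and gains are my own assumptions, not any vendor's actual code):

# Hypothetical sketch: radar-gap braking plus lane-wandering correction.
# All numbers here are illustrative guesses, not a real calibration.
def control_step(gap_m, closing_speed_mps, lane_offset_m):
    brake_cmd = 0.0
    steer_cmd = 0.0
    # Radar braking: brake harder as the time-to-collision shrinks.
    if closing_speed_mps > 0.0:
        ttc_s = gap_m / closing_speed_mps
        if ttc_s < 4.0:
            brake_cmd = min(1.0, (4.0 - ttc_s) / 4.0)   # 0 = off, 1 = full
    # Lane-wandering correction: steer gently back toward the lane center.
    if abs(lane_offset_m) > 0.2:                         # small dead band
        steer_cmd = -0.1 * lane_offset_m                 # proportional nudge
    return brake_cmd, steer_cmd

The production systems fuse far more sensor data and run much more sophisticated controllers, but the basic loop is the same: measure, compare against a limit, apply a small correction, hundreds of times per second.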


The truck company I work for is racing towards self-driving trucks.

 

Data compiled over several years proves that radar-controlled speed and braking, lane-wandering correction, and accident avoidance are superior with computer assistance.

 

Sensors monitor and can correct more efficiently than a driver can.

 

I may have to reboot the systems once or twice per month.

 

The advancement has been phenomenal over the past three years.

 

Within 7 years, products will be on the market, if state laws allow it.

How do/will they deal with cars that cut the semis off? Just slam on the brakes? Or keep a long gap between it and the car in front?


How do/will they deal with cars that cut the semis off? Just slam on the brakes? Or keep a long gap between it and the car in front?

Same as if a car cuts off a driver in a semi: either run 'em over or stop...

 

I see truck and motorcoach companies embracing this tech first on over-the-road trips. Cars would use this on HOV/HOT lanes and some suburban roads.

 

Eventually, in 20 or so years, driverless cars or driverless-option cars will be the norm.


Same as if a car cuts off a driver in a semi: either run 'em over or stop...

 

I see truck and motorcoach companies embracing this tech first on over-the-road trips. Cars would use this on HOV/HOT lanes and some suburban roads.

 

Eventually, in 20 or so years, driverless cars or driverless-option cars will be the norm.

Well, I was thinking more because a truck driver now can (presumably) tell when someone may be about to cut them off and prepare for it before it happens, whereas an autonomous system presumably would pick up on it only when the vehicle was already veering over.


I receive a new truck every year. Each year more tech is added to the power unit and every year the failure rate drops significantly because of better hardware and software.

 

One item I really like is the DD12 transmission, which drops into neutral when going downhill on a 3% grade or less. It will also skip-shift 1-3 gears depending on the load weight and terrain. No driver can do a better job of obtaining higher MPG than the DD12 trans can.
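
To make that concrete, the behavior described above amounts to a simple decision rule. Here is a rough sketch (hypothetical Python; the thresholds and skip counts are guesses on my part, since the actual DD12 calibration is proprietary):

# Hypothetical sketch of the coast/skip-shift logic described above.
# grade_pct: road grade in percent (negative = downhill).
def dd12_decision(grade_pct, load_kg, throttle_pct, current_gear):
    # Drop into neutral and coast on downhill grades of 3% or less.
    if -3.0 <= grade_pct < 0.0 and throttle_pct == 0.0:
        return ("coast_in_neutral", current_gear)
    # Skip-shift 1-3 gears depending on load weight and terrain.
    if load_kg < 15000 and grade_pct <= 0.0:
        skip = 3   # light load, flat or downhill: big skip
    elif load_kg < 30000 and grade_pct <= 2.0:
        skip = 2
    else:
        skip = 1   # heavy load or steep climb: shift one gear at a time
    return ("upshift", current_gear + skip)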

 

Today, you can get this stuff on many production cars. Take a look at what you can get on the Mercedes S-Class.


I receive a new truck every year. Each year more tech is added to the power unit and every year the failure rate drops significantly because of better hardware and software.

 

That's a fair point.

 

However, 'self-driving' requires a measure of machine learning, infrastructure mapping, and infrastructure standardization that are not refinements of existing systems. Especially in the realm of machine learning, the gulf is wider than what the popular press and glowing PR would have you believe.


Do self-driving cars, by their ability to react so quickly, actually become less predictable to the human drivers around them?

A robot can't see that the driver next to you is agitated and likely to make desperate moves, while a human driver will go into avoidance mode.


Do self-driving cars, by their ability to react so quickly, actually become less predictable to the human drivers around them?

A robot can't see that the driver next to you is agitated and likely to make desperate moves, while a human driver will go into avoidance mode.

 

That's not the problem.

 

The problem as I see it is four-fold:

 

1 - How do you get a machine-learning AI to fail in 'safe' ways? Going back to Heidegger's concept of the mind's constant presence in an intelligible world, that larger world-knowledge will generally prevent a human being from identifying a black person as a gorilla. However, machine-learning systems don't exist in a larger intelligible world. Their field of information is sharply curtailed, and bizarre/bad things can happen when the limits of that information and decision-making logic are reached.

 

2 - The limits of self-driving cars are most pronounced under many dangerous driving conditions. Rain- and ice-covered roads are likely to interfere with lane detection and signal recognition, and whiteout conditions (blizzards, etc.) are probably not susceptible to improvement via infrared sensing (although fog may be).

 

3 - What will happen when the driver's attention is required immediately? Say a self-driving car encounters a fault (e.g., it cannot determine lane location, cannot determine if a signal is ahead) and requests assistance from a driver who has been doing nothing for several minutes and may not even be acutely aware of the situation or location.

 

4 - Who will be assessed fault when an autonomous car fails to recognize, say, a battered stop sign or an awkwardly located traffic signal? Will the driver be held at fault, and will the insurance company attempt to subrogate/litigate against the municipality for failing to maintain signage in the more pristine form required by these limited-functionality self-driving cars?
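
To make points 1 and 3 concrete, one pattern people talk about is a supervisory check that degrades gracefully when the software's confidence drops, rather than guessing. A bare-bones sketch in Python (the threshold, grace period, and state names are illustrative assumptions, not any manufacturer's logic):

# Illustrative only: fail "safe" by requesting a takeover, then falling back
# to a minimal-risk maneuver if the driver never responds.
TAKEOVER_GRACE_S = 8.0   # how long to wait for the driver to take over

def supervise(perception_confidence, driver_hands_on, seconds_since_alert):
    if perception_confidence >= 0.95:
        return "continue_autonomous"
    # Confidence too low: this is exactly the point-3 handoff problem.
    if driver_hands_on:
        return "hand_control_to_driver"
    if seconds_since_alert < TAKEOVER_GRACE_S:
        return "alert_driver_and_slow_down"
    # No response: pull over rather than pressing on with bad information.
    return "pull_over_and_stop"

Even this toy version exposes the hard part: knowing whether pulling over is actually safe here requires exactly the kind of wider world-knowledge that point 1 says these systems lack.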


 

That's not the problem.

 

The problem as I see it is four-fold:

 

1 - How do you get a machine-learning AI to fail in 'safe' ways? Going back to Heidegger's concept of the mind's constant presence in an intelligible world, that larger world-knowledge will generally prevent a human being from identifying a black person as a gorilla. However, machine-learning systems don't exist in a larger intelligible world. Their field of information is sharply curtailed, and bizarre/bad things can happen when the limits of that information and decision-making logic are reached.

 

2 - The limits of self-driving cars are most pronounced under many dangerous driving conditions. Rain- and ice-covered roads are likely to interfere with lane detection and signal recognition, and whiteout conditions (blizzards, etc.) are probably not susceptible to improvement via infrared sensing (although fog may be).

 

3 - What will happen when the driver's attention is required immediately? Say a self-driving car encounters a fault (e.g., it cannot determine lane location, cannot determine if a signal is ahead) and requests assistance from a driver who has been doing nothing for several minutes and may not even be acutely aware of the situation or location.

 

4 - Who will be assessed fault when an autonomous car fails to recognize, say, a battered stop sign or an awkwardly located traffic signal? Will the driver be held at fault, and will the insurance company attempt to subrogate/litigate against the municipality for failing to maintain signage in the more pristine form required by these limited-functionality self-driving cars?

 

Excellent post.

 

The technocrats are looking at this in terms of what the system can do at its best, when we should be asking where its limits are, what it can't do, and what control measures / restrictions need to be put in place.

 

What a minefield...

 

What happens if, say, Google wants to remove the driver controls and steering wheel from its vehicles? Do they then assume responsibility for all decisions and mistakes made by the software?


No more so than if the adaptive cruise control fails today.

 

For now, the intent is to have a driver in the seat. Slow-speed manoeuvres would be conducted by a driver.

 

Adaptive cruise still requires driver input.

 

And this is about the point where I realize that you don't understand the gigantic gulf between driver assistance (which has been around basically since power brakes were invented) and self-driving. I also realize that you don't understand that you don't understand the gulf, and that there's no point in further conversation.


And perhaps that's why lawmakers will struggle to implement this technology; people simply assume that they fully understand all the implications and limitations of driverless-car technology, when clearly a lot of us still do not know what we're agreeing to.

