At what point does it just become the user’s fault then? A chainsaw’s design allows you to cut off your own leg if you don’t use it right but that certainly isn’t a flaw in the chainsaw’s design. It is a dangerous piece of machinery (like a car) that needs to be operated properly.
Bad comparison between a chain saw and an automobile.
Perhaps that is why they don’t have autopiloted chainsaws… It appears that any “autopilot” function may have the unintended consequence of causing the user to neglect their responsibility. I think you either need to make cars completely autonomous or completely “manually operated.” The issue with a car is that both the propensity to “check out” when it is on autopilot and the potential for user error to cause death and destruction are unfortunately high.
My question would be how responsible a product’s makers should be for human nature and error when the product’s shortcomings are apparent. For example, spellcheck makes spelling easier (just as autopilot can make driving easier) but still requires humans to make the final determination rather than “checking out” completely and leaving it to the machine. So are errors caused by “checking out” attributable to the person or to the program?
I know that the impact of a spelling error and an auto accident are not exactly comparable but the mistake itself would be similarly caused.
I agree that it is a bad comparison, & most likely made by someone who doesn’t know the difference between an electric & a gas-powered saw. It’s the difference between irrational & rational reasoning. Even a chain saw has built-in safety measures such as anti-kickback. Today’s “Insurance Journal” leads with an article describing the President’s proposal to remove government oversight of self-driving systems. After industry lobbying pressure & money, & the first lawsuit, I am sure the Supreme Court will declare any accident involving a self-driving car the fault of the driver, thereafter & forevermore.