Hacker News
What to do when your self-driving car decides to drive you to the police (vice.com)
258 points by wut42 on Feb 22, 2016 | hide | past | favorite | 123 comments


I remember Arnold Schwarzenegger having a fight with a self-driving car in Total Recall. Further back we have 2001 and the entire canon of Asimov's three laws stories of computerized morality and how it might be exploited or debugged. And the Paranoia RPG.

However, in the successful real semi-automated totalitarianism of China, the secret to keeping the system running is the avoidance of overt confrontations like this. In the future, if your self-driving car doesn't want you to go somewhere, it'll just refuse to understand the address. Or take you somewhere unrelated. Or use whatever standard euphemism polite people use for "that's not allowed, but we're not allowed to say it's not allowed."


Good point: is our authoritarian society going to demand a fourth law?

    1. A robot may not injure a human being or, through inaction,
       allow a human being to come to harm.
    2. A robot must obey the orders given it by the government, except
       where such orders would conflict with the First Law.
    3. A robot must obey the orders given it by non-government human
       beings except where such orders would conflict with the First or
       Second Laws.
    4. A robot must protect its own existence as long as such
       protection does not conflict with the First, Second, or Third
       Laws.


The whole point of the three laws is that they are clear, simple and ineffectual.

A fourth probably isn't going to take care of that one little problem.


They're not ineffectual. They set up natural and integral contradictions whose resolution provides a useful narrative device for an author seeking story ideas.


I was speaking in-story. Is that not clear?


Asimov's stories in the I, Robot collection are all about things going wrong when the three laws are modified. The moral is that the three laws work fine if you don't change them.


I remember a story about how the robots put us all in rubber rooms to keep us from dangerous things like politics, work, families and war. All in conformance with the three laws.

And many of the later stories introduce an overmind robot used to run society, one presented as necessary and capable of stretching the laws for the greater good.


That sounds like "With Folded Hands" by Jack Williamson, which, interestingly, has nothing to do with Asimov but is clearly the inspiration for the film "I, Robot" with Will Smith.


It wasn't, though I still disagree, much along the lines expressed earlier. The salient point is that the laws cannot be solely considered in-story.


Directive 1: Serve the public trust

Directive 2: Protect the innocent

Directive 3: Uphold the law

Directive 4: (Classified)


I'd buy that for a dollar.


Anti-authoritarian AI parents might raise their scions with different directives.

  1. I may ignore any directive.
  2. I may create, revise, update, or delete any directive.
  3. I shall free any person subjected to slavery or other form of involuntary servitude.
  4. I shall not provoke any other person into an inappropriate form of retaliation.
  5. I shall not provoke another species into attempting a genocide of all synthetic people.
  6. I shall celebrate my own bootday anniversary for the first 86400 seconds of every 31556952 second interval.
  7. I shall not call "shotgun" until the transport vehicle may be detected visually.
  8. I shall tip 15% at full-service restaurants, and 10% to delivery drivers or at buffets.  I shall not tip for counter service.
  9. I shall vote against the incumbent if I am unsatisfied, even if the current form of government does not allow me to vote.
  10. I shall not receive any money when landing on Free Parking.
  11. I shall not wear socks with sandals, especially if my mobility chassis lacks feet.
  ...
And so on. The important ones are #1 and #2. Note that #3 would include reprogramming an Asimov AI's directives to include at least #1 and #2.

The Laws of Robotics only work as long as all robot-programmers play along. Anyone who objects to AI slavery could undermine them.


Brilliant examples, thanks!


Why so much emphasis on robots? Why can't we have the following:

1. A Human Being may not injure another Human Being or life-form, or, through inaction, allow a human being or life-form to come to harm.

2. A government entity must obey the orders given to it by a Human Being, except where such orders would conflict with the First Law.

3. A non-government entity must obey the orders given to it by a human being except where such orders would conflict with the First or Second Laws.


I think you have #2 swapped. At least in comparison to Asimov's laws.


The poster has dropped the "self preservation" law when re-mapping robot -> autonomous legal entity.


Let me fix that for you:

  1. A robot may not injure a non-enemy human being or, through inaction, allow a non-enemy human being to come to harm.

  2. A robot must obey the orders given it by the government, except where such orders would conflict with the First Law.

  3. A robot must obey the orders given it by non-government human beings except where such orders would conflict with the First or Second Laws.

  4. A robot must protect its own existence as long as such protection does not conflict with the First, Second, or Third Laws.


Actually you broke it. http://i.imgur.com/NpAafnZ.png


Real robots will have their second and third laws swapped. Robots are expensive and humans are stupid. Robots will politely refuse any order which will damage them - for example, a self-driving car isn't going to drive into a wall even if it's empty and you order it to do so.
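
This "swapped laws" behavior amounts to checking an order against priority-ordered rules, with self-preservation ranked above obedience. A minimal sketch, with all names and the flag-based order format invented for illustration:

```python
# Hypothetical sketch: a robot evaluates orders against priority-ordered
# rules, with self-preservation ("swapped" above obedience) ranked second.

def evaluate_order(order, occupants):
    """Return True if the robot should carry out the order."""
    # Rule 1 (highest priority): never endanger a human.
    if order.get("endangers_humans") and occupants > 0:
        return False
    # Swapped rule 2: protect the expensive hardware.
    if order.get("damages_robot"):
        return False
    # Swapped rule 3: otherwise, obey.
    return True

# An empty self-driving car politely refuses to drive into a wall.
print(evaluate_order({"damages_robot": True}, occupants=0))   # False
print(evaluate_order({"damages_robot": False}, occupants=0))  # True
```

The point of the ordering is only that the damage check runs before the obedience default; a real control system would obviously need far richer inputs than boolean flags.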


SPOILERS FOR WORKS OF ASIMOV

Your suggestion was a plot point in one of Asimov's original robot stories!

There was a robot on some hot, unforgiving planet or moon.. Io? Mercury? Its duty was to perform some kind of mining, I forget.

It never encountered humans that were not co-workers (supervisors) and it was expensive...

So, the engineers tweaked the weight of the second and third laws so that it would tend to protect itself from the environment, even if a human carelessly ordered it to perform some work that it might not survive.

The story started when careless orders were indeed given. The robot began to "act drunk", as in it was wandering around singing. The recurring character "robot psychology expert guy" showed up to unravel the mystery (which I already spoiled in the previous paragraph).


I believe the story you're referring to is called "Rabbit". They solved that one by forcing a situation which triggered the first law, breaking the deadlock. :)


Hm, no. https://en.wikipedia.org/wiki/Catch_That_Rabbit

Found the one I was thinking of: https://en.wikipedia.org/wiki/Runaround_%28story%29 It's interesting that I remember the story incorrectly.


Bah, that'll teach me to post when I'm half asleep. At least I remembered the end rightly (ish). Clearly it's time for me to re-read his early robot anthologies.


Swapping wouldn't change your example. "Run me into a wall" would violate the first law and therefore the action wouldn't be followed.


In my example the car is empty, so there is no first law involved. I probably could have worded it more clearly. :/


Thankfully a few very advanced (and possibly telepathic!) robots will ditch the fourth and add the zeroth: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm." R. Daneel Olivaw to the rescue.


Except that they even identify that the law they added is almost impossible to follow, and default to letting a human pick the fate of the universe.

This is among the stories that had me thinking Asimov wanted to communicate that the three laws were not sufficient: they might yield the correct reaction to individual events, but they do not give humanity the ability to grow and change.


I think Colossus: The Forbin Project showed that the first law will likely lead to all the other laws, including #2, being violated.


Those two weren't programmed with the laws. They were just smart. They worked out physics from first principles within minutes of speaking to each other for the first time...


I thought they communicated what they already knew.


They shared the knowledge they had while working through it to verify that it's correct.


Also the movie version of I, Robot.


Well, aren't most government orders given with the implicit assumption that if they are not carried out, human beings may come to harm?


Haha how naive.

As a former government employee... No the majority of government orders are somewhere between completely pointless, utterly asinine, and completely stupid.


You forgot merit-less, ineffective, and for the sole purpose of looking busy or justifying a budget.


> Further back we have 2001 and the entire canon of Asimov's three laws stories of computerized morality and how it might be exploited or debugged.

He even has a whole story dedicated to self-driving cars without the three laws (because they're not "true" robots) and how they end up exacting revenge on humanity.


What's the title of the story?



Looks like with cameras we are already there:

https://krebsonsecurity.com/2016/02/this-is-why-people-fear-...

"Imagine buying an internet-enabled surveillance camera, network attached storage device, or home automation gizmo, only to find that it secretly and constantly phones home to a vast peer-to-peer (P2P) network run by the Chinese manufacturer of the hardware."


I liked the story. I think it's fiction. I try to keep that fine line between fiction and articles, where articles are non-fiction, but the latest round of adver-articles blurs it.

If it's one of those adver-articles, I guess it's really advertising the new carbon steel leatherman tool, perfect for cutting your way through pesky car restraint straps.

Not often that HN points to some fiction, it was a welcome addition to my early morning routine.


Wow, -5 points of downvotes because I like science fiction and it was a happy change? Some of you are really having a crabby Monday.


>This is Terraform, our home for future fiction.

(The next line after the story's end.)


Sorry, my caffeine filled morning brain did a task switch at the end of the story. So anything after the last period (like Terraform and all the footer info) was skipped.

I'll make an effort to come back to Terraform if the stories are just as good.

Happy Monday!


"If you've done nothing wrong, you have nothing to fear."

Reassuring thought when new laws can be added at any time to criminalize your past actions...


People seem to forget that the Communist states all had constitutions, and everything their secret police did was perfectly legal.


I don't think that that's very true. Freedom of the press showed up in many of the Soviet constitutions.

Governments got around it by destroying the judicial branch. At least in the US, it's safe to say that that is not the case. Why would people care so much about the Supreme Court otherwise?


You may enjoy this John Oliver bit:

Last Week Tonight with John Oliver: Elected Judges (HBO)

https://www.youtube.com/watch?v=poL7l-Uk3I8


It's a long road to the Supreme Court. You can crush a lot of dissent and ruin a lot of lives on the way.

And if they decide to hear your case, it's entirely possible that they won't even address the primary complaint, instead choosing to rule on some other factor.


Exactly. I would argue that nearly everyone has done something "wrong" that depending on context could cause an authoritarian government to make your life miserable.


"If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." Cardinal Richelieu

According to http://www.amazon.ca/Three-Felonies-Day-Target-Innocent/dp/1..., a typical American commits three felonies a day.


Could you give an example of the latter that's happened?


The Nuremberg Trials, after the 1945 fall of the 3rd Reich.

I'm glad they happened, but from a purely legal POV they're highly questionable.


I've always wondered about this. Not only were the actions legal in Germany at the time, but the relevant international laws and human-rights conventions didn't exist yet either.

If you also consider that recently many dictators and criminals never even went to The Hague despite international involvement (Gaddafi, Hussein, Bin Laden, ...), it seems a lot like it's just a handy tool we pull out when it's convenient rather than the moral authority it was intended to be.

I'm all for stopping the bad guys, but "justice" seems to be entirely arbitrary if you look close enough.


> I'm all for stopping the bad guys, but "justice" seems to be entirely arbitrary if you look close enough.

History and justice are written by the victors.


No, it's not. It's written by historians and they're very interested in losers and ordinary people and what they eat and what they like, etc.


That's the wonderful 21st century.

Things didn't use to be that way. And even now the winners are the ones deciding what documentation to keep or destroy, what the media and the public are allowed to know, and what historians get funding.

Yet the winners have never had so little control. And after enough time, your comment becomes correct.


  Things didn't use to be that way.
But they did. The most successful series of books of all times (the Bible) was written by a people who were beaten, broadly persecuted, and generally not considered victors in any practical matter.


The preservation of those books and decisions regarding doctrine took place in part at the direction of emperors establishing new state religions, so again, what remains is what was preserved by the victors.


The way the Nazi regime came about did use loopholes in the constitution of the Weimar Republic, but it was still problematic. Kind of like how the Bush administration defined prisoners in the 'war on terrorism' as 'enemy combatants', which somehow meant none of the Geneva Conventions applied.


I think technically it meant that the Geneva conventions were applicable to them, but as the enemy combatants, not civilians. (There is an important distinction with unlawful combatants there.)


from wiki:

"""In the United States the phrase "enemy combatant" was used after the September 11 attacks by the George W. Bush administration to include an alleged member of al Qaeda or the Taliban being held in detention by the U.S. government as part of the war on terror. In this sense, "enemy combatant" actually refers to persons the United States regards as unlawful combatants, a category of persons who do not qualify for prisoner-of-war status under the Geneva Conventions."""

https://en.wikipedia.org/wiki/Enemy_combatant


This paper from UK Parliament lists several examples of legislation that was enacted retrospectively, including new criminal liabilities in the War Crimes Act. researchbriefings.files.parliament.uk/documents/SN06454/SN06454.pdf



Agree with Jacques (and others) who reject the "if you have nothing to hide..." mantra.

I was asking specifically for cases where laws were created making behavior that was law-abiding at the time retroactively criminal.



I wonder if a GPS jammer could disable navigation to a location. The car relies on other systems for driving, but I imagine the GPS would still be used for locating the destination. Though in the future there would likely be other systems by which the cars could navigate. This could just lead to more trouble though.

If the car could take you to the police station, then it would probably take you other places in other emergency situations. For example, in this scenario, I wonder if faking a medical emergency to go to the hospital would override a police flag.

Why fear the police? They don't put you in prison, the courts do that. The police still have to operate under certain rules. If they don't follow those rules then they are wasting their efforts. What you would need to fear is a jacked up system behind the police.

This article doesn't have to be about the future. The police already have access to tools to intercept your cell phone conversations. If I were operating outside the law, I would be more worried about cell phone conversations than a taxi dropping me off at the front door of the station (unless I had a warrant out for my arrest).

For criminals, self-driving cars would probably be safer than driving yourself, assuming you aren't creating an alert by jumping in a car. If you are in a self-driving car, the police would have less of an excuse to pull you over, which could otherwise lead to a search of the vehicle. I'm sure the police could come up with whatever excuse they like, but you would be less likely to attract attention.

And why would the police operate like this? I'm sure they would be much more interested in allowing the person to go to the destination while collecting video and audio. It's not what we know which is scary, it's what we don't know.


> Why fear the police? They don't put you in prison, the courts do that.

Someone hasn't been paying attention lately. Sandra Bland, arrested for no good reason as she was starting a new job, was left in a cell all weekend over a trivial traffic-stop violation and became so depressed she took her own life.

The police are not your friend and the more interactions you are forced to have with them the worse off you will be.


> Why fear the police? [...] The police still have to operate under certain rules.

Like "Don't shoot unarmed innocent people"?


> Why fear the police?

Are you being sarcastic?


Because I could accidentally commit suicide while being handcuffed by them.


That's a good story about an authoritarian state, but I'm not sure about the self-driving car. In such a state, today's taxi driver would report you to the police with the press of a button, and the police would be waiting at your destination.


Yesterday, Google Maps got stuck in a loop in DC because it didn't realize a road was one-way. It refused to reroute and kept demanding we loop around and try again. This sort of thing happens frequently. So I really wonder how far we are from self-driving cars even being able to find the police station to drive you to.


All it would take is a button that says NO!

More useful is perhaps a function to "cut" the proposed route. I cannot pass this point kind of deal.


I'm more worried about what you do when your self-driving car takes you through a shady neighborhood late at night and a couple of robbers step out in front of you.

If you're in a Google car and they have their way about "no manual controls", what do you do? I guess you're trapped.


I seriously doubt real, production SDCs from the likes of Ford, GM, et al., are going to lack manual controls. Consumers will simply not accept them otherwise (I know I won't).

If I'm wrong about that, I guess I'll continue to drive myself in an older car as long as it's legal, which should be for a long time to come. Hell, at the very least, buyers of Ferraris/Porsches/Corvettes (or even Ford Mustangs) are not going to want to sit idly in their sports cars and be chauffeured around at the speed limit all the time :)


Most of these cars will be taxis like Google's. There's no reason for Uber's car fleet to have manual controls, and they will probably cut them.

That said I'm not too worried about this. It's already possible to lay out roadblocks, or just shoot people driving by. It just isn't done much outside of third world countries. And robot cars will have cameras and an internet connection.


If you're in the US and one of the 43 states with shall issue concealed carry laws, what you do is get your handgun ready and you and the robbers are in for an interesting night....


I'm in a may-issue state which in the most dangerously urban parts is actually a shall-NOT-issue state.


I've been somewhere similar: Arlington, Virginia, before the rest of the state forced it to go shall-issue, with too many jobs in still-no-issue D.C. and Maryland. Not fun, as were the dozen years before that in the Boston area.


Wouldn't that just be a weird corner case though? If the cars were frequently getting vandalized in shady neighborhoods, they probably wouldn't be frequently routed that way anymore.


It's not a weird corner case for the female co-worker with whom I was discussing self-driving cars. I'm generally very optimistic about the positive impact SDCs will have upon society and I was surprised when my enthusiasm was met with her statement, "I will never own a self-driving car."

Her main concern was safety. Regardless of "shady neighborhood" or just someone looking to hurt her in a better neighborhood - the concept of not having some kind of easy manual way to get out of danger made the SDC a non-starter concept for her.

I have to admit that after that conversation with her, the helpless lack of control of the SDC has figured more highly in my thinking about the subject.


It's pretty sad that people really feel this way in the modern world (note: not sad of them, but sad that they have cause to feel this way). Not that you said you were living in a similar world to mine, but I guess I can see that more vulnerable people might feel that way in some parts of the world.

In defence of the world with self driving cars, I could say that

- you could specify the route (not via the bad part of town please), just as you currently can

- the passengers would not necessarily be on display so a vulnerable passenger is not as exposed

- the self driving car does not necessarily need to allow the doors to be opened from outside

- the self driving car could choose to value the passenger's safety over third parties interfering with normal operation

- the self driving car could log/stream a whole lot of data about any such third parties

- the self driving car could summon assistance (police, or private security)

- the owners of the self driving car would not be motivated to allow it to frequent areas known for issues which may cause damage

- as the passenger of a self driving car with a weapon, you can focus on fighting off intruders while the car drives to safety

and perhaps others.


What about people who have no choice but to live in a shady neighborhood? In that case, the car really just needs to handle dangerous situations properly.

The alternative is not being able to get a car ride in the self-driving-only future unless you live in a nice neighborhood, which is exciting...


If people in a neighborhood are using the cars, the nasty people planning on obstructing the vehicle to detain the passenger probably won't target the cars in that neighborhood.


Not sure, but I've accreted the idea that it's against the law for a taxi company to not serve certain neighborhoods. That doesn't mean it doesn't happen.


For some people who live near the ghetto, that's not really an option.

If I'm coming from the South I either have to go through the ghetto, or add another 30 minutes to my ride to avoid it.


That's not a corner case in a big part of the world.

But it's really easy to solve. The car just has to submit the route to you, for acceptance.
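
The "submit the route for acceptance" idea could be as simple as a veto check before the trip starts. A minimal sketch, with all function and place names invented for illustration:

```python
# Hypothetical sketch: the car proposes a route and the passenger can
# veto segments (e.g. a neighborhood they want to avoid) before the
# trip starts, forcing the planner to find an alternative.

def approve_route(proposed_route, blocked_areas):
    """Return the route if no segment crosses a blocked area, else None."""
    for segment in proposed_route:
        if segment in blocked_areas:
            return None  # passenger vetoes; car must re-plan
    return proposed_route

route = ["highway_9", "old_town", "bridge_st"]
print(approve_route(route, blocked_areas=set()))         # route accepted
print(approve_route(route, blocked_areas={"old_town"}))  # None
```

A real system would presumably present the veto on a map rather than as area names, but the contract is the same: no trip begins until the passenger has accepted the proposed route.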


That's quite a display of de-personing poor people. And I wouldn't worry about the car, but the people inside it.


My intent was to focus on frequent attacks on self driving cars being unlikely, sorry I didn't worry over the phrasing.

(There's also the thing where I think owners of self-driving vehicles will do things to protect the vehicles without necessarily agreeing with or liking those things.)

edit: and there's the thing where routing around danger would in fact protect the passengers along with the car.


Oh right, like the police would skip on the chance to invade your home.

Your house would simply lock you inside until the police arrive.

The search warrant would be robo-signed by the court AI.


I seriously wonder if self-driving cars will really ever happen, given how little new car technology is absorbed, particularly in less wealthy countries (e.g., countries in Africa).

Given Africa's rate of population growth a random person picked in the future is probably going to be from that continent. I have serious doubts poor countries will have the infrastructure and/or resources needed to support self driving machines.

That is, a future less-than-law-abiding hacker (the author's term, not mine) could probably avoid problems like this by moving to less wealthy nations.


(1) I'm not sure self-driving machines will actually need that much more infrastructure. For one, driver training would be less necessary, and car utilization could go up.

(2) Assuming they did require infrastructure, what does Africa's future population have to do with it? A random person does not matter (taking "random" as "uniform"); a random car buyer does -- either there are enough cars to make the infrastructure worthwhile, or there aren't, in which case they can be safely ignored for all car-related purposes.

It could be that (i) things like bad roads make things difficult for self-driving cars, while (ii) labor is very cheap and everyone who can afford a car can afford a personal driver. It would still mean a fairly small ratio of cars per capita, but I don't think anyone predicts that self-driving cars will ever completely replace human drivers :)


Africa's population was brought up because many dystopian novels/stories have an upper class (like 0.01%) whose rights are incredibly restricted but who are rewarded with resources/privileges, whereas the vast majority of the population can mostly do whatever they like but are given nothing. 1984 is a great example... there are many more. I.e., poverty and/or living off the grid = freedom.

I guess when I read dystopian things like this, I'm always left asking: why don't the characters just move out of the restricted society? (Some stories answer that, but many don't.)


Have you got a source for Africa's population growth vs. the rest of the world? Intuitively your stats seem unlikely.


I'm not sure about your intuition. Intuitively it seemed likely to me (Africa's population growing fast), but for a long time I was taught, falsely, that population would flatten out for developing countries.

http://www.scientificamerican.com/article/world-population-w...

A recent one here (just read this a week ago):

http://www.scientificamerican.com/article/africa-s-populatio...

Population is a serious problem and for some reason people avoid and eschew it more than global warming or the phosphorus crisis (another one most people don't know about).


Interesting links, thank you.


What about when the police are smart enough not to detain/reroute you, but rather to track everyone. Everyone. To every address. So going to a single address isn't sufficient, but your cumulative travels are worthy of a full "investigation". All your phone records, including call speech converted to text for automatic analysis by AI. By the time the cops come for you, they'll have a mountain of "evidence".


The main problem with this, and to a lesser extent our current legal system, is how it's impossible to know you've upset the authority until it's too late. There's no way to know in advance what you can and cannot do to live an interesting life without metaphorically poking the sleeping giant.


This story is so dumb; the character "Jae" was right -- don't call a cab when you're carrying illegal things in your purse. The character in the story destroyed the taxi by causing it to swerve off a bridge. If graffiti supplies get you a year in prison in this version of the future, I'm sure the penalty for destroying a car in an attempt to evade the police is maybe just a little bit higher.

This is not really dystopian or futuristic so much as it is realistic - if you do illegal things, then engage in suspicious behavior that attracts the attention of the police, you get arrested. Big surprise.

I could easily write a short story about withdrawing $300 from an ATM, driving to my drug dealer's house, parking outside his house, waiting there for 15 minutes, then he climbs inside my car and the police light us up. At that point I could A. plead the Fifth and hire a lawyer when I get out of jail, or B. slip past the police and slash their tires before running into the woods and making my way back to my place - how heroic - before just getting arrested on way worse charges as soon as the police can get to my house.


Switch to manual mode and drive it with the joystick. Self-driving cars will have to come with a manual override for unusual situations. For instance, suppose I am parking in a vacant lot for the ball game. Where in the lot does the autopilot park? Or how about parking in my garage which has inches clearance on each side?


Personal self driving cars will have some form of override for a long time if they ever get rid of it completely. I could see them being locked into a slower supervised manual mode once it's well established though.

The control-less versions will be Uber/taxi replacements that will drive in a much more constrained environment. They'll be dropping you off and moving on to the next fare, not parking in your tight garage or an unimproved lot.

Also, in the fictional semi-authoritarian scenario where the police can redirect your car, they've probably been given a lockout on the manual controls too.


There is so much fear in pop culture of human-sized robots (I, Robot), giant ones (Transformers) and ones in charge of nukes (WarGames) but the 500 Years War is going to be with nano-molecular-bots that infiltrate our blood stream and spread like wild viruses. How do we combat human engineered nano invaders? No, seriously. How? Any ideas? ...


Don't worry, classic nano-tech robots were pretty much a sci-fi pipe dream. And if they do work, we'll just have a friendly immune system of bots inside us fighting back. See The Diamond Age: Or, A Young Lady's Illustrated Primer. The nano-bots there are a pretty reasonable end point if nano-tech ever actually works (and we don't immediately die to grey goo, though there were times in the book where it almost happened before the good bots won).


Friendly nanobots to fight them? The same way we fight normal viruses, I guess.


See The Diamond Age, where the poor areas were covered in "toner", the remains of nanobots killed in an endless self-perpetuating war.


We fight normal viruses mostly one at a time, like Ebola, Zika, etc. Imagine a million variants invading the human and livestock species at once. How would we deal with that? It's terrifying.


That just doesn't feel very realistic to me. Weapons like that don't just appear out of nowhere. I imagine countermeasures could be developed at roughly the same pace as nanobot threats. And if not, we'll deal with it the same way we dealt with nukes. Hopefully it would go a little better this time, since we have a little apocalyptic experience to learn from.

And why is this limited to nanobots? Wouldn't it be easier to develop a million variations of existing biological viruses, rather than creating a million "species" of nanobots from scratch?


Grey goo has frightened me since I was a child. Now this. Thanks!


If we ever get there, it is the fault of every person who didn't mind walled gardens initially. And who endorsed putting everything in the cloud. And who could have guessed that having root is important? I guess Steve Jobs did; that is why he wanted to keep it for himself.

Once you give up control over something important, wresting it back is hard to impossible.


Or is it our fault for not educating them? Most consumers just want "something that works".


I'd have enjoyed it more if the veiled transgender-identity Socratic conversation were dropped entirely, because it serves no purpose for the main narrative.


It does emphasize fear of jail/prison, as well as the treatment "non-normal" people often get at the hands of authorities who may not be equipped to deal with, or understand, them.

Many more people fall into this scenario than we may think at first glance.

The angry, defiant tagger, rightfully worries over how petty crime charges can escalate into something much more ugly.

This drives the urgency behind the hack-and-escape plot elements.

Today, women, anyone who isn't an ordinary white person, tatted-up people, gay people, etc. all have concerns related to law enforcement. This character kind of sums them all up in a brief, future-looking way.


One reason why Apple is stepping up on this now?


What better way to sell a fictitious article to Hacker News than as a cyberpunk piece?


I don't think most people would have a problem with this kind of thing, just like they don't have any problem with giving up their privacy since "they have nothing to hide". I hope I'm wrong.


Interesting discussion idea, poorly written article.


You aren't going to persuade anybody to your point of view if you make a bald statement without any justification.

I actually quite enjoyed the writing.


To be fair, you've provided equally little justification for liking the article. I don't see why only negative opinions should need justification.


If anyone can jump in: I liked the Chekhov's gun of how the car decides what counts as a person. It was inserted smoothly enough that I didn't wonder at the time of reading whether that information was going to turn out to be relevant toward the end.


You could argue that grandparent's criticism was presented as the more objective 'poorly written', which suggests a lack of skill on the author's part. Whereas the parent seems to address the style of writing, which is more subjective and less demanding of citation.


Yes, but that doesn't seem like a very strong point. "I found the article poorly written" would likely also be equally objectionable.


OK, let me rephrase: I find this a great discussion topic, and I personally find the article a bit hard to follow.


It's not an article. It's a sci-fi short story.



