Submitted via IRC for Bytram
In the introduction to her new book, Hannah Fry points out something interesting about the phrase "Hello World." It's never been quite clear, she says, whether the phrase—which is frequently the entire output of a student's first computer program—is supposed to be attributed to the program, awakening for the first time, or to the programmer, announcing their triumphant first creation.
Perhaps for this reason, "Hello World" calls to mind a dialogue between human and machine, one which has never been more relevant than it is today. Her book, called Hello World, published in September, walks us through a rapidly computerizing world. Fry is both optimistic and excited—along with her Ph.D. students at University College London, she has worked on many algorithms herself—and cautious. In conversation and in her book, she issues a call to arms: We need to make algorithms transparent, regulated, and forgiving of the flawed creatures that converse with them.
I reached her by telephone while she was on a book tour in New York City.
(Score: 3, Interesting) by splodus on Monday November 19 2018, @01:36PM (7 children)
I’m sure people are gonna hate this, cos it’s REGULATION!
However, I think she’s got it dead right with her drugs and snake-oil analogy…
Now of course you can take the line that ‘people should be able to mix St John’s Wort with glucose and claim it cures cancer – and the free market will put them out of business!’ But we know that doesn’t happen in reality – people still buy homeopathic ‘remedies’ even when they know for a fact they are just buying lactose pills and nothing else. Because ‘water has a memory...’ etc – some people will believe anything if the Government does not regulate against bogus claims.
It’s going to slow down innovation, some innovations won’t happen cos the process will be too onerous. It risks legitimising software patents or some similar new intellectual property legislation. There will be a can of worms over the definition of ‘algorithm’ at least.
I’m sure most of us here remember the legal challenge to software in speed-gun devices? How do we know the speed-gun is correct if we can’t see the code?
She uses the example of court-room AI to decide on bail cases. It’s only a matter of time before companies are selling expert systems that claim to make sentencing ‘objective’ - and ultimately assist juries in deciding guilt. Do we want those systems to be secret? Free from regulation?
It’s easy to hand-wave away concerns about Facebook influencing public opinion – but if a totally unregulated, opaque ‘algorithm’ is deciding whether or not you (or your son) is guilty of rape? Who will be comfortable with that?
And I think that is a good example – because there is tremendous pressure right now to ‘increase the number of guilty verdicts in rape cases’. A tweak of the threshold parameters in the algorithm will deliver that – but who will oversee the tweaking?
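To make that point concrete, here is a toy sketch of how a single threshold parameter controls outcome rates. All the scores and threshold values below are invented for illustration; no real court system or vendor product is being modelled.

```python
# Hypothetical sketch: one threshold parameter decides how many cases get
# flagged. The scores and cut-offs are invented for illustration only.

def flag_cases(risk_scores, threshold):
    """Return the cases whose model score meets or exceeds the threshold."""
    return [s for s in risk_scores if s >= threshold]

# Invented model outputs for ten cases, on a 0-1 scale.
scores = [0.35, 0.41, 0.48, 0.52, 0.55, 0.61, 0.66, 0.72, 0.80, 0.91]

# At a threshold of 0.7, three of the ten cases are flagged...
print(len(flag_cases(scores, 0.7)))  # 3
# ...but quietly nudging it down to 0.5 flags seven, with no change
# whatsoever to the underlying evidence.
print(len(flag_cases(scores, 0.5)))  # 7
```

The point being: the ‘tweak’ is a one-character change to a config value, invisible from the outside unless someone is allowed to inspect it.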
It’s going to be fraught, of course – but we’re the people who have an inkling of the problems AI and Expert Systems might suffer. Surely we’re the ones who need to consider this issue? ‘The Public’ trust computers – they believe they are entirely objective and immune to bias!
I can’t help thinking that if we don’t get this regulated right now, it’s going to be decades before the worst consequences of flawed systems gain any recognition.
There’s a hell of a lot of damage could be done in the meantime, even if the free market sorts it all out in the end...
(Score: 3, Insightful) by khallow on Monday November 19 2018, @02:30PM (2 children)
What makes you think such programs are free from regulation now? What exactly would be made non-secret by an FDA-style organization that would be useful?
I hate this not because it's regulation, but because it's particularly stupid regulation. It'll be great for slowing things down, derailing productive economies, creating secure oligopolies (like Big Pharma), and just being yet another hoop of meaningless idiocy that businesses would have to jump through in the course of trying to function. There's no problem to address, and we already have a strong indication that the real world FDA works awfully, with billions of dollars gambled away on various chemicals and medical procedures, often eventually rationalized only by some dubious p-hacking.
aristarchus complained in his recent journal [soylentnews.org] about "sociopaths" opposed to ethics. This is really how you create such opposition. Here, we have an ethics proposal that is great for controlling various parties, but useless for fixing actual ethical problems, while committing considerable harm in the process.
(Score: 2) by splodus on Monday November 19 2018, @05:17PM (1 child)
Thanks for that.
You are quite right – I’ve assumed there’s no regulation just cos I’m not aware of it, and she hasn’t mentioned any. She works in the field, so I’m just guessing she would have said if there were... If there is regulation that you’re aware of, I’d be grateful if you’d link to some info?
I don’t think we can pronounce her thesis as ‘stupid’ till we know more – not sure I can be arsed to read her book, but it is something that interests me, so maybe I’ll find the time! (I’m hoping there’s more detail than just ‘we need an FDA!’ - but also, highlighting a problem doesn’t necessarily require a solution to make it legitimate, surely?)
I absolutely agree with you that regulation is going to cause its own problems – and I also agree that big pharma has benefited from the FDA’s systems at the expense of small operators and innovation in general…
Would you favour leaving things as they stand then? Leave it to the Free Market?
My default position is to avoid regulation whenever possible, so I’ve got a lot of sympathy with your view. I certainly wouldn’t want to see a situation where no one could write software without jumping through a load of legislative hoops!
There’s a danger of ‘something must be done!!!’ It’s a recipe for unintended consequences. I can’t say existing laws regulating medicines have definitely had a net benefit; maybe we’d have a cure for cancer now if it wasn’t for the FDA?
At the very least this must be a debate worth having?
I guess my fear is that the vested interests will object, of course – and they are powerful indeed. If the tech community doesn’t speak out about the dangers of relying on AI, then who will?
The media is making money off this tech – the big money is consolidating its position with it – so if the Courts become dependent on it, is there any avenue left to challenge the outcomes, in principle?
(And sure, you could argue that ‘people won’t stand for flawed judgements! They will vote out those who support it! Democracy!' But here in the UK – we are about to jump off a cliff as a direct result of the smart use of opaque algorithms to sway public opinion...)
(Score: 1) by khallow on Tuesday November 20 2018, @01:39PM
For example, there's plenty of cases where technologies or particular practitioners of technology have been ruled out as court evidence. So for the algorithm that's supposed to be used for sentencing? Show bias of the illegal sort (such as against gender, religious beliefs, etc in the US), and you have the basis for overturning every bit of sentencing done with that algorithm.
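As a toy illustration of what ‘showing bias’ might look like in practice: a litigant could compare flag rates across protected groups. Everything below is invented – the records, the group labels, and the disparity – purely to show the shape of the argument, not any real sentencing tool.

```python
# Hypothetical sketch: comparing how often an algorithm flags cases from
# two groups. All records and group labels are invented for illustration.

from collections import defaultdict

def flag_rate_by_group(records):
    """records: iterable of (group, was_flagged) pairs.
    Returns the fraction of cases flagged in each group."""
    totals = defaultdict(int)
    flagged = defaultdict(int)
    for group, was_flagged in records:
        totals[group] += 1
        if was_flagged:
            flagged[group] += 1
    return {g: flagged[g] / totals[g] for g in totals}

# Invented outcomes: group A is flagged three times as often as group B.
records = ([("A", True)] * 6 + [("A", False)] * 4 +
           [("B", True)] * 2 + [("B", False)] * 8)
print(flag_rate_by_group(records))  # {'A': 0.6, 'B': 0.2}
```

A disparity like that, on its own, wouldn’t prove illegal bias – but it’s the sort of statistic that gets such evidence ruled out in court.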
Scratch the surface of almost any human activity and someone is regulating it. You just need to look.
It still needs to be a problem in the first place. The key flaw is simply that an algorithm is not an action. When you regulate algorithms, you aren't actually regulating the problem behavior. And as I noted earlier, there's already regulation (and means to implement more regulation should that become necessary) that don't require any sort of specialized regulatory system. People have been implementing bureaucratic algorithms, for example, for thousands of years.
Further, most of the problems mentioned would be problems no matter what the algorithm was – such as the project that siphoned off a couple of million UK residents' health data.
What AI? It's not what we have now. And we don't know enough, in the absence of credible AI, to decide what aspects need regulation.
(Score: 0) by Anonymous Coward on Monday November 19 2018, @05:03PM (3 children)
you don't have to set up a parasitic, unconstitutional bureaucracy, ffs. just make a law that says all government purchased software must be Free Software. done.
(Score: 2) by splodus on Monday November 19 2018, @05:51PM (1 child)
I wouldn’t object to that – but is it realistic? Would a law like that ever get passed in the US? I don’t think it could happen in the UK…?
(Score: 0) by Anonymous Coward on Tuesday November 20 2018, @04:09AM
Not until the working class seizes political power and gains democratic control over the major industries, including software. Proprietary software would fade away after we open source all of the major software companies' code.
One would also hope such a system would prevent the kind of software developer only possible through private sponsorship (*coughpoetteringcough* pardon me, just some symptoms) from ever entering the field or at least prevent him from gaining any sort of influence over anything important.
(Score: 2) by deimtee on Tuesday November 20 2018, @02:06AM
You would never get that, but you might be able to argue that all software purchased by the government must include a full copy of the source code and build environment. You are not prohibiting anything, just ensuring that it is auditable and maintainable.
You would need to come up with a title that acronyms to SECURE or SAFE or PUPPIES or something.
PUPPIES would be good, you could ask opponents "Why do you hate puppies, are you some sort of monster?".