I've remarked in various places that the Labour Party is currently in such disarray that if Theresa May kicked off a general election campaign with an hour of drowning kittens on live television she'd still win by 100 seats, or words to that effect. It's basically Leo in The West Wing:
"To sweep all fifty states, the President would only need to do two things-- blow the Sultan's brains out in Times Square, then walk across the street to Nathan's and buy a hot dog."
Which is all good knockabout politics, although I don't hear anyone, _anyone_, whether in the pub or in a CLP meeting, who thinks Labour winning in 2020 would require anything less than a multi-dimensional miracle.
Unfortunately it does bring up one grim thought to depress me (and I've stopped listening to Today, that's how bad it is). At the moment, there is little the Tories need fear with regard to losing an election, and on the current trajectory they would win with an increased majority, too. Abolish all free education past the age of 11? Fifty quid to see a GP? Declare war on Switzerland and make the eating of Toblerone an act of treason? Whatever: they still win in 2020. There is almost no policy, no matter how toxic, that the Tories could enact which gets the current Labour front bench into office. All the Tories have to do is kick back, chill their beans, and weigh the ballot papers. Labour need a manifesto which challenges UKIP to the right in the north and the SNP to the left in Scotland without alienating London, and even were the party functional and led by a proven election winner that is almost impossible. The more likely dog's breakfast in the manner of 1983 just means a massive defeat followed by a generation of in-fighting while the Tories celebrate by roasting poor people over an open fire.
Suppose, just suppose, that instead of seeking the glory of a 150 seat landslide, May decided to double down and run on a manifesto which wins by 50 seats but makes her the Thatcherites' eternal heroine by giving the party faithful as much as they can possibly have, consistent with winning an election. Not just repeal the Human Rights Act, but tear up PACE, Freedom of Information and the Data Protection Act ("red tape", "stopping the police doing their job"). Not just index-linking of university fees, but uncapped fees, and while we're at it not only grammar schools in every city but post-16 education chargeable via loans. And just for shits and giggles criminalise abortion, bring back workhouses and repeal the Discrimination Act. Whatever: a scorched-earth, salted-fields, roll-the-country-back-to-before-the-Great-Reform-Act extinction burst of atavism.
With a manifesto like that coming from the Tories, what would Labour do? Lose by fifty seats, that's what. Grim, isn't it?
On 15 Apr 2016, at 01:00, Ian Batten <xxx> wrote:
If anyone is keen enough to be running their own VPN server for Apple clients
More detailed examination with coffee in my hand (hey, I teach two lectures on IPSec and IKE, so this is _real_ _work_) reveals that on the down-low, Apple have re-written the entire opening phase of their VPN software and released it on two platforms over the past couple of weeks.
Historically, the Apple L2TP-over-IPSec implementation was as brittle as thin glass. The recommended deployment was talking to an Apple “Server” on OSX, but if you wanted to roll your own, it was very difficult to end up with an IKE configuration which would work with the Apple clients and also work with anything else. In essence, you had to configure the server with exactly the algorithms used at each phase by Apple, and none others: if you so much as mentioned an algorithm the clients didn’t support, the whole thing collapsed. I don’t have anything other than Apple kit in my mobile VPN estate so this didn’t matter to me, but I gather from former colleagues that using the Apple VPN client and the Microsoft VPN client into the same server is the best tool in your Cisco salesman’s box to convince you to just buy the end-to-end Cisco solution. Which Apple kind-of admitted by shipping the Cisco VPN client, branded, as a standard part of iOS (I think I’m right in saying that it’s the only piece of iOS as installed on a new device which has anyone else’s branding on).
The new stuff is completely different. You can turn on all the algorithms you like, and the Apple clients (a) in main mode, negotiate a sensible mutual combination of algorithms and use those for the rest of the exchange and (b) more impressively, in aggressive mode (where the two ends need to know in advance what algorithms are in use, as there’s no “what hash and encryption do you fancy?” phase) step through a sequence of proposals to try to find one that works: that’s not fast, but at least it works. So you can turn on the offer of algorithms that Apple don’t support yet (large DH groups, EC crypto, SHA512, that sort of thing) and leave them there waiting for the clients to catch up, and for use by more capable clients.
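The aggressive-mode behaviour amounts to a retry loop over a preference-ordered list of proposals. A minimal sketch of that logic, where the proposal list and the `try_proposal` callback are invented for illustration (this is not Apple's actual code):

```python
# Hypothetical sketch of an aggressive-mode proposal walk: try each
# (cipher, hash, DH group) tuple in turn until the far end accepts one.
# The proposals and the callback protocol are made up for illustration.
PROPOSALS = [
    ("aes256", "sha256", "modp2048"),
    ("aes128", "sha256", "modp1536"),
    ("aes128", "sha1", "modp1024"),
    ("3des", "sha1", "modp1024"),
]

def negotiate(try_proposal):
    """Return the first proposal the server accepts.

    try_proposal stands in for one complete aggressive-mode exchange;
    it returns True if the server accepted the offered proposal.
    """
    for proposal in PROPOSALS:
        if try_proposal(proposal):
            return proposal
    raise ConnectionError("no mutually acceptable proposal")
```

Which is why it's slow: each rejected proposal costs a full exchange before the next one is tried.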
There are some other changes which aren’t as easy to analyse. The negotiation of PFS has definitely changed: it used to be that if you asked for it on the server, the client dropped the connection; now you can have it enabled with a group selected. But it’s not obvious whether it’s actually respected: since you can ask for crazy groups (6144 bits) or for things that don’t appear to be supported anywhere else in the Apple client (EC) and it still “works”, the implication is that the client is just doing a better (or worse, depending on your view) job of negotiation and is not using PFS even though it’s offered. I’m not sure how to check this. The packet sequence is the same, and although the contents are different they are encrypted: I’d need to find a way to get hold of the Phase 1 keys and use them to decrypt the Phase 2 packets in order to check. My gut feel is that Apple haven’t added PFS, they’ve just fixed the negotiation so it’s rejected cleanly.
It’s interesting that there’s a paper which raises concerns about widely deployed IPSec configurations, and within six months Apple are fielding a complete suite of changes (they’ve made the same changes to the server, but I’m not using that code) to close the whole issue down. They are playing hardball with the US government.
ian
One day, I am going to get around to writing my magnum opus on the mistaken beliefs that some security people have about threat actors. But today, I’m going to consider one of them: the terrorist with perfect operational security.
There are a whole range of arguments which assume that there is no point in society adopting mechanisms to attempt to defend itself, because our enemies have perfect operational security. There is no point in intercepting communications because they all use encryption, with both effective algorithms and flawless security around key management (a feat few national agencies have managed). There is no point in even trying traffic analysis because they all use Tor with flawless, error-free precision (even though there is ample reason to believe this is very difficult). There is no point in using ANPR because all criminals drive stolen cars with false plates (although this weekend’s Paris attacks used hire cars). And so on.
There are good reasons to be wary of security service claims as to the efficacy of their boxes of tricks, and certainly we need to balance civil liberties and security agendas. We need to do this all the more in the aftermath of appalling events such as happened in Paris this weekend. But we need good arguments. Arguments which presume that terrorists are criminal masterminds with not only access to, but also the skills and discipline to use effectively, top-quality crypto, and that therefore interception is pointless, are just wrong. Terrorists have many things to be doing while planning an outrage, and they clearly are not communicating using perfectly-used one time pads.
There's a fairly well documented, and rather annoying, bug in Emacs 24.3 on OSX 10.9. Under some circumstances it either consumes a lot of memory and CPU and starts to run very slowly, or it causes distnoted to do likewise. It happens particularly after sleep and wake-up, and if distnoted is the victim it's usually enough to get the fans to come on and stay on. It happens to me roughly once a week. The bug is present in the binaries available from http://emacsformacosx.com.
There is a patch:
It's apparently incorporated in the 24.4 pre-tests and nightlies, if you like to live dangerously.
I've applied the patch to a set of clean 24.3 sources and compiled it on 10.9.4 with the latest version of XCode, to get the fix without any other changes.
If anyone needs the binaries:
http://www.batten.eu.org/~igb/emacs-24.3-leakpatch-mavericks.tar.xz
SHA256 hash [1] is f94c2f9dbf40ff42dd8ee41ce7fab4e1f5208c2178aa99ab8a8344560e49d41c
Just untar it and move the resulting Emacs.app directory to /Applications or wherever you keep such things. The OSX tar command now automagically handles .xz.
Aficionados of the ludicrous bloat of modern software will have their prejudices confirmed upon learning that, even using a good compression algorithm, the installation kit (ie a tar of /Applications/Emacs.app) is 100MB.
ian
[1] openssl dgst -sha256 -hex < emacs-24.3-leakpatch-mavericks.tar.xz
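If you'd rather not use openssl, the same check can be done with Python's standard hashlib; this is a generic sketch for verifying any downloaded file, not something specific to this tarball:

```python
import hashlib

def sha256_hex(path, bufsize=1 << 20):
    """Return the SHA-256 digest of a file as a lowercase hex string."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(bufsize), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published hash before trusting the binaries, e.g.:
# expected = "f94c2f9dbf40ff42dd8ee41ce7fab4e1f5208c2178aa99ab8a8344560e49d41c"
# assert sha256_hex("emacs-24.3-leakpatch-mavericks.tar.xz") == expected
```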
The allegations relating to schools in Birmingham raise serious questions about the quality of school governance and oversight arrangements in the maintained sector, not just the supplementary schools that would be signatories to this Code of Practice. How did it come to pass, for example, that one of the governors at Park View was the chairman of the education committee of the Muslim Council of Britain? Is it true that Birmingham City Council was warned about these allegations in 2008? Is it true that the Department for Education was warned in 2010? If so, why did nobody act?
/* And now, just to make sure things don't run too fast..
 * On a 60 MHz Pentium this takes 34 msec, so you would
 * need 30 seconds to build a 1000 entry dictionary... */
Part of the reason why Caredata has become such a hot topic is the revelation that patient-level data was sold to actuaries, for a study into which factors are meaningful when assessing premiums. And that when this was revealed, no-one appears quite clear who approved it, and under what rules. There is now some significant debate as to whether this sale was wrong, whether it was permissible under the rules at the time, whether it would be permissible now (ie, under the Caredata rules as planned for the now-delayed spring 2014 launch) and whether it will be permissible under the hypothetical rules Jeremy Hunt is proposing in the aftermath of Friday's announcement of new legislation.
The problem seems to be a governance structure that is so complex that actual responsibility and accountability have been diffused to the point of invisibility. There is a complex mesh of advisory groups, boards and executives --- has anyone seen a diagram? --- but, when an actual case is challenged, no-one appears able to point to who took the decision, and under what rules. Even if the people who agreed the release to the IFoA can be identified, it's not at all clear what rules they were operating under and whether those rules were followed. The failure of the HSCIC to produce a code of practice exacerbates this.
The governance should have three clear components.
First, there should be a set of rules setting down the purposes for which data can be released, and in what form. The rules are owned by a group of people, with a named chairman, who sign off successive releases of the document. If the rules are found to be inadequate, either because they do not cover some case or because public opinion challenges the contents, that group of people are tasked with re-writing it. Those people are appointed by a minister who is democratically accountable to parliament (or, more probably, a select committee); it is likely that the process and policy for these appointments would be the subject of secondary legislation or the schedule to primary legislation. This is strategy.
Secondly, there should be another group of people who consider requests for access and evaluate them in the context of the rules. These decisions should be uncontentious, and if there is disagreement between reasonably informed people then that is more likely to reflect a problem with the rules than anything else. These people will probably need to be employees of the agency handling the data as the decisions will need to be made relatively quickly, but as they wield relatively little power this is not of itself dangerous. This is tactics.
And finally, there needs to be oversight that the decisions are being made correctly and that the process is fit for purpose. This could be done by a select committee directly, but is more commonly done by appointing a retired judge or similar to act as a regulator. This person does not make decisions or policy, but confirms that the process is being followed, samples decisions to check in detail, and reports annually. This is audit. For all that the legislation has many problems and there has been a lot of dispute, the role of the Interception of Communications Commissioner is a good model.
One committee, named and appointed by a minister who is democratically accountable, sets detailed policy. A second committee executes it. A commissioner checks the process is being followed.
That way, when things go wrong, people can be held to account. Democratically.
One of the most common memes used in support of mass health data projects is that the data supports important research. Whether it is disease causation, effective treatment, epidemiology or drug side-effects, researchers need large amounts of data, so your data matters.
But from the perspective of a patient, ie you, your data doesn't matter.
Your data would only matter if a study which looked at the whole dataset would have a different outcome with or without your participation. But in a dataset covering 47m people (the size of the Hospital Episode Statistics database) or around 53m people (the number of people registered with general practitioners in England, assuming everyone is), the chances of your individual record being anything other than statistical noise are infinitesimal. For that to be the case, you would have to be very unlike the rest of the dataset, and mass population studies rarely identify things that affect only one person. So there will always be sufficient people who look like you to fill your place in the analysis. And of course, the chances of a medical breakthrough hinging on your personal data, _and_ being related to a condition you have, _and_ producing a change in treatment quickly enough to benefit you are similarly small. An infinitesimal chance of a very small benefit has a net present value of zero, for practical purposes.
On the other hand, the risk of the data being leaked, re-identified or otherwise mis-used is greater than zero. We don't know how much greater, and without a code of practice we can't calculate it. But if, for example, your health record in which you talk to your GP about your depression were leaked to your ex-spouse in a contested custody battle, the effect would be immediately harmful. That's an immediate risk: a small chance times a very large disbenefit has a net present value considerably greater than zero.
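To put rough numbers on the two sides of that calculation: every figure below is an invented assumption, chosen purely to show the shape of the arithmetic, not a measured probability or a real valuation.

```python
# Back-of-the-envelope expected values. Every number here is an
# invented assumption; the point is the shape of the calculation.
P_RECORD_CHANGES_OUTCOME = 1 / 47_000_000  # one record among 47m (HES)
BENEFIT_IF_IT_DOES = 1_000.0               # personal benefit, arbitrary units

P_HARMFUL_LEAK = 1 / 10_000                # guessed leak/re-identification risk
HARM_IF_LEAKED = 100_000.0                 # e.g. the custody-battle scenario

expected_benefit = P_RECORD_CHANGES_OUTCOME * BENEFIT_IF_IT_DOES
expected_harm = P_HARMFUL_LEAK * HARM_IF_LEAKED

# The benefit side comes out around 0.00002 units; the harm side around
# 10 units. On these made-up numbers, participation is a hopelessly
# losing bet for the individual, whatever the aggregate social gain.
```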
Now the problem, of course, is that if everyone thinks like this, there is no data. But they won't; Germany's scheme is opt-in, and yet has reasonable numbers of participants. But shouting yet more loudly about potential benefits doesn't work, because those have already been written down to zero. What needs to happen is calm, rational discussion about why people are over-estimating the potential harm such a project can cause. And without transparent, accountable organisations handling the data, that will never happen.
ian
One issue that has arisen in the debate about the release by either the HSCIC or its predecessor NHSIC is the joining of the HES hospital data against Mosaic demographic data.
This would have been done by NHSIC. And once they had made the basic decision to release the data in the first place (a separate discussion) this was the _right_ thing to do, and it would be the correct way to do a similar task for a less controversial research project.
Mosaic data maps very small areas to demographic tags. Let's assume that the data goes down to full postcode level (I believe that in some cases it's slightly less granular than that).
The Mosaic data would look like this:
X12 3YZ Demographic Description 1
X12 3YY Demographic Description 2
X12 3YX Demographic Description 1
X12 3YW Demographic Description 1
There are a lot of full postcodes in the country (I'm guessing, but around 2m --- 20 million houses, ten per code). There are a few hundred Mosaic descriptions, if that.
So the process will have been something like this:
IFoA take the Mosaic data and, with Experian's agreement, pass it to the NHSIC for this specific purpose (this is a standard thing to do with this sort of data).
NHSIC join the HES data against the Mosaic data using the postcode as the key, so that each HES record is extended by a demographic description.
NHSIC then truncate the postcodes to the agreed length (probably just the initial letters like "B" or "SW" would be enough) and hand over the records. All that IFoA see against each patient is therefore a very low resolution postcode, which will match an entire city or county, plus a demographic tag, which will be shared amongst tens of thousands of postcodes.
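The join-then-truncate step is simple enough to sketch. All the data below is made up, and the helper names are mine for illustration, not anything NHSIC actually ran:

```python
# Made-up illustration of the join-then-truncate release process.
MOSAIC = {
    "X12 3YZ": "Demographic Description 1",
    "X12 3YY": "Demographic Description 2",
}

HES = [
    {"patient": "A", "postcode": "X12 3YZ", "episode": "hip replacement"},
    {"patient": "B", "postcode": "X12 3YY", "episode": "appendectomy"},
]

def area_prefix(postcode):
    """Leading letters of a postcode: 'X12 3YZ' -> 'X', 'SW1A 1AA' -> 'SW'."""
    letters = []
    for c in postcode:
        if not c.isalpha():
            break
        letters.append(c)
    return "".join(letters)

def join_and_truncate(hes_rows, mosaic_map):
    """Tag each HES row with its Mosaic description using the full
    postcode as the key, then cut the postcode down to the
    low-resolution area prefix before release."""
    released = []
    for row in hes_rows:
        out = dict(row)
        out["demographic"] = mosaic_map.get(row["postcode"], "unknown")
        out["postcode"] = area_prefix(row["postcode"])
        released.append(out)
    return released
```

Each released record then carries only an area like "X" or "SW", matching an entire city or county, plus a demographic tag shared among tens of thousands of postcodes.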
The basic agreement to release data to the IFoA is something that there is a lot of dispute about, and I think it was a very, very bad thing. But once you've made the decision to do it, what was done with Mosaic tags was the right thing: the IFoA got the data they could use, and the level of resolution in it was appropriately reduced.
ian
From: MAILER-DAEMON@nhs-pd1e-esg001.ad1.nhs.net (Mail Delivery System)
Subject: Undelivered Mail Returned to Sender
Date: Mon 24 Feb 2014 07:34:54 GMT
This is the mail system at host nhs-pd1e-esg001.ad1.nhs.net.
I'm sorry to have to inform you that your message could not
be delivered to one or more recipients. It's attached below.
For further assistance, please send mail to <postmaster>
If you do so, please include this problem report. You can
delete your own text from the attached returned message.
The mail system
<england.cdo@nhs.uk>: mail for nhs.uk loops back to myself
Reporting-MTA: dns; nhs-pd1e-esg001.ad1.nhs.net
X-Postfix-Queue-ID: 8195544916D
X-Postfix-Sender: rfc822; igb@batten.eu.org
Arrival-Date: Mon, 24 Feb 2014 07:34:53 +0000 (GMT)
Final-Recipient: rfc822; england.cdo@nhs.uk
Original-Recipient: rfc822;england.cdo@nhs.uk
Action: failed
Status: 5.4.6
Diagnostic-Code: X-Postfix; mail for nhs.uk loops back to myself

From: Ian Batten <igb@batten.eu.org>
Subject: F.A.O. Information Governance Compliance Team
Date: Mon 24 Feb 2014 07:34:50 GMT
The Staple Inn Actuarial Society processed a large volume of Hospital Episode Statistics, which they also joined to Experian credit reference data.
Please supply:
* The submission made by SIAS in support of obtaining this data. This may take the form of a Privacy Impact Assessment, a Research Proposal, or some other document.
* The minutes of meetings at which this proposal was discussed.
* Details of the financial settlement between HSCIC and SIAS.
* Details of any agreement between HSCIC and SIAS which permits the combining of HES data with Experian data
It has been clearly stated by Geraint Lewis, NHS Chief Data Officer, that insurance companies are not able to purchase HSCIC data for commercial use, and that HSCIC does not sell data on a commercial basis, it only recovers costs. I therefore give you advance notice that any refusal on the basis of "commercial confidence" will be the immediate subject of an appeal to the ICO.
Ian Batten
XXX
Birmingham
XXX
Year on year, students’ test marks get better and better. By the time they leave school, they are ahead of students in other schools and are well prepared for their next stage of education, employment or training.
Too many students fail to make the progress expected of them in English and mathematics across Key Stage 3. When they begin their GCSE studies, they have too much ground to make up and, as a result, GCSE results for the last two years have been much lower than expected, given the students’ starting points.
Teaching is mostly good or outstanding which is why students make such good progress.
Too much teaching is ineffective and not enough is good, leading to students’ inadequate achievement.
The headteacher and senior staff know what the issues are for the school and quickly sort them out.
Leadership and management are inadequate because the school is not improving quickly enough. There has not been a sufficiently cohesive drive by leaders and staff to raise standards. Leaders have an overly positive view of the quality of teaching and the school’s performance. They do not analyse the performance of key groups of students sharply enough to help them plan effectively for improvement.
Behaviour is good in lessons and around the school. Students have good manners and respect adults. They enjoy and feel safe at school and their attendance has improved.
• Some students do not feel safe around the school grounds.

November 2012:
Students with a visual or hearing impairment or those who have other special educational needs also make good progress in The Hub, due to the high quality of support they receive from teachers and specialist support workers.
• Disabled students and those who have special educational needs, including students in the Hub, make insufficient progress. Although teachers clearly identify these pupils and receive good information about their circumstances, not all teachers adjust their teaching to meet their needs. The large majority of disabled pupils and those who have special educational needs in Year 7 last year, made little progress or went backwards in English.
• The governing body is aware of the quality of teaching and the strategies the school is using to improve its quality. Governors are aware of how well students are doing, including in comparison with students nationally. They regularly ask questions about teaching and attend some of the staff training sessions. As a result, they are aware of which staff are performing well and how performance management is being used to reward staff when they have made a difference to students’ outcomes. Governors themselves are well-trained through a planned programme and hold the headteacher to account for the performance of the school. They have ensured that the pupil premium is used effectively to help potentially disadvantaged students to do better.
• The governing body has not questioned the school’s leaders robustly enough about students’ achievement; the decline in performance has not been investigated and they have not checked on how well groups of students are doing. They have simply accepted information given to them by senior leaders and, as a result, they do not hold an accurate picture of the school’s effectiveness. They do not have a deep enough understanding about the quality of teaching across the school. Governors have not held school leaders to account for their actions and, as a result, have not sufficiently challenged them about needed improvements.
A colleague and I have an on-going conversation about how universities in the Russell Group are going to be affected by the 20% drop in the number of 18 year olds over the next ten years. Simplifying what has been a long-running debate, he essentially argues that any fall in numbers applying at the upper end of the current spectrum of institutions will be back-filled by people who might otherwise have applied to less selective institutions, while I argue that there are practical, cultural, academic and other reasons why people will continue to apply to post-94 universities (particularly urban ones) even if there are places going spare in the Russell Group, even if they go on to get A Levels which would get them a place in the RG institution. Time will tell, I suspect, but universities whose recruitment is entirely predicated on people living away from home --- most of the post-Robbins universities --- are going to feel the chill more than the traditional metropolitan Redbricks with a large "home" source of students.
Reading through a prospectus is not the same as coming to visit the University in person. ‘Invitation only’ applicant visit days are held from November to April when students who have applied to study at [RG] are invited to a VIP day where you can meet current
You are welcome to drop in on a class – just search for classes that you might enjoy. Or, thanks to Open Yale Courses, you can try out a Yale class online at anytime
This doesn't appear to be a hollow offer --- my elder looked up what lectures were happening on the day we intend to visit Yale, mailed the lecturer and within 35 minutes got an enthusiastic response, finishing
Have a safe trip, and I look forward to meeting you next week.
[[ Not theory, not technical, but if you're interested in policy... ]]
It's a sensible idea to provide a renewal schedule for photographic ID, to account for changes in appearance and to "time out" tokens which might have weak physical security. So passports, for example, force renewal every five years for children and young adults and every ten years for older adults. The photographs are reasonably up-to-date (although five year old children can travel on photographs of them as a baby, which might not be good news in custody disputes) and passports only have to withstand a maximum of ten years of attempted forgeries.

For example, recent EU passports have the passport number perforated through each page, presumably to defeat the swapping in of pages with visas, or swapping out of exclusion stamps. Older ones don't, but within ten years of the introduction of that security measure, all passports will have it. Attackers might choose to forge or modify an older passport, but they can't choose to manipulate a 1994 Blue British Passport with handwritten details and a photograph glued in, because even if they were able to modify the expiration date, the physical design itself has expired as well.

So when photographic driving licenses were introduced, there was a lot of noise about the photocard portion having a ten year renewal schedule, the same as passports. This way, the photograph would always be relatively recent, and at least no worse than a passport photograph. However, it was a lot of work issuing photographic driving licenses, as it required photographs to be endorsed and validated and scanned. Once the passport office started scanning photographs and signatures and printing them in passports, rather than physically attaching a real photograph and sending them out for the bearer to sign, the solution was obvious: link driving licenses to passports, so that the same photograph was used for both.
Provided you have a passport, you can apply for a driving license in the same name with the same photograph, modulo other proofs of shared identity. And you can do this irrespective of when the passport was issued.

Can you see the problem? I've just received my first photographic driving license, applied for using the "linked to a passport" process. My passport was issued in 2003. So my new driving license, valid until January 2023 (when I will be 58), features a photograph taken in 2003 (when I was 38). I can drive, perfectly legally, in 2023 with a _twenty_ year old photograph on my driving license.

So, why do we need to renew them every ten years? OK, the anti-forgery aspect of it is an issue. But most uses of driving licenses as driving licenses, rather than as ID cards, are checked online with the DVLC. Because you can't have a driving license until you're sixteen, the ability to forge an older, weaker driving license isn't of much use for the main use case of posing as being old enough to buy alcohol, and I seriously doubt that even the Level 1 security features in the license [1] are in reality checked by publicans.

ian

[1] UK Driving License Security Features

In 1986, you could bring up a network of half a dozen Sun workstations, each running a perfectly capable 4.3bsd-derived Unix which most people today would be perfectly happy to use in terms of functionality (the 15MHz 68020 might not be such fun) off a single 327MByte Fujitsu Super Eagle disk [1], with plenty of room left to do real work.
Today, the installation image for a Raspberry Pi is 1.8GBytes.

ians-macbook-air:Downloads igb$ ls -lh 2012-12-16-wheezy-raspbian.img
-rw-r--r--@ 1 igb staff 1.8G 16 Dec 18:52 2012-12-16-wheezy-raspbian.img
ians-macbook-air:Downloads igb$
And the update kit looks like about another 475MBytes, too.

remote: Counting objects: 21472, done.
remote: Compressing objects: 100% (7381/7381), done.
Receiving objects: 70% (14868/21219), 331.89 MiB | 32 KiB/s
[1] I found the securing straps from the pallet it was delivered on recently: I'd been using them to tie things down in the boot of one of the cars.
Having been watching the demolition of Bournville College, I hadn't noticed that Shenley Court was being demolished as well. I hadn't been past it in daylight for a few months, so by the time I noticed yesterday it was pretty far gone.
Want to restrict access to Facebook after 8pm? Decide to allow younger children to view fewer sites than their older siblings? Or want to stop access to certain sites altogether? Now you will be shown how to do it.

Read more: http://www.dailymail.co.uk/news/article-2250809/Victory-Mail-Children-WILL-protected-online-porn-Cameron-orders-sites-blocked-automatically.html