As reported by Michelle Roberts (BBC News, http://www.bbc.co.uk, 26 Jan. 2016): ‘Doubts have been raised about whether England’s NHS out-of-hours helpline is able to identify serious illnesses in children, after a baby died of blood poisoning following a chest infection.
NHS 111 call handlers are not medically trained, and a report on the 2014 death of William Mead, from Cornwall, said he might have lived if they had realised the seriousness of his condition.
But it said that if a medic had taken the final phone call, instead of an NHS 111 adviser using a computer system, they probably would have realised William’s “cries as a child in distress” meant he needed urgent medical attention’.
A data mining algorithm is a set of calculations that creates a model of action from data. To create the model, the algorithm first analyzes the data you provide, looking for specific types of patterns or trends, then uses the results of this analysis to define the optimal course of action. The result is a decision tree that predicts an outcome and suggests remedies.
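In code, a decision tree of this kind is just a fixed sequence of branching questions. The sketch below is a toy illustration only: the questions, thresholds, and dispositions are hypothetical and are not taken from NHS Pathways or any real triage system.

```python
# Toy triage decision tree. All questions, thresholds, and outcomes here
# are invented for illustration; they do not reflect NHS Pathways.

def triage(answers):
    """Walk a fixed question tree and return a suggested disposition.

    `answers` is a dict of caller responses; missing keys are treated
    as 'no' / not reported.
    """
    # First branch: an immediately life-threatening symptom.
    if answers.get("breathing_difficulty"):
        return "999 ambulance"
    # Second branch: high fever, with a stricter rule for infants.
    if answers.get("temperature_c", 0) >= 39.0:
        if answers.get("age_months", 999) < 12:
            return "urgent GP assessment"
        return "GP appointment within 24 hours"
    # Nothing in the tree fired, so the tree defaults to self-care.
    return "self-care advice"

print(triage({"breathing_difficulty": False,
              "temperature_c": 39.5,
              "age_months": 3}))  # urgent GP assessment
```

The limitation the article describes is visible in the structure itself: any symptom not encoded as a branch, however alarming to a trained ear, falls through to the default answer.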
The 111 emergency service uses NHS Pathways as its decision-support software, which is also used by English ambulance services to assess 999 callers. The NHS system is managed by the Health and Social Care Information Centre, which acquired the decision-support software from AXA Assistance in 2000.
Charlotte Jones, the BMA’s GP lead on unscheduled care, says the reliance on algorithms is part of the problem: “They are not always applicable to the clinical setting or, if they are, they don’t allow for subtleties in symptoms, and symptoms don’t always fall neatly into boxes.
“So the computer algorithms that call handlers have to follow don’t allow handlers to move away from them when common sense or their own individual knowledge calls for it”.
Janette Turner, senior research fellow at the University of Sheffield, agrees that any kind of phone service is limited by the fact that it cannot diagnose.
“You need a clinician face-to-face to make a diagnosis, to look at people and do tests,” she says. “This is about assessing the level of urgency and the level of care. The question is how well algorithms can assess, compared with clinically trained staff.”
Lindsey Scott, director of nursing with NHS England in the South West, said: “Everyone involved in this report (into the death of William Mead) is determined to make sure lessons are learned from William’s death, so other families don’t have to go through the same trauma”.
Does that mean replacing cheap staff and a decision-making machine with experienced and well-qualified nurses who could have red-flagged William’s sepsis?
Of course not, it just means tweaking the machine.
There were other failings before William’s condition was fed into a machine (the out-of-hours GP service had not had access to William’s primary care records), but this young life might not have been lost if an experienced and well-qualified nurse had been at the other end of the phone.
They cut costs in order to privatise, thus offering a tasty morsel to their corporate masters.
lenin nightingale 2016
The process of passing responsibility to ‘decision making trees’ is set to spread within the NHS, as this piece of propaganda lays out: ‘The National Patient Safety Agency has developed the Incident Decision Tree to help National Health Service (NHS) managers in the United Kingdom determine a fair and consistent course of action toward staff involved in patient safety incidents. Research shows that systems failures are the root cause of the majority of safety incidents.
The Incident Decision Tree supports the aim of creating an open culture (my arse), where employees feel able to report patient safety incidents without undue fear of the consequences. The tool comprises an algorithm with accompanying guidelines and poses a series of structured questions to help managers decide whether suspension is essential or whether alternatives might be feasible. The approach does not seek to diminish health care professionals’ individual accountability, but encourages key decision makers to consider systems and organizational issues in the management of error. Initial findings show the Incident Decision Tree to be robust and adaptable for use in a range of health care environments and across all professional groups‘ (www.ahrq.gov).
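Stripped of the management language, a tool like this is a checklist walked in a fixed order, where the first ‘yes’ decides the outcome. The sketch below is hypothetical: the question names and outcomes are invented for illustration and are not the National Patient Safety Agency’s actual wording.

```python
# Hypothetical sketch of a structured-question tool in the style of the
# Incident Decision Tree. Questions and outcomes are illustrative only.

# Each structured question is paired with the outcome it triggers.
# The questions are asked in order; the first 'yes' answer decides.
QUESTIONS = [
    ("deliberate_harm_intended", "refer for disciplinary action"),
    ("ill_health_contributed", "occupational health referral"),
    ("protocol_knowingly_breached", "consider suspension"),
]

def incident_outcome(facts):
    """Return the recommended action for the facts as reported.

    `facts` is a dict of yes/no answers supplied by whoever reports
    the incident; missing keys count as 'no'.
    """
    for key, outcome in QUESTIONS:
        if facts.get(key):
            return outcome
    # No individual-fault question fired: treat as a systems failure.
    return "systems failure: no individual action"
```

Note that the function never checks whether the `facts` it is given are true; the output is only as honest as the answers typed in, which is exactly the weakness discussed below.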
There you have it, dearhearts. Nurses and doctors will be reported to the NMC and BMA if an Incident Decision Tree recommends this! Thus, all apparent bias is removed from this process. This is not so, of course, as anyone with half a brain knows that the decision you get out relies on the data you put in, and fictitious data can come from lying colleagues and vindictive managers. Imagine it, some drone of the NMC reads out the charges against you: “Incident Decision Tree, model xcd 12578, charges you with ….”.
Will ‘employees feel able to report patient safety incidents without undue fear’?
You cannot throw yourself on the mercy of an Incident Decision Tree, because the vindictive manager will know you have done this, and ‘new’ evidence will be dredged up from imagination, so that the ‘Tree of Justice’ comes out with the decision they want.
OUR MASTERS ARE CONTEMPTUOUS OF EVERYONE, INCLUDING THEIR NHS MANAGEMENT UNDERLINGS.
THE ONLY SOLUTION IS TO REPLACE OUR MASTERS.
lenin nightingale 2016