The Education Forum

Everything posted by Bill Fite

  1. My 2¢: If one is open to the possibility that Oswald was being framed, then it doesn't matter whether he has an alibi (let's say he was witnessed on the front steps or eating with coworkers at 12:30), because he's already on the hook for supplying the rifle (framed for the rifle purchase, or purchased it himself). Having Oswald accused as the shooter is then just gravy. The Castro-commie link is already there with the rifle purchase, moving the focus away from any and all non-Castro-commie conspiracies. The bullet fragment trail indicates a shot from the front - larger fragments travel farther. So the challenge in showing a conspiracy is to link the front and rear shots somehow, or to show more than 1 person involved in 1 or both shot sources.
  2. Pulled his shot so as not to hit Jackie? Or just missed? Don't forget the shooter from the back missed JFK, everybody else in the car, the car, and even the road.
  3. Always wondered about that too. Or a shot that missed and embedded itself in the grass across the street from the knoll.
  4. Hi Christian I've wondered about a process to do what you are doing since seeing Tom Wilson's interview. Would the process you are doing consist of these steps? The image we see is a blending of adjacent pixel values of different wavelengths of light - on the black - grey - white scale, for example. Then:
     1. Estimate the primary component wavelength, or a confounding component wavelength.
     2. Remove that component.
     3. View the resulting image to assess the results.
     4. Repeat the process to remove another candidate component's wavelength (with some pixel-by-pixel randomness added?).
     Or is that off base?
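A toy sketch of the loop I'm guessing at, in Python - pure speculation on my part: the "component" here is just the most common pixel intensity, standing in for a wavelength component, and the image and numbers are made up:

```python
from collections import Counter

def dominant_component(pixels):
    """Estimate the primary component as the most common intensity value."""
    return Counter(pixels).most_common(1)[0][0]

def remove_component(pixels, component, strength=0.5):
    """Attenuate one candidate component: subtract a fraction of its
    intensity from every pixel, clamping to the 0-255 greyscale range."""
    return [max(0, min(255, round(p - strength * component))) for p in pixels]

# Toy 1-D 'image': a flat grey background hiding a darker feature
image = [120] * 90 + [80] * 10
for step in range(3):          # estimate, remove, inspect, repeat
    c = dominant_component(image)
    image = remove_component(image, c)
    print(f"step {step}: removed component {c}, "
          f"pixel range {min(image)}-{max(image)}")
```

Obviously a real version would work on 2-D images and actual spectral estimates rather than raw intensities - this is just the shape of the iterate-estimate-remove-inspect loop I'm asking about.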
  5. IF the probability estimates are sound and extremely low, that proves beyond a reasonable doubt (to borrow a phrase) that something suspicious is going on in the witness deaths. Playing the devil's advocate here: I would guess the probability of a mafia boss being murdered is actually several orders of magnitude higher than for a person in the 1960s general population. It probably increases a couple more orders of magnitude if he is scheduled to meet with the Feds, no? Wouldn't the probabilities of murder for reporters who investigate criminal activities be much higher also? As for stealing notes - why would the criminal necessarily have to be looking for JFK notebooks and not notes on some other crime? I think you can make the anecdotal argument about specific deaths from now until forever and it won't convince many more people. If that would work, wouldn't it have closed the issue already? On the other hand, if the estimated probabilities of the number of witness deaths are extremely low, wouldn't that convince anyone other than an OJ jury member that something was going on? You asked how I would estimate probabilities for witness deaths and I answered that question.
  6. Hi W First thoughts: I think there are 2 probability estimates you would need. I might estimate them using simulation if all the data was readily available. Data needed:
     * witness list
     * witness sex and age
     * time frame to use
     * categories of cause of death of witnesses in that time frame
     * dates of death
     * mortality rates by sex and age in each of the categories of death, maybe just the mortality rate by murder alone
     * dates witnesses were being called to testify or give a statement
     So the first probability that I would look at is the number of expected deaths in each category vs. actual deaths in that category. I would simulate 100K instances with the following steps:
     1. Set count of more_than_observed_deaths = 0 and loop = 0
     2. Set simulated_deaths = 0
     3. For each witness, generate a uniform random number in the 0 to 1 range -- call this u
     4. For each witness, look up the appropriate mortality rate by sex and age -- call this p
     5. For each witness, if their u < p then simulated_deaths = simulated_deaths + 1
     6. If simulated_deaths >= observed deaths in the category, then more_than_observed_deaths = more_than_observed_deaths + 1
     7. loop = loop + 1; if loop = 100K STOP, else GOTO step 2
     After 100K simulations, the probability of getting at least the observed number of deaths in the category = more_than_observed_deaths / 100K.
     ************************
     For the second probability - witness death in some time period immediately before testifying (let's say 14 days) - I might do this:
     N = number of days between the JFKA and the scheduled investigation appearance
     p = 14 / N
     Then use the cumulative binomial probability distribution to get the probability of at least the observed number of witness deaths in that short time period. I think those estimates would work and hope that I've explained it well enough.
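The steps above, sketched in Python - note the mortality rates, witness counts, and time window below are placeholders I made up, not real data:

```python
import random
import math

def simulate_excess_deaths(mortality_rates, observed_deaths,
                           trials=100_000, seed=1):
    """Monte Carlo estimate of P(simulated deaths >= observed deaths).
    mortality_rates: one per-witness death probability for the category
    (in real use, looked up by each witness's sex and age)."""
    rng = random.Random(seed)
    more_than_observed = 0
    for _ in range(trials):
        # One simulated 'world': each witness dies if u < p
        simulated = sum(1 for p in mortality_rates if rng.random() < p)
        if simulated >= observed_deaths:
            more_than_observed += 1
    return more_than_observed / trials

def binomial_tail(observed_deaths, n_witnesses, p):
    """P(at least observed_deaths of n_witnesses die in the window),
    from the cumulative binomial distribution."""
    return sum(math.comb(n_witnesses, k) * p**k * (1 - p)**(n_witnesses - k)
               for k in range(observed_deaths, n_witnesses + 1))

# Illustrative numbers only:
rates = [0.002] * 50            # 50 witnesses, 0.2% category death rate each
print(simulate_excess_deaths(rates, observed_deaths=3))
p_window = 14 / 3650            # 14-day window out of roughly 10 years
print(binomial_tail(2, 50, p_window))
```

The simulation replaces the explicit loop counter with a `for` loop, but it's step-for-step the same procedure.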
  7. Hi Kevin - it's interesting to think about this, at least for me.
  8. my 2¢: The way to deal with uncertainty is to use probability and statistics to quantify it by looking at all the data available and not by picking a few examples (you might be accused of cherry-picking coincidental data). In Reclaiming Science - the JFK Conspiracy - Charnin uses a Poisson probability distribution to model the probabilities of unexpected deaths of witnesses / interviewees of the investigations. If the 1960-1970 era age-corrected population death rates by suicide / heart attack / murder / cancer / etc. are correct then the probabilities of getting n unexpected deaths from the sample population are a way to determine if the deaths were part of the natural life and death process or not. The book is available on Kindle. Otherwise, you are left with an argument similar to - well one person duplicated 2 hits in 3 shots in < 9 seconds so that's proof LHO (or whoever shot from the TSBD) did it. Just because something is possible, doesn't mean it happened.
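For what it's worth, the Poisson tail probability that kind of model rests on is easy to compute. A sketch with made-up numbers (the expected-deaths figure is illustrative, not taken from Charnin's book):

```python
import math

def poisson_tail(observed, expected):
    """P(X >= observed) for X ~ Poisson(expected):
    1 minus the CDF summed up to observed - 1."""
    cdf = sum(math.exp(-expected) * expected**k / math.factorial(k)
              for k in range(observed))
    return 1.0 - cdf

# Illustrative: if ~1.2 such deaths were expected in the period
# and 7 were observed, how surprising is that?
print(poisson_tail(7, 1.2))
```

The whole argument then hinges on whether the age-corrected expected rates feeding `expected` are right, which is exactly the point above.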
  9. Oh great - I have that book on Kindle & one of the same MSc degrees as the author. Now I'm gonnahavta look through it again - maybe even read it closely. I remember thinking it pretty thought provoking. I have always wondered about probability calculations and the JFKA - why they aren't used more - then I remember I did read Innumeracy. link to spreadsheet with all data, calculations, and results link to author's blog JFKA section
  10. Hi Pat If the data could be collected (by AI or web scraping or ...by hand) a straightforward approach would be to estimate a statistical model using multinomial logistic regression to estimate the probabilities for each of the targets of the research accepting it and publishing it.
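A minimal sketch of the multinomial-logit idea in plain Python - the outcomes, features, and coefficients here are entirely hypothetical (a fitted model, e.g. from collected data, would supply the real ones):

```python
import math

def softmax_probs(scores):
    """Turn per-outcome linear scores into multinomial-logit probabilities."""
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def linear_score(weights, features, bias):
    """One outcome's linear predictor: bias + w . x"""
    return bias + sum(w * x for w, x in zip(weights, features))

# Hypothetical outcomes and made-up fitted coefficients:
outcomes = ["accept_and_publish", "accept_no_publish", "reject"]
coefs = {
    "accept_and_publish": ([0.8, -0.3], 0.1),
    "accept_no_publish":  ([0.2,  0.1], 0.0),
    "reject":             ([-0.5, 0.4], 0.2),
}
features = [1.0, 0.5]    # e.g. source credibility, topic sensitivity (made up)
scores = [linear_score(w, features, b) for w, b in coefs.values()]
probs = softmax_probs(scores)
for name, p in zip(outcomes, probs):
    print(f"{name}: {p:.3f}")
```

In practice you'd estimate the coefficients with a statistics package rather than by hand; this just shows how the fitted model turns a target's features into a probability for each outcome.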
  11. It very well may end up that the AI in the future will be reflecting a whole bunch of previously generated AI garbage all over the net.
  12. I meant allowing him to run in the Dem primary and debating - isn't that what RFK Jr. wanted? Or am I misremembering that?
  13. Would this have been avoided if Biden had agreed to debate him?
  14. That's where I started both academically and professionally. Mostly using a somewhat dead language, PL/I, which seemed like a combination of COBOL for input/output and Fortran for computation. Hand the deck in, then go wait for an hour for the run to finish after those ahead of it in the stack. The poor guy who took the decks and ran them was no longer needed once we could submit jobs from the desktop PCs.
  15. As someone who had their area of study renamed in the Great Renaming around the turn of the millennium - by the 'Data Scientists', from a name the general population didn't understand, Operations Research Analyst, to a name that apparently refers to some guys and gals in the back of a pharmacy, Prescriptive Analytics Engineer - I find it interesting to see algorithms that existed before being labelled AI. For example, the facial recognition algorithms were well known decades before AI was in the news: I remember laughing at Pauley Perrette's character on NCIS recognising a suspect by searching YouTube videos in maybe 15 minutes. Anyway, I've seen AI go from simple rule-based systems, to providing solutions to complex planning problems in Prolog (how to re-assign airliners to gates when the schedule has been disrupted by weather cancellations and re-routings), to predicting the next set of words given previous sets of words. IMO, Large Language Models (AI now) are programs that have implemented memorization, which isn't intelligence. The Prolog example is closer to real intelligence. But that's just my opinion. That said, I think the applications that have been discussed would be very interesting; I just don't know if I would label them AI.
  16. Another question, actually 2: Is this the same Guinn who ran the NAA tests comparing the chemical composition of the MC bullets and refused to report the statistical confidence in his conclusions, which were later refuted with 95% confidence? And where does the 2.5-hour estimate come from - experiments or guessing? I don't know if the 2.5 hours is the upper limit or how it was arrived at, but Guinn seems to have been mistaken in the past.
  17. Correct me if I'm wrong, but the NAA was positive for the casts from the hands, though that could have been caused by handling printed materials among other items - books, for example. So: don't most people wash their hands when using the toilet? Do most people wash their hands more thoroughly than their face? If LHO's washing was to remove GSR, why would he not concentrate on his hands? I don't follow the assumption that he would wash the GSR off his face but not his hands.
  18. wrt Iraq: I always remembered this. When Gen. Colin Powell gave his presentation to the United Nations, the US had the Guernica tapestry covered. That should have told everyone what the truth was.
  19. @Pat Speer Thank you for all the effort you put into this and the clear explanation of all the results of the GSR / NAA tests and experimental setup. I've been looking for more details on the NAA tests for quite a while. Very well done and also well-written.
  20. Actually, the quoted text was written by Pat Speer - I just agreed.
  21. It's not just the paraffin test but the Neutron Activation Analysis tests performed on the Oswald paraffin, and the paraffin tests on 7 FBI agents who test-fired an MC rifle (the MC rifle). The NAA would be much more sensitive than paraffin tests. LHO's result was negative. All 7 FBI agents' were positive. A summary can be found here - http://www.22november1963.org.uk/jfk-assassination-neutron-activation-analysis. Pat Speer posted a really good explanation of this last week in this thread -
  22. I seem to remember that there was a JFKA conference in the last year where Ron Paul was going to present who he believed was responsible. Am I misremembering that? or does someone know what he said?