The Education Forum

Bill Fite

Members
  • Posts

    275
  • Joined

  • Last visited

Profile Information

  • Location
    France
  • Interests
    math, JFK's assassination

  1. IF the probability estimates are sound and extremely low, that proves beyond a reasonable doubt (to borrow a phrase) that something suspicious is going on in the witness deaths.

     Playing the devil's advocate here: I would guess the probability of a mafia boss being murdered is actually several orders of magnitude higher than that of a person in the 1960s general population, and it probably increases a couple more orders of magnitude if he is scheduled to meet with the Feds, no? Wouldn't the probabilities of murder for reporters who investigate criminal activities be much higher as well? As for stealing notes - why would the criminal necessarily have to be looking for JFK notebooks and not notes on some other crime?

     I think you can make the anecdotal argument about specific deaths from now until forever and it won't convince many more people. If that worked, wouldn't it have closed the issue already? On the other hand, if the estimated probability of the observed number of witness deaths is extremely low, wouldn't that convince anyone other than an OJ jury member that something was going on?

     You asked how I would estimate probabilities for witness deaths, and I answered that question.
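To make the base-rate point concrete, here is a tiny back-of-the-envelope calculation; every rate, multiplier, and count in it is invented purely for illustration:

```python
# Illustration only: how a higher base murder rate changes the expected
# number of murders in a group. All numbers below are made up.
general_murder_rate = 5e-5     # assumed annual murder rate, general population
risk_multiplier = 1_000        # "several orders of magnitude" higher risk
group_size = 100               # hypothetical number of people tracked
years = 3                      # hypothetical observation window

expected_general = group_size * years * general_murder_rate
expected_high_risk = expected_general * risk_multiplier

print(f"expected murders at the general rate:   {expected_general:.3f}")
print(f"expected murders at the high-risk rate: {expected_high_risk:.1f}")
```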
  2. Hi W

     First thoughts: I think there are 2 probability estimates you would need. I might estimate them using simulation, if all the data were readily available.

     Data needed:
     * witness list
     * witness sex and age
     * time frame to use
     * categories of cause of death of witnesses in that time frame
     * dates of death
     * mortality rates by sex and age in each of the categories of death, or maybe just the mortality rate for murder alone
     * dates witnesses were called to testify or give a statement

     The first probability I would look at is the number of expected deaths in each category vs. the actual deaths in that category. I would simulate 100K instances with the following steps (a Python sketch follows this post):

     1. Set more_than_observed_deaths = 0
     2. Set loop = 0
     3. Set simulated_deaths = 0
     4. For each witness, generate a uniform random number in the 0 to 1 range -- call this u
     5. For each witness, look up the appropriate mortality rate by sex and age -- call this p
     6. For each witness, if their u < p, then simulated_deaths = simulated_deaths + 1
     7. If simulated_deaths >= the observed deaths in the category, then more_than_observed_deaths = more_than_observed_deaths + 1
     8. loop = loop + 1
     9. If loop = 100K, STOP; else GOTO step 3

     After 100K simulations, the probability of getting at least the observed number of deaths in the category = more_than_observed_deaths / 100K.

     ************************

     For the second probability -- a witness death in some time period immediately before testifying (let's say 14 days) -- I might do this:

     N = number of days between the JFKA and the scheduled investigation appearance
     p = 14 / N

     Then use the cumulative binomial probability distribution to get the probability of at least the observed number of witness deaths in that short time period.

     I think those estimates would work, and I hope I've explained it well enough.
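A minimal Python sketch of both estimates described above. All the inputs (rates, counts, day spans) are hypothetical placeholders, since the real witness data would have to be collected first:

```python
import random
from math import comb

# --- Estimate 1: Monte Carlo, deaths in one cause-of-death category ---------
# Hypothetical per-witness mortality rates (would come from period life
# tables, keyed by each witness's sex and age) and an invented observed count.
mortality_rates = [0.002, 0.0015, 0.003, 0.001, 0.0025] * 20  # 100 witnesses
observed_deaths = 4
RUNS = 100_000

more_than_observed = 0
for _ in range(RUNS):
    # One simulated instance: each witness dies if their uniform draw u < p.
    simulated_deaths = sum(1 for p in mortality_rates if random.random() < p)
    if simulated_deaths >= observed_deaths:
        more_than_observed += 1

print(f"P(>= {observed_deaths} deaths in category) ~ {more_than_observed / RUNS:.5f}")

# --- Estimate 2: binomial tail, death within 14 days of testifying ----------
# If each death lands uniformly in the N-day span, it falls inside the 14-day
# pre-testimony window with probability p = 14 / N.
N = 2000                    # hypothetical days between the JFKA and appearance
p = 14 / N
n_deaths = 15               # hypothetical total witness deaths in the period
observed_window_deaths = 2  # hypothetical deaths inside the 14-day window

tail = sum(comb(n_deaths, k) * p**k * (1 - p) ** (n_deaths - k)
           for k in range(observed_window_deaths, n_deaths + 1))
print(f"P(>= {observed_window_deaths} deaths in the window) ~ {tail:.5f}")
```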
  3. Hi Kevin - it's interesting to think about this, at least for me.
  4. My 2¢: The way to deal with uncertainty is to use probability and statistics to quantify it, by looking at all the data available rather than by picking a few examples (you might be accused of cherry-picking coincidental data).

     In Reclaiming Science - the JFK Conspiracy, Charnin uses a Poisson probability distribution to model the probabilities of unexpected deaths among the witnesses / interviewees of the investigations. If the 1960-1970 era age-corrected population death rates by suicide / heart attack / murder / cancer / etc. are correct, then the probabilities of getting n unexpected deaths from the sample population are a way to determine whether the deaths were part of the natural life-and-death process or not. The book is available on Kindle.

     Otherwise, you are left with an argument similar to: well, one person duplicated 2 hits in 3 shots in < 9 seconds, so that's proof LHO (or whoever shot from the TSBD) did it. Just because something is possible doesn't mean it happened.
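For reference, that Poisson tail probability is easy to compute directly. The expected and observed counts below are invented placeholders, not Charnin's numbers:

```python
from math import exp, factorial

def poisson_tail(n, lam):
    """P(X >= n) for X ~ Poisson(lam): one minus the CDF evaluated at n - 1."""
    return 1.0 - sum(exp(-lam) * lam**k / factorial(k) for k in range(n))

# Hypothetical illustration: if the age-corrected rates predict 1.2 expected
# murders in the witness pool over the period, and 7 were actually observed:
print(f"P(X >= 7 | lam = 1.2) = {poisson_tail(7, 1.2):.2e}")
```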
  5. Oh great - I have that book on Kindle & one of the same MSc degrees as the author. Now I'm gonna hafta look through it again - maybe even read it closely. I remember finding it pretty thought-provoking. I have always wondered why probability calculations aren't used more on the JFKA - then I remember that I did read Innumeracy.

     link to spreadsheet with all data, calculations, and results
     link to author's blog JFKA section
  6. Hi Pat

     If the data could be collected (by AI, or web scraping, or... by hand), a straightforward approach would be to estimate a statistical model using multinomial logistic regression, estimating for each target of the research the probability of it accepting and publishing the work. (A sketch follows this post.)
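One way that could look in practice, using scikit-learn's multinomial logistic regression. The features, labels, and dataset below are placeholders standing in for whatever the collection step would actually produce:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder data: 200 past submissions, 3 hypothetical features each
# (e.g. topic score, author credentials, venue prestige), and the observed
# outcome at the target venue. Real rows would come from the collected data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = rng.choice(["accepted", "revise", "rejected"], size=200)

model = LogisticRegression(multi_class="multinomial", max_iter=1000)
model.fit(X, y)

# Estimated probability of each outcome for one new (hypothetical) submission:
new_submission = rng.normal(size=(1, 3))
print(dict(zip(model.classes_, model.predict_proba(new_submission)[0].round(3))))
```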
  7. It may very well end up that future AI will be reflecting a whole bunch of previously generated AI garbage from all over the net.
  8. I meant by allowing him to run in the Dem primary and debating - isn't that what RFK Jr. wanted? Or am I misremembering that?
  9. Would this have been avoided if Biden had agreed to debate him?
  10. That's where I started, both academically and professionally - mostly using a somewhat dead language, PL/I, which seemed like a combination of COBOL for input/output and Fortran for computation. Hand that deck in, then go wait an hour for the run to finish after those ahead of it in the stack. The poor guy who took the decks and ran them was no longer needed once we could submit jobs from the desktop PCs.
  11. As someone who had their area of study renamed in the Great Renaming around the turn of the millennium - from the name the general population didn't understand, Operations Research Analyst, to a name that apparently refers to some guys and gals in the back of a pharmacy, Prescriptive Analytics Engineer, courtesy of the 'Data Scientists' - I find it interesting to see algorithms that existed long before being labelled AI. For example, facial recognition algorithms were well known decades before AI was in the news: I remember laughing at Pauley Perrette's character on NCIS recognising a suspect by searching YouTube videos in maybe 15 minutes.

      Anyway, I've seen AI go from simple rule-based systems, to providing solutions to complex planning problems in Prolog (how to re-assign airliners to gates when the schedule has been disrupted by weather cancellations and re-routings), to predicting the next set of words given previous sets of words. IMO, Large Language Models (what we call AI now) are programs that have implemented memorization, which isn't intelligence. The Prolog example is closer to real intelligence. But that's just my opinion.

      That said, I think the applications that have been discussed would be very interesting; I just don't know if I would label them AI.
  12. Another question - actually 2:

      Is this the same Guinn who ran the NAA tests comparing the chemical composition of the MC bullets and refused to report the statistical confidence in his conclusions, which were later refuted with 95% confidence?

      Where does the 2.5-hour estimate come from - experiments or guessing? I don't know whether the 2.5 hours is the upper limit or how it was arrived at, but Guinn seems to have been mistaken in the past.
  13. Correct me if I'm wrong, but the NAA was positive for the casts from the hands, though that could have been caused by handling printed materials, among other items - books, for example. So: don't most people wash their hands when using the toilet? Do most people wash their hands more thoroughly than their face? If LHO's washing was meant to remove GSR, why would he not concentrate on his hands? I don't follow the assumption that he would wash the GSR off his face but not his hands.