- Furthermore, there is no evidence that anyone else had the idea of using thermionic valves to greatly increase the speed of the code-breaking computer, so if Tommy Flowers hadn't existed it probably wouldn't have been discovered until much later (i.e. too late to help the war effort). Of course CEOs such as Bill Gates and Steve Jobs have made important individual contributions to society, but they were not as pivotal as that of Tommy Flowers, and furthermore they have been rewarded in money (many would say too much), whereas Tommy Flowers received very little reward or recognition for his achievement.
- Exactly; in your words, "being built by a lot of people", i.e. not just Bill Gates and Steve Jobs. If Bill Gates & Steve Jobs hadn't existed, those employees would be working for different computer companies such as IBM, Olivetti, Apricot, Xerox (who invented the point-and-click windowing system) or one of many others, and we would have similar products under different names.
The geopolitical achievement of winning the war (and the consequences of that) was the whole purpose of building the machine, so it doesn't make sense to dismiss it as you did.
- The value that companies such as Microsoft & Apple provide is supplied by all their thousands of employees, not just their CEOs, so it's not a fair comparison to set the output of Microsoft & Apple against the output of a single person (Tommy Flowers). Furthermore, there are plenty of alternatives to Microsoft & Apple: if Bill Gates & Steve Jobs hadn't existed we'd probably be running Unix, Linux or one of the many other operating systems that lost out to Windows & MacOS in market share. If Tommy Flowers hadn't existed, we might have lost the war.
- Here's another tutorial for creating zsh completers using the built-in functions: https://github.com/vapniks/zsh-completions/blob/master/zsh-c...
- In zsh you can use the _gnu_generic function for simple completion of commands with a --help flag. Just put a line like this somewhere in your startup file: compdef _gnu_generic <CMD>
- I've forked the repo and created a zsh version: https://github.com/vapniks/shell-secrets
- Yes, you need a good tutor to help you navigate through such a complex topic.
- And if anyone is interested in delving more deeply into the statistical concepts & results referenced in the paper this post links to (e.g. VC-dimension, PAC-learning, etc.), I can recommend this book: https://amzn.eu/d/7Zwe6jw
- I had a look at Eric Stansifer's write-up of his decision, but I didn't read all of it (83 pages!). He does seem to have a good understanding of Bayesian decision making and hypothesis testing.
What confuses me, however, is his dismissal of two pieces of evidence in table 2 which he says should be ignored "following the presumption that HSM is the first SSE", and yet earlier, in footnote 24, he states "We are very specifically NOT conditioning on that place being HSM" (talking about the first SSE location). Can anyone enlighten me about this seeming contradiction?
Another point: while both judges are qualified scientists, their expertise is in microbiology/virology, not epidemiology, yet it is the epidemiological aspect of the situation that is the most contentious part of the analysis, and AFAIK the part that swung the decision in favour of zoonotic origins for both judges. Without prior assumptions they both agree that the DNA evidence favours the lab leak theory.
- Which means this: it gives further weight to the lab leak theory, and shows the reasoning behind it.
I don't have time to watch the 3hr debate or read all of that article (which makes some misrepresentative statements and, like your response, is rather venomous in tone), but here is the response from rootclaim about the debate outcome: https://blog.rootclaim.com/covid-origins-debate-response-to-...
I also know from experience that scientists, and people in general, are often not well trained in the kind of probabilistic reasoning that is required for combining and weighing up multiple sources of evidence.
- rootclaim gave the lab leak theory an 87% probability using Bayesian analysis back in 2020: https://www.rootclaim.com/analysis/What-is-the-source-of-COV...
- Does anyone have any tips on how to persuade non-techy friends & family to switch from WhatsApp to element-x?
- Someone should train an LLM on all the kernel documentation, code, mailing lists, etc.
- but what if H0 = Hewlett Packard did not plan to eliminate Mike Lynch and Stephen Chamberlain...
- So I guess we'll never know the p-value of that event...
- Yes, it would be nice to know how things change for different weightings of the null and alternative priors.
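Purely as an illustration of that sensitivity (my own sketch; the Bayes factors below are made-up numbers, not estimates of anything about the actual case), here is the posterior as a function of the prior, using Bayes' rule in odds form:

    import numpy as np

    priors = np.linspace(0.1, 0.9, 5)        # prior P(H1)
    for bf in (0.1, 1.0, 10.0):              # assumed likelihood ratios (hypothetical)
        prior_odds = priors / (1 - priors)
        posterior_odds = bf * prior_odds     # Bayes' rule in odds form
        posterior = posterior_odds / (1 + posterior_odds)
        print(f"BF={bf:5.1f}:", np.round(posterior, 3))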
- This article is very interesting and informative; however, it's a bit ironic that an article about misinterpretations of the p-value misinterprets the misinterpretation. In the first blue box it's clear that Bernstein is interpreting the p-value as the probability of randomly rejecting the null (which is what you do when you get something statistically significant), yet in the text following that they say he's interpreting it as the probability of the null. Bernstein's actual mistake is that he appears to interpret it as an unconditional probability rather than a conditional one (correct interpretation: p-value = Prob(rejecting the null, given that the null is true)).
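To make the conditional/unconditional distinction concrete, here's a quick simulation sketch (my own, not from the article): when the null is true, p-values are uniformly distributed, so rejecting at the 0.05 level happens about 5% of the time, and that 5% is a probability conditional on the null being true.

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    n_sims, n = 10_000, 50
    rejected = 0
    for _ in range(n_sims):
        x = rng.normal(0.0, 1.0, size=n)       # data generated with H0 true (mu = 0)
        z = x.mean() / (x.std(ddof=1) / math.sqrt(n))
        p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value (normal approximation)
        rejected += p < 0.05
    print(rejected / n_sims)                   # ~0.05 = Prob(reject H0 | H0 true)

The unconditional probability of rejecting the null in any given study depends on how often the null is actually true, which is exactly the base-rate information the p-value doesn't contain.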
- When I click on any of the PDFs it says: "Error rendering embedded code, Invalid PDF".
- @justk The R^2 value of 0.01 calculated on that webpage uses both states, not just one. The model predicts 0.55 in one state and 0.45 in the other, so the residual variance within each state is 0.55 × 0.45 = 0.2475, while the total variance of the outcome is 0.5 × 0.5 = 0.25, giving R^2 = 1 - 0.2475/0.25 = 0.01. I don't think it makes sense to use a mixed model in this case, since the variance is the same for each state; a mixed model is used when the observations have some structured heteroskedasticity, i.e. different variances for different values of the independent variables.
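Here's a quick numpy check of that arithmetic (my own sketch, assuming equal numbers of observations in each state):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                                  # observations per state (assumed)
    state = np.repeat([0, 1], n)
    p_true = np.where(state == 0, 0.45, 0.55)    # P(y=1) in each state
    y = rng.random(2 * n) < p_true               # simulated binary outcomes

    # a linear model with a single binary regressor just fits the group means
    yhat = np.where(state == 0, y[state == 0].mean(), y[state == 1].mean())
    r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    print(r2)                                    # ~0.01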
- @justk What you're talking about might make sense if there were more independent variables to consider, but in this case there's only one: state. So in fact you could say that there are two conditional linear models in the example: one for the first state (state=0), and one for the second (state=1). The model does the best job it can with the information available (state).
- I am a statistician, and you're right: for this kind of thing we would normally use a binary response model, such as a logit or probit model, which constrains the predicted response to lie between 0 & 1. However, in this case it doesn't matter, since there's only one independent variable (state), and it's binary, so there are only 2 different predictions the model could make (which will be the correct probabilities of 0.45 & 0.55, even with a linear model).
The normal R^2 formula can't be applied to a logit/probit model; instead you use an alternative such as McFadden's or Cox & Snell's pseudo R-squared. I was curious what values they take for this example, so the sketch below works them out.
Linear models are sometimes used even with many independent variables, since it can be shown that the coefficients in a linear model are unbiased estimators of the average partial effects of any non-linear binary response model.
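Here's a back-of-the-envelope calculation of both pseudo R-squareds for this example (my own sketch, assuming equal numbers of observations in each state; with a single binary regressor the logit MLE reproduces the group frequencies exactly, so the log-likelihoods can be written down directly):

    import numpy as np

    n = 1000                                # observations per state (assumed)
    p = np.array([0.55, 0.45])              # fitted P(y=1) in each state
    pbar = 0.5                              # overall mean of y

    # log-likelihoods of the fitted model and of the intercept-only null model
    ll_model = n * np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    ll_null = 2 * n * (pbar * np.log(pbar) + (1 - pbar) * np.log(1 - pbar))

    mcfadden = 1 - ll_model / ll_null
    cox_snell = 1 - np.exp(2 * (ll_null - ll_model) / (2 * n))
    print(mcfadden, cox_snell)              # ~0.0072 and ~0.0100

Interestingly, Cox & Snell comes out at almost exactly the same 0.01 as the linear R^2, while McFadden's is a little smaller.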
- I thought about averaging the scores, which gives you a point inside the circle, and then projecting back onto the circle with a ray from the centre. This is continuous everywhere except when the average lands exactly at the centre (e.g. for two voters this is when they have exactly opposite views). So if the scores are drawn from a continuous probability distribution on the domain, the set of undecidable cases has measure zero.
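A minimal sketch of that aggregation rule (my own illustration, with opinions encoded as angles on the unit circle):

    import numpy as np

    def aggregate(angles):
        # average opinions on the circle, then project back onto the circle
        pts = np.column_stack([np.cos(angles), np.sin(angles)])
        mean = pts.mean(axis=0)                 # a point inside the circle
        if np.isclose(np.linalg.norm(mean), 0.0):
            # e.g. two voters with exactly opposite views
            raise ValueError("average at the centre: aggregate undefined")
        return np.arctan2(mean[1], mean[0])     # angle of the projected point

    print(aggregate(np.array([0.1, 0.3, 5.9])))  # a consensus angle near 0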
- Remembering all those Unix tools and their uses can be tricky. I wrote a couple of shell scripts that let you build command pipelines one step at a time, choosing a tool from a menu at each step, with the ability to preview the results while tweaking the command-line flags. At any point you can go back to the previous step and continue: https://github.com/vapniks/fzf-tool-launcher https://github.com/vapniks/fzfrepl
- Not sure how much immigration is really needed; thanks to improvements in healthcare, working conditions, and machinery, the length of time people can stay in active work and contribute positively to the economy is increasing at a rate comparable to the ageing of the population. France has a higher proportion of young people, and yet lower immigration. Why? Could it be that the decision to have children is correlated with population density?
- (1) More than just a grain of truth; more like the elephant in the room: net migration into the UK is more than 4 times higher than France's, and population density is currently almost 3 times higher, and rising.
- What about Guix? Could that be used for this purpose?
- Unfortunate name; it clashes with the built-in zsh editor.
- In emacs you can use syslog-mode for analyzing strace output: https://github.com/vapniks/syslog-mode It allows you to easily navigate, filter, and highlight lines, and look up documentation.
- Just to clarify, the pattern language is a more powerful alternative to regexps (though you can mix them). My bank statements are PDFs, which can be converted to plain text using pdftotext; however, this destroys the structure of the documents, which makes extracting the data with regexps (even PCREs) very difficult, but much easier with txr.
- Cheers Kaz, I'll make the changes. There are still quite a few TODOs in that code that I intend to fix up at some point.