What? Here I describe my bet, my insurance, what I'm mad about, and what I think I'm learning. No one will be able to say I didn't explain it, that I didn't share what I learned. So here you go.
Why? It started with a lifetime of nightmares (I just thought I deserved them), and gets worse. Yes, I'm motivated. You?
So let's talk about it.
So: My diet, such as it is, as of early 2022, aims to be:
(K × V) − C_Simple + A

K is for Keto, V for Vegan, C for Carbohydrates, A for Anchovies.
× means AND, so K × V means food that is BOTH keto AND vegan (and cuts out most groceries).
Subtract the Simple Carbohydrates to cut out the nighttime headaches. Add the Anchovies for the few nutrients found only in animal flesh (B3?), aiming low on the food chain.
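If it helps, the formula reads as set operations over foods. Here's a toy sketch in Python, with made-up food lists (not a real nutrition database):

```python
# Toy sketch of the diet formula: (K × V) − C_Simple + A
# These food lists are invented examples, just to show the set logic.
keto = {"avocado", "almonds", "anchovies", "olive oil", "tofu", "bacon"}
vegan = {"avocado", "almonds", "olive oil", "tofu", "bread", "rice"}
simple_carbs = {"bread", "rice", "sugar"}
anchovies = {"anchovies"}

# K AND V (intersection), minus simple carbs, plus the anchovy exception:
diet = (keto & vegan) - simple_carbs | anchovies
print(sorted(diet))  # ['almonds', 'anchovies', 'avocado', 'olive oil', 'tofu']
```

The intersection is what makes it restrictive: bacon fails the vegan test, bread fails the keto test, and both drop out before the subtraction even matters.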
Next I have to rant about vitamins. Which brings up the whole Vitamin D3, skin color topic. Which is important. So here.
Some thousands of years ago, a tiny group of human genetic mutants had a grandparent whose melanin genes had been accidentally deleted, giving them skin translucency. Which is weird, and could have killed them off: near the equator, with so much sun and not enough melanin for protection, they might all have died of skin cancer. But instead these individuals survived and thrived, in certain locations, because of this peculiarity of better sun penetration into the skin -- because sunlight inside the skin is how humans make our own D3. The peculiarity became an advantage in the semi-darkness of the North, through many-thousand-year Ice Ages, when people had to cover their skin against the cold, or stay indoors for warmth, and where less sun was available anyway, being so far North. Even a little bit of sun exposure produced enough vitamin D3 for them to survive. But their dark-skinned cousins in those far northerly latitudes didn't get even that much D3, because their melanin blocked the sunlight from getting into the skin and synthesizing it. Those cousins, well, they didn't survive. And for the translucent ones, getting sunburnt without melanin protection wasn't as big a problem, because there wasn't as much sun up there.
It happens that translucent skin, without any sunblock in it, enables a little bit of sunlight on the body to still be enough to make enough D3 for humans to survive in the cold, dark North. That's the story.
Nowadays we call this group "White People". But the truth is, they are actually "Translucent People". Look carefully, you can look through and into the skin itself and see the fat beneath, which is yellow-white, and also the red, which is the small blood vessels, and the occasional blue, which is the big veins. They aren't actually white, people! They are translucent! You're not looking at them, you're looking into them!
Do you see it? It is hard to un-see it!
Now everyone else, the still-brown people, mostly stayed South, where there was plenty of sun -- enough that the melanin, built-in sunblock, didn't stop all D3 from being made in the skin; they got plenty, living outdoors, in the South. Them; there; before. But nowadays, with migration, plenty of brown people live in the northern latitudes (danger!), and anyway all of us modern folks generally live indoors under electric lights (danger!, danger!). So nearly all of us get too little sunlight into the skin to make enough D3. We are all generally low, but especially brown- and black-skinned people living indoors OR in the North OR both.
Dear Ones, Everyone, but Black and Brown people especially, Please take Vitamin D3, like 5000-10000 IU/day, UNLESS you're outdoors AND it's summer AND you're south of Atlanta Georgia! I do.
So now we can go on to vitamins and supplements in general.
Humans have about 20,000 genes. Vitamin D is a kind of key that fits into a certain kind of lock in the biochemistry of the human body, and that key opens a door, triggers stuff that will only happen if the key is there. Researchers know what the DNA that codes for that lock looks like, and there are about 800 copies of it in the human genome. So we think Vitamin D3 controls 800 of the 20,000 human genes. That's 4% of all your genes.
Blood levels of D3 are way low for most people, especially people out of the sun, which is all of us, and especially brown- and black-skinned people living modern life in northern latitudes, which is a whole lot of us.
Really, get D3 tested whenever your blood is checked. You want the levels to be over 75. That'll tell you how much to take, but 500IU or even 1000IU is TOO LOW, and evidently 5000-10000 international units/day is more like it.
10,000 IU is safe according to Oprah.
Don't f yourself by not having enough just because you live indoors, or North, or it's winter. (Yes, it may even help explain the winter flu season!)
Look, 4% of all genes don't even turn on without D3, so you're hosed without it. Does a bowl hold water if 4% is holes? I don't think so; that's called a "sieve". If your genes were there to make you into a bowl, would you want to be a bowl? Or a sieve?
D3 does a lot of stuff: reduces inflammation (which drives many diseases, even aging), prevents osteoporosis, prevents artery calcification, and protects against viruses too, including the flu, with better responses to Covid, etc. Jesus!: diabetes, Parkinson's, MS, high blood pressure, obesity, depression, age-related dementia, you name it, frickin' vaginosis. Read the review article.
So the US Recommended Daily Allowance (RDA) for Vitamin D3, for adults, is 600IU, according to Harvard. I can't believe they haven't updated it.
Well, they set it low. Way, way, Way, WAY low.
They have a method of taking the data and turning it into an RDA, and they made a mistake, and they set it too low. Science has detected this error; peer-reviewed, published replication papers have confirmed it; the RDA should be more than 10X what it is (~9000IU/day for adults, just to reach a low tolerable blood level). And still they aren't fixing this in the news, or in the US RDA statements, or even at Harvard frickin' University. It is scandalous. And the outcome is that Americans, and everyone who follows American standards, which is most of the world, actually, are, how shall I put this: as much as 4% underfunctioning. If the design of humans was to be bowls, we would be sieves.
Oh, uh, duh, we do suffer from the so-called "diseases of civilisation". Hmm, I think that's the same point.
Okay let's see the details.
How then does one calculate an RDA? Here's a way to think about it, not exactly literal. I've translated "human with 25-(OH)D blood levels below 40ng/mL" to "dead mouse", to make it easier to follow. The data analysis and the statistical reasoning are the same.
So: You take a bunch of mice. They can die for science; we don't care. You don't give them enough of some particular vitamin, and you wait for them to die. If more than a few of them die before your study is over, then you give a little more of your vitamin to different groups, until you find the threshold at which they mostly survive. Not all of them, but most of them. When 97.5% of them survive for your study duration, say a couple of months, that's the level you call the Recommended Daily Allowance. It means 97.5% of the individual mice will live if they get at least that much. That's what the RDA means: 97.5%. Individuals.
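That selection rule is just a percentile taken over individuals. A minimal simulation sketch, with all numbers invented (a normal model where each mouse has its own minimum survival dose):

```python
# Sketch: the RDA as the 97.5th percentile of INDIVIDUAL requirements.
# All numbers are invented for illustration.
import random

random.seed(0)

# Each mouse has its own minimum daily dose needed to survive the study.
requirements = [random.gauss(100, 20) for _ in range(10_000)]

# The dose that covers 97.5% of individuals:
requirements.sort()
rda = requirements[int(0.975 * len(requirements))]
print(round(rda))  # roughly mean + 2*SD, so around 140 here
```

Give every mouse at least `rda` and about 97.5% of them make it through the study; that's the whole meaning of the number.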
So what happened with the Vitamin D3 RDA? Almost the same, nearly the same, but different, and the number that comes out is TOTALLY different. Here's what they did. Instead of pooling, say, 32 mice, they took 32 study groups of mice, each group with many mice in it. Now each mouse doesn't count as one mouse; it counts as one of N mice in a single study average, and the high ones cancel with the low ones and the average lands in the middle somewhere, and that's all they kept: just the average, for that study group. After they calculate the study average they throw out all the individual mice and keep only the average. Really? Yes. Instead of pooling all the mice and looking at that pile, they put all the study averages into a pool of study averages, and figured out where the 97.5% line goes in the pool of study averages. Did I make that clear? Not the pool of all the individual mice, but the pool of the study averages. Of course the study averages are all clustered together, right in the very middle, because even though individuals might be all over the place, their averages tend toward the global average (that's called the Law of Large Numbers, and it's a theorem, so it's true). So instead of looking at a pile of data where individuals are spread all over, they were looking at a pile of just the study averages, everything concentrated close to the global average.
And then they drew the 97.5% line on the concentrated pile. Which is a nice way of missing almost half of the spread-out pile. If you wanted nearly half of the individual mice to die, that would be a good way to do it. And that's what they did.
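If you like code, here's a toy simulation of the difference, with every number invented: the 97.5% line drawn over pooled individuals versus over study averages comes out in very different places.

```python
# Sketch of the alleged mistake: taking the 97.5th percentile of STUDY
# AVERAGES instead of individuals. All numbers invented for illustration.
import random
import statistics

random.seed(1)

def requirement():
    return random.gauss(100, 20)  # one individual's minimum dose

# 32 studies of 50 mice each
studies = [[requirement() for _ in range(50)] for _ in range(32)]

individuals = [r for study in studies for r in study]
study_means = [statistics.mean(study) for study in studies]

def pct975(xs):
    xs = sorted(xs)
    return xs[int(0.975 * len(xs))]

print(round(pct975(individuals)))  # near mean + 2*SD of individuals
print(round(pct975(study_means)))  # much lower: the averages cluster near 100
```

The means have roughly 1/sqrt(50) the spread of the individuals (Law of Large Numbers at work), so a percentile taken over them sits far below the percentile taken over the pooled mice, and any "RDA" read off the means leaves a big fraction of individuals short.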
I hope I have made this issue understandable. Again, I translated "human, blood level of Vitamin D3 below 40ng/mL" into "mouse, dead", to make it more readable. But the logic is what I have explained. I encourage you to read the more dry but no less shocking original paper here, and another that confirmed its conclusions with a different data set here.
In short, 9000IU/day Vitamin D3 is not too much, especially if you take Vitamin K2 with it. I have a friend who is taking 40,000IU/day prescribed by her cardiologist. So maybe the safe range is a lot different from what Harvard says. I alternate 5000IU and 10,000IU daily, plus 180mcg of K2.
Which brings us to Vitamin K2, and K2 is really the point here.
What Vitamin K2 mainly does is put calcium in the right place.
The simple model is this: it's all about the calcium. Vitamin D3 brings the calcium INTO the system, Vitamin A moves it OUT when it's done, and Vitamin K2 puts it in the RIGHT PLACE: into your bones instead of your arteries. K2 activates bone construction, deactivates bone destruction, and regulates calcium deposits in the arteries.
This is typical of calcium in the diseases of modern civilisation: it stops going (as much) to the bones where it ought to, and instead accumulates (more) in the wrong places: in the arteries as plaques, perhaps in the kidneys as kidney stones (studies based on this hypothesis are ongoing), even in the mouth as dental calculus. How many women do you know with osteopenia or osteoporosis, all worrying about breaking their bones when they get older? To me it seems like all of them. And now they can't take Fosamax, which pumps up bone density, because it turns out that makes their bones brittle and is linked to heart attacks. This is all about calcium going to the wrong places: out of the bones, into the arteries. Hello!
And calcification in the arteries is a better predictor of heart attacks than cholesterol levels, so this is a big deal.
K2 works with Vitamin D in calcium regulation to reduce plaques in the arteries AND maintain bones. K2 is not the same as K1, which comes from veggies and which no one is deficient in. Nearly everyone is deficient in K2, which comes from fermented soybeans: "natto". In the parts of Japan where they eat this gooey natto stuff, little osteoporosis; but in the parts where they don't, guess what? Lots of broken hips. The two maps are not perfectly identical, but very, remarkably similar.
So, please, take K2 supplements. None of us eat natto, most of us are low, and it's not in the vegetables like K1. Take K2 in the form of MK-7 at 360mcg/day, not 180mcg/day. Or justify why not. It's been tested safe at under 4500mcg/day for 2 years, so why would you want to be so low? K2 as MK-4 (also K2, but broken down faster than MK-7) has been prescribed for osteoporosis in Japan for decades, at 45mg/day (yes, mg not mcg: 1mg = 1000mcg). There is no known toxic upper limit for K2; in Japan they will even double the 45mg treatment. So at least look it up, please. I'm just a self-studying plumber; you have to make up your own mind!
An anecdote to add to the science: my dentist today (4/4/2022) said I had very little calculus (calcified mineral deposits) on my lower front teeth. After a lifetime of heavy calculus on those very teeth, what changed is that I have been taking 360mcg/day of K2 (MK-7) for the last few months. This is consistent with the claim that calcium deposits on the teeth are also regulated by K2.
Now, the definition of the RDA, the Recommended Daily Allowance, is SM + 2S, where SM is the mean requirement and S is the standard deviation of the individual requirement measurements around SM. Like, some mice died with a little more, some lived with a little less, and the spread across the estimated death minimums is S, which could be a small absolute amount if they are all tightly clustered. The idea is that if you give enough to cover two standard deviations above the mean, then about 97.5% of the mice will survive on that little, and that's called the RDA. It's not a margin of error to cover additional uses of a nutrient (such as long-term health), but a margin of probability covering only the die-soon, high-priority uses, at the level of bare survivability. So those two standard deviations cover about 97.5% of individuals, and only for those uses. Great, we don't want to die in a few weeks, that's helpful information. But we are hung out to dry trying to understand the long-term-health-supportive uses of ANY nutrient! The RDA hardly tells us anything, really.
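Under a normal model of requirements, SM + 2S covers about 97.5% of individuals, not all of them. A tiny sketch, with invented SM and S values:

```python
# Sketch: how much of the population SM + 2S actually covers, assuming
# requirements are normally distributed. SM and S values are invented.
from statistics import NormalDist

SM = 100.0   # mean individual requirement (made-up units)
S = 20.0     # standard deviation of individual requirements
rda = SM + 2 * S

# Fraction of individuals whose requirement is at or below the RDA:
coverage = NormalDist(SM, S).cdf(rda)
print(round(coverage, 4))  # 0.9772, i.e. about 97.5%, not 100%
```

So even computed correctly, the RDA by construction leaves a couple percent of individuals under-covered for bare survival, and says nothing at all about the other uses.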
So that's how we measure the RDA, and that's what it means. So, do you think that's enough?
I don't think it's enough. You might have a hundred, or 800, uses for some essential nutrient -- Vitamins D3 and K2 are our great first examples -- and if you don't get the RDA then you die in the short term, but if you do get the RDA, it still might not be enough for all 800 uses, the ones that might give you a long and healthy life. The RDA is only designed to keep you alive in the short term.
So what we'd like to know is, What is the Optimum Healthy Allowance? This will depend on all the good uses of all the different nutrients, perhaps half of which we haven't even put on the list yet, according to Dr. Patrick. So we need a lot of research, basically, to figure this out. Please go donate to some studies! And meanwhile, we can expect that the RDAs might often be insufficient for keeping you strong and healthy for a long time, not merely alive for a short while.
It's not like if you consume half the RDA then half your needs are met: No! If you consume half the RDA then half the needs ON ONE BRANCH of the tree of needs are met. The other branches will make you live better, or longer, or have stronger bones, etc., but they get ZIP if you don't COMPLETELY meet the needs of the most important short-term need first.
(This is called "Triage theory" according to Dr. Ames.)
Since the RDA is based only on immediate survival functions, the optimum health allowance, let's call it the OHA, is potentially a lot more than the SM or the RDA. We have no idea how much more, or when we'll stop adding stuff to the OHA list.
This would be a great use case for Community Capitalism.
Is it people, or their doctors, who are the most gutless, when the doctors won't tell them to just eat falafel, oatmeal, stirfry, and salads, instead of batter-fried meat on cheese plus dessert, so that they would maybe not die so much of cancer and everything else?
I heard it's because the doctors think everyone will just ignore them. Blaming the people. Fine, go ahead and frickin' die, people. That's what you prefer? Wow.
Apparently, you have to go to the non-idiot line at the doctor's office and say No, you don't prefer McDonalds to Life Itself, and you're actually willing to reconsider a couple of diet choices. Then *maybe* they'll break out of their standard-American-diet-locked customer mode and tell you: Hey, that's actually good, because you'll probably not die so young of cancer and heart disease and stuff.
But SSHHHH, don't tell anyone, because it's a secret, not because it's actually a secret, but because some of the people around here are chickenshit and so we seriously can't even talk about it, or we can't even talk seriously about it, or something. Seriously.
Now you tell me, who is it, the people, or the doctors?
I guess I'm minimizing the power of customer opinion: when you're a doctor you actually do have customers, and if you tell them what they really, really don't want to hear, you might lose them. I don't know, isn't that their job, though?
Now, low-carb advocates (like Amber O'Hearn, who says "headache, lassitude, vague discomfort" are symptoms of 'rabbit starvation' (!My Symptoms!)) argue that humans evolved as marrow scavengers, i.e., as fat eaters. So the first tool was just a rock, but with a rock an early human, an australopithecus maybe, could scavenge otherwise inaccessible and preserved marrow and brain by breaking the big bones and intact skulls left at a kill after the vultures were finished.
To this evolutionary phase we can attribute the development (as compared with chimps) of more acidic stomachs, a more ketotic metabolism, less chimp-like raw-fiber digestion, and a preference for fatter (large) prey. And we can infer it supported elaboration of tool use, more eclectic carnivory, fatter bodies; basically, more of ye olde fat-based brain. If your metabolism is fat-burning, that's called ketosis; but did you know most of the carbon atoms in the brain come into the brain as ketones? Fat = brain food. (Amber, did I get it right?) If so, this evolutionary track helped us smarten up, get bigger brains and fatter bodies, and kickstart the whole human evolutionary cascade.
I do propose that the marrow scavenging preceded the aquatic phase of human evolution, when we lost body fur, got (even) more skin fat, became fully bipedal (according to Elaine Morgan), acquired voluntary breathing control (necessarily preceding speech), and learned to love swimming. Why that order? Because the marrow-and-brain fat consumption, and a bit more of our own brains, would tend to lead us toward safer feeding on just-as-plentiful fat in netted or group-herded fish. A puddle off a pond is as good as a net if you have twenty family members splashing in a line to chase the fish into it, and a log to roll across the neck. And nets themselves, a generalization, are slightly more complex than rocks (and logs and puddles), but not much -- less so than cloth, for example. Therefore the semi-aquatic phase should come after the marrow-scavenging phase, based on increasing tool complexity.
Also, if you can tolerate picked-over carcasses, you can tolerate stinky fish smells too. I'm going to say that it was evolutionarily later that our smell preferences became more fastidious.
Here: my nutritionist said to try intermittent fasting. No late-night snacks, and a late breakfast, gives your system 14 hours to reach ketosis and start cleaning out the day's gunk. Longevity, okay? Study up, friend, the science is actually starting to come in! And please educate me, too; I don't know everything (for sure), or anything (possibly).