What? Here I describe my bet, my insurance, what I'm mad about, and what I think I'm learning. No one can say I didn't explain it, didn't share what I learned. So here you go.
Why? It started with a lifetime of nightmares (I just thought I deserved them), and gets worse. Yes, I'm motivated. You?
So let's talk about it.
So: My diet, such as it is, aims to be:
(K × V) − C_simple + A
K is for Keto, V for Vegan, C for Carbohydrates, A for Anchovies.
× means AND, so K × V means food that is BOTH keto AND vegan (and cuts out most groceries).
Subtract the Simple Carbohydrates to cut out the nighttime headaches. Add the Anchovies for some flesh-only required nutrients (B3?), aiming low on the food chain.
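The formula above can be sketched as a little filter, if you like. This is just a toy illustration of the (K × V) − C_simple + A logic; the food names and tags are my own invented examples, not any real database.

```python
# Toy sketch of the diet formula (K × V) − C_simple + A.
# All foods and their tags below are invented for illustration.

def passes_diet(food):
    """A food passes if it is keto AND vegan (K × V), is not a simple
    carbohydrate (− C_simple), or is the one flesh exception (+ A)."""
    if food["name"] == "anchovies":        # the + A term: explicit exception
        return True
    if food["simple_carb"]:                # the − C_simple term
        return False
    return food["keto"] and food["vegan"]  # the K × V (AND) term

foods = [
    {"name": "avocado",    "keto": True,  "vegan": True,  "simple_carb": False},
    {"name": "white rice", "keto": False, "vegan": True,  "simple_carb": True},
    {"name": "anchovies",  "keto": True,  "vegan": False, "simple_carb": False},
]
print([f["name"] for f in foods if passes_diet(f)])
# ['avocado', 'anchovies']
```

Notice that × really is an AND: the avocado has to clear both the keto and the vegan tests, which is exactly why this diet cuts out most groceries.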
Next I have to rant about vitamins. Which brings up the whole Vitamin D3, skin color topic. Which is important. So here.
Some thousands of years ago, a group of human genetic mutants descended from a grandparent whose melanin genes had been accidentally deleted, which gave them skin translucency -- and they survived and thrived, in some places, with this peculiarity of better sun penetration into the skin, because sun in skin is how humans make our own D3. The peculiarity became an advantage in the semi-darkness of the North, through thousand-year Ice Ages, when they had to cover their skin against the cold, and where less sun was available anyway, that far North. And getting sunburnt without melanin protection wasn't much of a problem, because they didn't have that much sun up there.
Translucent skin happens to make it possible that a little bit of sunlight on the body could still be enough to make enough D3 for humans to survive in the cold, dark North.
Nowadays we call this group "White People". But the truth is, they are actually "Translucent People". Look carefully, you can look through the skin itself and see the fat beneath, which is yellow-white, and also the red, which is the small blood vessels, and the occasional blue, which is the big veins. They aren't actually white, people! They are translucent! You're not looking at them, you're looking into them.
Do you see it? It is hard to un-see it!
Everyone else, the still brown people, stayed South so there was plenty of sun, enough so that the melanin, built-in sun-block, didn't stop all D3 from being made in the skin; they got plenty, living outdoors, in the South. Them; there; before. But nowadays plenty of brown people live in the northern latitudes (danger!), and anyway all us modern folks generally live indoors under electric lights (danger!), so nearly all of us get too little sunlight into the skin to make enough D3, so we are all generally low, but especially brown and black skinned people living indoors OR in the North OR both.
Dear Ones, Everyone, but Black and Brown people especially, Please take Vitamin D3, like 5000-10000 IU/day, unless you're outdoors AND it's summer AND you're south of Atlanta Georgia! I do.
So now we can go on to vitamins and supplements in general.
Really, get D3 tested whenever your blood is checked. Then you know how much to take; 500 IU or even 1000 IU is quite low, and evidently 5000-10000 international units/day is more like it.
Don't f yourself by not having enough just because you live indoors, or North, or in the winter. (Yes it even explains the winter flu season!)
4% of all genes don't turn on without D3, so you're hosed without it. Does a bowl hold water if 4% is holes? I don't think so; it's called a "sieve". If your genes are there to make you into a bowl, do you want to be a bowl? Or a sieve?
D3 does a lot of stuff: reduces inflammation (which causes many diseases, even aging), prevents osteoporosis, prevents artery calcification, and protects against viruses too including the flu and better responses to Covid, etc. Jesus!: diabetes, Parkinson's, MS, high blood pressure, obesity, depression, age-related dementia, you name it, frickin' vaginosis. Read the review article.
And K-2 is the point here.
What Vitamin K-2 mainly does is it puts calcium in the right place.
D3 brings the calcium into the system, A gets rid of it when it's done, and K-2 puts it in the right place, into your bones instead of your arteries. K-2 activates bone construction, deactivates bone destruction, and regulates calcium deposits in the arteries.
This is the typical thing about calcium: it stops going to the bones where it ought to, and instead accumulates in some wrong places, like in the arteries as plaques, or perhaps in the kidneys as kidney stones (studies based on this hypothesis are ongoing), or in the mouth as calcified plaques. Calcification in the arteries is a better predictor of heart attacks than cholesterol levels, so this is a big deal.
K-2 works with Vitamin D in calcium regulation to reduce plaques in the arteries AND maintain bones. K-2 is not the same as K-1, which comes from veggies and which no one is deficient in. Everyone (generally) is deficient in K-2, which comes from fermented soybeans, "natto". In Eastern Japan, where they eat this nasty natto stuff, little osteoporosis; in Western Japan, where they eat much less of it, lots of broken hips.
Take the supplements. K-2 as MK-7 at 500mcg/day, not 180mcg/day. Or justify why not. Tested safe under 4500mcg/day for 2 years. K-2 as MK-4 has been prescribed for osteoporosis in Japan for decades, at 45mg/day (yes mg not mcg: 1mg = 1000mcg). No toxic upper limit to K-2 consumption is known; in Japan they will even double the 45mg treatment. So look it up.
An anecdote to add to the Science: my dentist today (4/4/2022) said I had not much calculus on my lower front teeth. After a lifetime of heavy calculus (calcified mineral deposits) on those very teeth, what changed was that I have been taking 360mcg/day of K-2 (MK-7) for the last few months. This is consistent with the claim that calcium deposits on the teeth are also regulated by K-2.
Now, the definition of the RDA, the Recommended Daily Allowance, is SM + 2S, where SM is the mean survival minimum and S is the standard deviation of the measured minimums. Like, some mice died with a little more, some lived with a little less, and the variation across the estimated death minimums is S, which could be a small absolute amount if they are all tightly clustered. The idea is that if you give enough to cover twice the variability, then about 97.5% of the mice will survive on that little, and that's called the RDA. It's not a margin of error to cover additional uses of a nutrient (such as for long-term health), but a margin of probability to cover only the die-soon, high-priority uses, at the level of bare survivability. So those two standard deviations only cover ~97.5% of subjects, and only for those survival uses. Great, we don't want to die in a few weeks; that's helpful information. But we are hung out to dry trying to understand the long-term-health-supportive uses of ANY nutrient! The RDA hardly tells us anything, really.
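Here's the SM + 2S arithmetic as a tiny worked example. The numbers are invented, purely to show the mechanics of the definition, not real dosage data.

```python
# Toy illustration of the RDA definition: mean survival minimum (SM)
# plus two standard deviations (S) of the measured minimums.
# The measurements below are invented for illustration.
import statistics

# hypothetical measured survival minimums (IU/day) across test subjects
minimums = [400, 450, 500, 480, 420, 460, 510, 440]

sm = statistics.mean(minimums)    # SM: mean survival minimum
s = statistics.stdev(minimums)    # S: spread across subjects
rda = sm + 2 * s                  # covers ~97.5% of subjects

print(f"SM = {sm:.0f} IU/day, S = {s:.0f}, RDA = {rda:.0f} IU/day")
```

Note what the 2S buffer buys you: coverage of individual variability in the *bare-survival* need only. Nothing in this arithmetic says a word about any longer-term use of the nutrient.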
So that's how we measure the RDA, and that's what it means. So, do you think that's enough?
No, it's not enough. You might have a hundred or 800 uses for some essential nutrient (Vitamins D3 and K2 being our great first examples), and if you don't get the RDA then you die in the short term; but if you do get the RDA, it still might not be enough for all the 800 uses that might give you a long and healthy life. The RDA is only designed to keep you alive in the short term.
So what we'd like to know is, What is the Optimum Healthy Allowance? This will depend on all the good uses of all the different nutrients, perhaps half of which we haven't even put on the list yet, according to Dr. Patrick. So we need a lot of research, basically, to figure this out. Meanwhile, we can expect that the RDAs might often be insufficient for keeping you strong and healthy for a long time, not merely alive for a short while.
It's not like if you consume half the RDA then half your needs are met: No! If you consume half the RDA then half the needs ON ONE BRANCH of the tree of needs are met. The other branches will make you live better, or longer, or have stronger bones, etc., but they get ZIP if you don't COMPLETELY meet the needs of the most important short-term need first.
(This is called "Triage theory" according to Dr. Ames.)
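The triage idea above can be sketched as a priority queue: the short-term survival branch gets fed first, completely, and the long-term branches only see the leftover. The branch names and amounts here are invented for illustration, not from Dr. Ames's papers.

```python
# Toy sketch of triage theory: nutrient goes to the highest-priority
# (short-term survival) branch first; lower branches get only what's left.
# Branch names and requirements are invented for illustration.

def triage_allocate(intake, needs):
    """needs: list of (branch, requirement), highest priority first.
    Each branch gets nothing until every branch above it is fully met."""
    allocation = {}
    remaining = intake
    for branch, requirement in needs:
        given = min(remaining, requirement)
        allocation[branch] = given
        remaining -= given
    return allocation

needs = [("short-term survival", 100),
         ("bone maintenance", 50),
         ("longevity", 50)]

print(triage_allocate(120, needs))
# {'short-term survival': 100, 'bone maintenance': 20, 'longevity': 0}
print(triage_allocate(50, needs))
# {'short-term survival': 50, 'bone maintenance': 0, 'longevity': 0}
```

The second call is the half-the-RDA case from above: half of ONE branch is met, and every longer-term branch gets zip.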
Since the RDA is based only on immediate survival functions, the optimum health allowance, let's call it the OHA, is potentially a lot more than the SM or the RDA. We have no idea how much more, or when we'll stop adding stuff to the OHA list.
This would be a great use case for Community Capitalism.
Is it people, or is it their doctors, that are the most gutless, when the doctors won't tell them to just eat falafel, oatmeal, stirfry, and salads, instead of batter fried meat on cheese plus dessert, so that they would maybe not die so much of cancer and everything else?
I heard it's because the doctors think everyone will just ignore them. Blaming the people. Fine, go ahead and frickin' die, people. That's what you prefer? Wow.
Apparently, you have to go to the non-idiot line at the doctor's office and say No, you don't prefer McDonalds to Life Itself and you're actually willing to reconsider a couple diet choices. Then *Maybe* they'll break down out of their standard-American-diet-locked customer mode and tell you, Hey that's actually good because you'll probably not die so young of cancer and heart disease and stuff because of that.
But SSHHHH, don't tell anyone, because it's a secret, not because it's actually a secret, but because some of the people around here are chickenshit and so we seriously can't even talk about it, or we can't even talk seriously about it.
Now you tell me, who is it, the people, or the doctors?
Now, low-carb advocates (like Amber O'Hearn, who says "headache, lassitude, vague discomfort are symptoms of 'rabbit starvation'" (!My Symptoms!)) argue that humans evolved as marrow scavengers, i.e., as fat eaters. The first tool was just a rock, but with a rock an early human, an Australopithecus maybe, could scavenge otherwise inaccessible and preserved marrow and brain by breaking the big bones and intact skulls left at a kill after the vultures were finished.
To this evolutionary phase we can attribute the development (as compared with chimps) of more acidic stomachs, a more ketotic metabolism, less chimp-like raw-fiber digestion, and a preference for fatter (large) prey. And we can infer it supported elaboration of tool use, more eclectic carnivory, fatter bodies -- basically more of ye olde fat-based brain. If your metabolism is fat-burning, that's called ketosis; and did you know, most of the carbon atoms in the brain come into the brain as ketones? Fat = brain food. (Amber, did I get it right?) If so, this evolutionary track helped us smarten up, get bigger brains and fatter bodies, and kickstart the whole human evolutionary cascade.
I do propose that the marrow scavenging preceded the aquatic phase of human evolution, when we lost body fur, got (even) more skin fat, became fully bipedal (according to Elaine Morgan), acquired voluntary breathing control (which necessarily precedes speech), and learned to love swimming. Because marrow-and-brain fat consumption, plus a bit more of our own brains, would tend to lead us toward safer feeding on just-as-plentiful fat in netted or group-herded fish. A puddle off a pond is as good as a net if you have twenty family members splashing in a line to chase the fish into it, and a log to roll across the neck. And nets themselves, a generalization, are only slightly more complex than rocks (and logs and puddles) -- less complex than cloth, for example -- therefore the semi-aquatic phase should come after the marrow-scavenging phase, based on increasing tool complexity.
Also, if you can tolerate picked-over carcasses, you can tolerate stinky fish smells too. I'm going to say it was evolutionarily later that our smell preferences became more fastidious.
Here: My nutritionist said to try intermittent fasting. No late-night snacks plus a late breakfast gives your system 14 hours to reach ketosis and start to clean out the day's bluck. Longevity, okay? Study up, friend; the science is actually starting to come in! And please educate me, too; I don't know everything (for sure), or anything (possibly).