Clearly the daily and yearly solar cycles influence life in many obvious ways. Life evolved in the presence of lunar cycles as well, so it wouldn't surprise me if they have some (perhaps less obvious) level of influence.
There are definitely some simpler animals that have strong links between behavior and moon cycle. Lots of amphibians will mate or lay eggs only during a full moon for example.
There are also definite impacts from daylight cycles on humans, so it's at least plausible that there are some internal clocks influenced by the lunar cycle, particularly for people who actually see the moon regularly. This signal should be all kinds of noisy with indoor living and lighting; I rarely see the moon at all, much less sleep outside under the different levels of nighttime illumination.
But let's be pretty cautious about the superstition feeling lots of people have, plenty of anecdotes around here about folks like nurses saying they notice patterns... let's get some real data like gunshot victims or psych ward admissions vs lunar cycle instead of anecdotes about people's impressions.
And the literature about how sleep affects mental health is legion.
So it's not at all nonsense, though the effect size might be small. You might not see it at all in hospitals because hospital lighting isolates you from the moon.
It is really easy to be swayed by anecdotes and equally easy to be swayed by a little scientific evidence to the contrary.
This is interesting. I used to work in schools and the teachers swore the swelling of the full moon was a harbinger of bad behavior. It was a predominantly female cohort. I wonder if it was the onset of menses sensitizing them to aggravations, or students misbehaving, or a combination of the two.
Anyways, astrology has been kicking around for a while. Of course I wouldn't say it's the whole truth, but I expect there's something to it. Radioactive hotspots being occluded. No doubt we have a biological equivalent of bit flips. Sprinkle in some butterfly effect, voilà.
I do believe there were studies indicating disparate behaviors from winter-born people and their summer counterparts. Seasonality and the way that people interact with their environment in one vs another, the sensitivity to initial conditions, epigenetics...
There may be more to it than meets the eye; our ancestors may just have come up with an overfit model of an elaborate multivariate system, one that has since been perverted.
The article states that some women synchronize with moon cycles, and even ventures the skeptical speculation that premodern women may have been explicitly attuned. Just putting two and two together. I know I'd be irritable if I had to deal with that kind of pain and the various other symptoms that come with it - certainly not trying to assign blame; after all, most of the "trouble" was always trivial. And it may not be my cohort directly, but various other people. It's not hard to imagine knock-on effects. Though I wouldn't discount that the causality may lie elsewhere, if it isn't outright bias.
More curtly, yes, I'm not implying but outright stating on the basis of the article.
The article is referencing a paper that says that sometimes women's cycles synced with lunar luminosity. They were looking at women whose cycles weren't 28 days over decades. I have my doubts.
Just to head it off in case GP or anyone else wonders - menstrual cycles don't actually synchronize. When people see them sync, it's probably just the birthday paradox.
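The birthday-paradox intuition is easy to check with a quick simulation. This is just a sketch with illustrative parameters I've made up (independent 28-day cycles, 5-day periods, uniformly random start days), not a physiological model:

```python
import random

def chance_of_apparent_sync(n_people=8, cycle=28, period_len=5, trials=10_000):
    """Estimate how often at least two people in a group have
    overlapping periods purely by chance, assuming independent,
    uniformly random cycle start days (illustrative parameters)."""
    hits = 0
    for _ in range(trials):
        starts = [random.randrange(cycle) for _ in range(n_people)]
        # Two periods overlap when the circular distance between
        # their start days is less than the period length.
        overlap = any(
            min((a - b) % cycle, (b - a) % cycle) < period_len
            for i, a in enumerate(starts)
            for b in starts[i + 1:]
        )
        hits += overlap
    return hits / trials

# With 8 housemates, some "synchronized" pair is near-certain by chance alone.
print(chance_of_apparent_sync(8))   # typically well above 0.99
print(chance_of_apparent_sync(2))   # a lone pair overlaps roughly 1 time in 3
```

Any two independent cycles land within 4 days of each other about 9/28 of the time, so in any group of more than a few people, an apparently synced pair is the expected outcome, not a surprise.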
Nurses have something similar. At the hospital my wife works at, they say full moons are always when the patients are the craziest. Lots of nurses there also call out when it's a full moon.
My late mom was a labor and delivery nurse and worked nights. She’d always talk about how many babies are born on a full moon, or say things like “ugh, it’s gonna be such a busy shift, it’s a full moon.”
I believe it. I usually feel a bit “manic” and have more trouble sleeping during full moons
Rotton, J., & Kelly, I. W. (1985). Much ado about the full moon: A meta-analysis of lunar-lunacy research. Psychological Bulletin, 97(2), 286–306. https://doi.org/10.1037/0033-2909.97.2.286
Although this meta-analysis uncovered a few statistically significant relations between phases of the moon and behavior, it cannot be concluded that people behave any more (or less) strangely during one phase of the moon than another.
Unfortunately, the lunar hypothesis is so general that it can handle any result. It is not enough to show that lunacy is more likely to occur during full phases of the moon than other phases, as folklore and superstition suggest.
Adopting a more limited but common view of lunar influences, one must conclude that evidence for the lunar hypothesis fails to pass three crucial tests. The first and most important is replicability. For every study that has recorded more lunacy when the moon is full, another has recorded less. A second test, which should be regarded as a substitute for replicability, is statistical significance. Even when a large number of results are combined, one cannot reject the null hypothesis of no relation between phases of the moon and behavior at conventional levels of significance. A third and closely related test is predictability. Given the task of predicting people's behavior, knowing the moon's phase reduces our uncertainty by less than 1%.
Taking a cue from this review's title, some may conclude that all we have written is "much ado about nothing." Because the topic borders on the paranormal, it is easy to dismiss lunar-lunacy research as ridiculous (Levy, 1975) and unworthy of serious consideration. However, psychologists need to take the topic seriously because, if for no other reason, so many people believe that their behavior is influenced by the moon. For example, Angus (1973) found that 74% of the nurses working in one psychiatric setting believed that the moon affects mental illness.
It's honestly derogatory. I would posit this type of research is strongly motivated by the desire of certain individuals to dispel outright any conception of superstition, à la hard-line atheists. They're not intent on observation and development of understanding, but rather destruction.
Again, I would push that ancestral explanations tend to hold a kernel of truth, and with it value. However, they're very much Fisher-Price building blocks; moreover, they're nth-hand and have been prostituted into industries and perverted in the process. Sensitivities have declined, people are reactive, and much of the world has grown averse to such enchantment, so motive for a Hawthorne effect exists. Examining behavioral spectra as they relate to seasons, tides, and lunar/solar cycles is, I'm sure, increasingly noisy, to the point that the effect's appearance has almost entirely abated. Lest we forget, our past was exclusively environmentally determined. Sedentary cultivars of humans have only existed for what is essentially the blink of an eye on the evolutionary timescale, and we have only had the probe of formal psychology for an equivalent of a microsecond, and it proves unwieldy even now.
And we wouldn't be special in such a lunar/solar/seasonal regime.
What I would say is that it shouldn't be discounted out of hand, and that is all.
When my kids were born, there were no beds in the infant ward or ER, because they were swamped with kids being born. We had preemies so we stayed longer, and over the next few days there were almost none.
They were born on a full moon. The ER nurses swore this was common.
I have also run into this, but it was a Nigerian man. I've heard of similar beliefs among men (and women) in certain parts of East Asia, and I'm just a dilettante with little cross-cultural insight. Given how widespread the lunar calendar was across entire cultures before the globalized economy (and indeed remains today), and combined with my anecdotal observations providing counter-evidence, I'd be very skeptical of any claims that suggest a dominating force of gender or phenotypic sex.
Unfortunately I can't even access the article to see what the research actually says.
The "star signs" thing probably has a kernel of truth to it. Someone born in winter will have a different early life to someone born in summer. In a modern context this carries through to schooling: they'll either be older or younger than their year-group peers.
That seems more a function of school-year cutoff timing (e.g. you must be 5 by Oct 1 to attend kindergarten, or whatever) than of when someone is actually born.
My girlfriend is a general practitioner and used to work as an out-of-hours service doctor. She swore (and her colleagues agreed) that full-moon nights and new-moon nights were the worst: many more cases, more serious ones, and more strange/deranged/dangerous people.
So we did a pseudo-statistic research, taking notes on number of patients and severity of the illness. On full-moon nights and new-moon nights there were:
- 40% more calls
- 20% more requests for urgent care
- 35% more requests from drunks, drug addicts and mentally ill people
Also, the vast majority of situations in which some form of verbal abuse, violence or sexual assault occurred (always verbal in her case, thankfully, but other colleagues were not so lucky) happened on those nights or the day before or after.
This "research" lasted a year, 3 nights per week. We asked to expand it into her PhD, but no other doctor was willing to take part, and my girlfriend was too horrified by her experience as an out-of-hours service doctor to keep going, so it died there.
> was a similar reduction in sleep on those nights in many of the undergraduates in Seattle, a large city where artificial light drowns out moonlight and students often have no idea when the full moon even is.
Can someone who actually lives in Seattle weigh in on whether the article was right to so quickly dismiss moonlight as the explanation?
I live in a suburban neighborhood near a sports stadium where the lights are on either all night or at least as late as I ever stay up. Between street lights, the stadium, and bright LEDs from my neighbors, light pollution is high enough that I can't see most constellations, just a scattering of a few dozen stars at most on a good night.
And yet I can always tell when the full moon is out because it shines like a beacon around the edges of my blackout curtains in a way that none of the artificial light in the neighborhood does.
I've never lived in downtown Seattle (or any large downtown), but I'm having a hard time imagining light pollution so bad that a full moon would be completely imperceptible at even an unconscious level. It seems to me that even subtle changes in light levels are a much more likely explanation than humans having gravitational senses.
There's also a lot less visible sky in a lot of places in a city—it only takes four or five stories to limit your view and block the moon a lot of the time. And in my experience, it's less the brightness of lights than the sheer volume of light from a city that reduces the apparent brightness of the moon and stars. I'm always amazed how you can see the halo of a city over the horizon at night just from its lights.
But has that been your experience with the full moon specifically? That thing is surprisingly bright—I'm thinking less about direct visibility of the orb than I am about just the general illumination in the area.
And again—the effect size is much smaller for the urban group than the non-urban groups, so I'm not suggesting that the light is starkly noticeable if you're not looking for it, just that it's probably detectable by the human eye and processed from there unconsciously.
In my mind you'd have to literally be unable to detect the difference in light between full moon and no moon (with a sensor with similar sensitivity to our eyes) before speculating about human gravitational senses becomes a rational move. We know we have eyes, and we know that what they're wired up directly to frequently surprises us.
I live in the urban part of Seattle. The moon was full the other night and I definitely noticed. With that being said, the effect is not as noticeable as in areas with no light pollution.
Right, I would definitely expect it to be a lower impact. I've spent nights in rural Wyoming and the full moon makes a huge difference there compared to my suburb, and I'd assume a similar effect going from suburb to downtown.
The weird thing about this piece is that the data explicitly shows that the effect size is much smaller for the urban population than for the rural, which suggests that the mechanism that triggers it is duller for them but not absent.
Going from "it's Seattle" to "so the moon is invisible" to "so maybe humans can sense gravity" was a set of violent enough leaps of logic to leave my head spinning.
It'd be pretty surprising if the moon was found to have no impact whatsoever considering how strongly it's impacted many (most?) cultures. A lot of economic discourse is bullshit but it doesn't stop it from strongly impacting peoples' behavior in ways a third party might find irrational, for instance.
> Most cultures evolved prior to cheap artificial light:
Doesn't this predate known cultures by like a million years? Humans are already noted to have adapted to some life after dark in ways other primates have not, thought to be related to our penchant for starting and maintaining fires. It's also not clear at all that the switch to artificial light is irrelevant to our behavior: I think we have decent evidence that working a graveyard shift affects most people's health negatively.
"cheap" was an important word in that quote. Look again at that graph I linked to.
In 1303, a million lumen-hours cost £40,475 in year 2000 inflation adjusted GBP, or 24.71 lumen-hours per pound. Before the industrial revolution, almost everyone was earning what is now considered "abject poverty", which was about "a [US] dollar a day" in 1996. This is (I think, approximately) £0.66 in 2000 money, which is enough for 13.31 lumen-hours per day if that's all they spend their money on… but IIRC 90% of the entire GDP was food people had to eat to not die, so realistically it was going to be 1.33 lumen-hours per day if they made it their main goal in life and had no other interests.
The light of a full moon on a clear night is up to 0.3 lumen per square meter, which means that the average person could afford to light a single bed (0.9 by 1.9 meters, 1.71 m^2) to the intensity of a full moon (0.3 lux * 1.71 m^2 ~= 0.5 lumen) for up to 2 hours 40 minutes per day… but even then only if the bed was fully boxed in (not the case in the examples I've seen) or if they had the optics to focus the light only on the bed (which they didn't), and a more realistic room size of 4 m^2 would have reduced the duration they could afford moon-equivalent light levels (0.3 lux * 4 m^2 = 1.2 lumen) to 66 minutes per day.
Again, this would be for someone with an obsessive need to have as much light as they could get and thus spent every spare penny on candles or firewood, not a normal person.
And it only gets better by a factor of about 3 between then and 250 years ago.
That graph stops in 2006 at £2.67 per million lumen-hours, before LEDs were taking over from CFL as the efficient light source, and even that price means that if you want to fill a 10 m^2 room with simulated full direct sunlight (100,000 lux), that's 1 million lumen, which is £2.67/hour, at 12 hours per day it would cost about £11,694.60 per year — most people would think that's weird (including until recently the police, who would ask questions about which plants you were growing), but it illustrates the scale of the transformation.
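The arithmetic above is easy to reproduce. Here's a quick sketch using only the figures quoted in this thread (the 1303 price in year-2000 GBP, the 1.33 lumen-hour daily budget, 0.3 lux for full moonlight, and the 2006 price); none of these numbers are independently verified here:

```python
# Reproduce the historical lighting arithmetic from the comment above.

cost_1303 = 40_475.0                        # £ per million lumen-hours, 1303
lmh_per_pound = 1_000_000 / cost_1303
print(round(lmh_per_pound, 2))              # ≈ 24.71 lumen-hours per £

daily_budget = 1.33                         # lumen-hours/day left after food
full_moon_lux = 0.3                         # lumen per square metre

bed_lumens = full_moon_lux * (0.9 * 1.9)    # single bed ≈ 0.51 lumen
print(round(daily_budget / bed_lumens, 1))  # ≈ 2.6 hours of moon-bright bed

room_lumens = full_moon_lux * 4.0           # 4 m² room → 1.2 lumen
print(round(daily_budget / room_lumens * 60))  # ≈ 66 minutes per day

cost_2006 = 2.67                            # £ per million lumen-hours, 2006
# 10 m² at 100,000 lux needs 1,000,000 lumen, i.e. £2.67 per hour:
print(round(cost_2006 * 12 * 365, 2))       # ≈ £11,694.60 per year at 12 h/day
```

The numbers check out against the comment's figures, which is the point: even at 2006 prices, holding indoor light at daylight intensity all day is a five-figure annual expense, while in 1303 even moonlight-equivalent lighting was a luxury measured in minutes.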
> It's also not clear at all that the switch to artificial light is irrelevant to our behavior: I think we have decent evidence that working a graveyard shift affects most people's health negatively.
Indeed, though I'm suggesting that in comparison to that, any impact from the moon is going to be noise.
https://archive.ph/7ztie