As a Type 1 Diabetic, I'd love to have a continuous version of this to pair with my Continuous Glucose Monitor, mainly because I'm convinced that glucose output from the liver (from glycogen breakdown) is what's causing a lot of the variability, and it would explain, e.g., early morning fasting increases in blood sugar.
What you're describing is likely the dawn phenomenon - cortisol's morning spike triggers hepatic glucose production, and continuous monitoring could finally give diabetics actionable data to adjust treatment timing accordingly.
I'm curious: did it ever become typical for these studies to publish the data and code? I haven't kept current, but I remember reading studies in the past and they definitely didn't. I'm not necessarily referring to this study or to vaccines specifically, just the general "hey, we did this analysis" papers that don't publish the code and data, even though we have an ongoing replication crisis in science.
On page 8 of the supplemental material, they pasted some R code, at least. Hopefully that code runs once you load the packages they reference. I wish they made it easier to download and start working with the data, though. It’s from a national registry, so I suppose it’s available to those who look/make a request, but I’d like a 100 MB CSV.
But to really answer your question: not really. In fields where Jupyter notebooks are common, those are generally available via a GitHub link, but in medical fields code and data are still relatively difficult to find.
> In any future fusion power plant, a plasma with a high triple product must be maintained for long periods.
I love vague terms like "long periods". Long compared to the Planck time? Geological time? Is the advertised 43 seconds almost there, or "off by 17 orders of magnitude"?
Tokamaks have to operate in pulses. Stellarators can be operated in steady-state (although sometimes they are pulsed to achieve higher energy).
Tokamaks can also be operated in steady-state, at least theoretically. The reason a tokamak is pulsed is that the toroidal current is driven inductively, so there is a limit to how long you can keep increasing the current in the central solenoid. However, there are other current-drive methods, for example neutral beam injection and electron cyclotron current drive. You can even exploit the bootstrap current (self-generated by collisional processes in the plasma) to obtain a nearly 100% non-inductively driven toroidal current (this is called the "advanced tokamak" regime).
Anyway, the older generation of devices was pulsed for engineering reasons (like non-superconducting coils getting too hot). The current generation of devices is solving most of these and is limited by MHD instabilities alone (mostly neoclassical tearing modes); if we can get active control mechanisms working, then we will finally approach the long-pulse or steady-state regime.
> During the record-setting experiment, about 90 frozen hydrogen pellets, each about a millimeter in size, were injected over 43 seconds, while powerful microwaves simultaneously heated the plasma. Precise coordination between heating and pellet injection was crucial to achieve the optimal balance between heating power and fuel supply.
A smooth toroidal magnetic field cannot confine a plasma. The field lines at the outer side (further from the axis) are spread more widely, so the field there is weaker than on the inner side. In a very short time, this causes ions to drift out of confinement at the outer side. The solution is to produce a twisted, helical field, where the field lines wind around the torus both the long way and the short way simultaneously, like the stripes in the bend of a candy cane.
Different reactor designs have different solutions to this. Tokamaks use a central solenoid to drive a strong toroidal current in the plasma. This, in turn, creates a poloidal magnetic field, which provides the second component of the field needed for confinement. But this only works while the magnetic field of the solenoid coil is varying smoothly over time in a single direction. Eventually you hit some limit on your ability to do that, at which point you lose the ability to confine the plasma and the pulse ends.
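To put the 1/R falloff and the solenoid limit in formulas (these are standard textbook relations, nothing specific to this experiment): the toroidal field from the external coils, the loop voltage induced by the central solenoid, and the poloidal field produced by the plasma current go roughly as

\[
B_\phi(R) = \frac{\mu_0 N I_{\mathrm{coil}}}{2\pi R},
\qquad
V_{\mathrm{loop}} = -\frac{d\Phi_{\mathrm{sol}}}{dt},
\qquad
B_\theta(r) = \frac{\mu_0 I_p}{2\pi r}.
\]

Once the solenoid flux can no longer keep ramping, the loop voltage goes to zero, the plasma current (and with it the poloidal field that provides the twist) decays resistively, and the pulse is over.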
Stellarators do not have this issue. They get the full field geometry needed from their primary field, by twisting it around the toroid in a very complex path. The downside is that they are much more difficult to design and build.
I agree vague language in popular press is sometimes annoying.
“Off by 17 orders of magnitude” would be off by about 136 billion years, so not that much, for sure. Assuming you want to be able to test and/or maintain the plant once per year, 43 seconds is less than 6 orders of magnitude off. The jump was more than a full order of magnitude compared to past records, so another handful of such developments and we're there.
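Quick sanity check of those numbers (throwaway arithmetic, nothing from the article):

```python
# Sanity-check the orders-of-magnitude claims above.
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 s
pulse_s = 43.0                           # the record pulse, in seconds

# "Off by 17 orders of magnitude", expressed in years:
print(pulse_s * 1e17 / SECONDS_PER_YEAR)        # ~1.4e11, i.e. about 136 billion years

# How many orders of magnitude short of a once-per-year cycle is 43 s?
print(math.log10(SECONDS_PER_YEAR / pulse_s))   # ~5.9, i.e. just under 6
```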
Even 1 hour of stability with a relatively short restart period (under 5 minutes) would be fine with a battery system assuming the rest of the power plant was cheap enough to build and operate.
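To put rough numbers on that (the 1 GW plant size here is just an assumed figure for illustration):

```python
# Back-of-the-envelope: riding through a 5-minute restart with batteries.
plant_output_mw = 1000.0   # assumed electrical output, purely illustrative
run_hours = 1.0            # stable operation between restarts
restart_hours = 5 / 60     # restart period

availability = run_hours / (run_hours + restart_hours)
bridge_mwh = plant_output_mw * restart_hours

print(f"availability: {availability:.1%}")                          # ~92.3%
print(f"battery needed to cover a restart: {bridge_mwh:.0f} MWh")   # ~83 MWh
```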
Nuclear already gets taken offline for several weeks for refueling, but redundancy covers such issues.
Any fusion reactor that produces masses of free neutrons is uneconomic, because the neutrons are ridiculously corrosive to everything they collide with. Neutron activation produces a mess of radioactive isotopes, some of which decay quickly. It doesn't take long - certainly much less than a year - before you're left with components that no longer do their jobs and are also radioactive.
This is a much less sexy problem than containment, but it's a showstopper for commercialisation. You can just about imagine an epically huge reactor with unfeasibly powerful containment fields that trap fusion in the centre of a large cloud of hydrogen, which captures neutrons to make tritium to power the reaction. But that's completely unbuildable with current tech.
Aneutronic fusion is possible, but it requires even more extreme temperatures, which for now remain purely theoretical.
At this point we've been chasing fusion for more than 70 years, and commercialisation is as far away as ever.
You might as well just build yourself a small star.
Or perhaps even spend all that research money on making better use of the star we already have.
General Fusion claim to get around these issues by having the fusion take place inside a centrifuge of liquid lithium. I'm not knowledgeable enough to judge how plausible their claims are, though.
Long compared to the current generation of experiments. JET pulses lasted a couple of seconds; an actual power plant might be more like a couple of hours, or even steady-state.
From everything I've ever read and my (admittedly old) ISP experience, people really don't use all the bandwidth. Yes, occasionally there are exceptions.
Direct Rendering Manager (DRM) has to be one of the worst possible re-uses of an acronym ever. This is the second time I've seen it used recently without an expansion of the acronym.
It's really convenient for being able to quickly whip up diagrams to share for discussion. Like when you have a small group of folks that need to iterate on a DB design, it's super simple to change things.
I've used this a bunch early in projects where the team needs to agree on DB structure:
1. whip something up as a discussion piece
2. usually a quick meeting to iterate on it, spot problems, correct.
3. you're close enough to build the DB bits (something like the sketch below), ship it. :)
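For step 3, the "DB bits" might end up as something like this; the tables and columns here are invented purely for illustration:

```python
# Hypothetical first cut of a schema agreed on in one of those meetings.
# Table and column names are made up for illustration only.
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    email = Column(String, unique=True, nullable=False)

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    status = Column(String, default="new")

# Good enough to ship a first version and keep iterating on.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```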
As someone who has helped build one of those cookie banners, I will admit that using cookies to store the state of what you consented to is somewhat ironic, or at least meta.
It's got some really interesting parts, though: serving those banners, geolocated (showing the right thing for the legal regime you're in), at actual "web scale lol". And while folks hate the banners, we respect the GPC signal: if it's set, we won't show you the banner at all and will just opt you out of everything.
The alternative would be something like making you log in to save your cookie preferences. There are legitimate reasons to do this, but most people would oppose it. So let me ask for some market feedback:
Would you rather log in to preserve your settings?
Can you think of any other way to (semi-)anonymously identify you and preserve your settings across different sessions?
Note: this is actually a problem I'm thinking about, and really do want to find a workable solution.
> Would you rather log in to preserve your settings? Can you think of any other way to (semi-)anonymously identify you and preserve your settings across different sessions?
The fundamental problem is that 99% of sites with cookie banners don't need cookie banners, because they don't need cookies at all. They say (lie) that it's to "improve your experience", but it's really just for tracking purposes.
If a site has a login, then of course they need some kind of storage, and they can present the cookie consent in the login form. Otherwise, though, it's an imposition with no benefit to the site visitor.
When you say "we respect the GPC signal", do you mean the Sec-GPC header? That would be a bit of an improvement if all sites had to respect the header, but at this point it's still experimental and not widely used or respected.
WRT GPC, yes. We treat GPC as an "essential only" choice and don't show you a banner. We treat DNT (Do Not Track) the same way, but its legal standing seems a lot less clear, especially after the hijinks Microsoft pulled.
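For anyone curious what respecting GPC looks like mechanically: a browser with it enabled sends a "Sec-GPC: 1" request header (and exposes navigator.globalPrivacyControl to scripts). A minimal server-side sketch of the "essential only" treatment, using Flask purely as an example framework rather than our actual stack:

```python
# Minimal sketch: treat a Sec-GPC: 1 header as an "essential cookies only" choice.
# Flask is used only for illustration; this is not our real implementation.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/consent-state")
def consent_state():
    if request.headers.get("Sec-GPC") == "1":
        # Honor the signal: skip the banner, opt out of everything non-essential.
        return jsonify(show_banner=False,
                       categories={"essential": True,
                                   "analytics": False,
                                   "advertising": False})
    # No signal: fall back to showing the banner and asking.
    return jsonify(show_banner=True, categories=None)
```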
And GPC is not experimental: if you don't respect it, you're in violation of California's CCPA/CPRA law [1]. It's likely that other states will fall in line here as well.
I think we have different use cases: I'm trying to help people get compliant && do the right thing. From my perspective "anything that can trigger an uncomfortable conversation with an auditor" is not experimental. And I can tell you from the analytics that a significant portion of the web browsers out there are using it.
It may not have been clear: I work for a SaaS company that sells a "cookie banner", or more properly, consent management; it's quite a bit more complex than just managing cookies.
Actually, if you're managing cookies, you've already lost.