Propellant is still used for rotation control. Reaction wheels can "saturate" if they compensate for rotation more in one direction than the other on net, so propellant is needed to get them back down.
Ion engines, generally speaking, do not use background dust. They still carry propellant, they just eject it electromagnetically.
A photon engine, basically just a laser pointed backwards, uses pure electricity to produce thrust. The numbers do all work out, since photons carry momentum, but such engines are extremely weak: even lasers of staggering power produce very little force. There's no way you could put one on a satellite.
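For a rough sense of scale (a back-of-the-envelope figure, assuming a perfectly collimated beam): photon thrust is just beam power divided by the speed of light, so even a megawatt-class laser gives only millinewtons.

```latex
F = \frac{P}{c} \approx \frac{10^{6}\,\text{W}}{3\times10^{8}\,\text{m/s}} \approx 3.3\,\text{mN}
```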
> Reaction wheels can "saturate" if they compensate for rotation more in one direction than the other on net, so propellant is needed to get them back down.
Torque rods can be used to desaturate the wheels without needing any propellant.
The docs say, "one bit is reserved for the OCaml runtime", so doesn't that mean that one of the bits (likely the high bit) is unavailable for the programmer's use?
I mean, I understand "reserved" to mean either "you can't depend upon it if you use it", or "it will break the runtime if you use it".
So the "one bit" you refer to is what makes the standard int 63 bits rather than 64. If you could do things with it it would indeed break the runtime- that's what tells it that you're working with an int rather than a pointer. But full, real, 64-bit integers are available, in the base language, same goes for 32.
I think you need to re-read some of the comments you are replying to. There is a 64 bit int type: https://ocaml.org/manual/5.3/api/Int64.html
You can use all 64 bits. There are also other int types, with different amounts of bits. For example, 32 bit: https://ocaml.org/manual/5.3/api/Int32.html
No one will stop you. You can use all the bits you want. Just use the specific int type you want.
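For instance (a minimal sketch; the names are just illustrative), the Int32/Int64 modules give you the full bit width, including the top bit:

```ocaml
(* Minimal sketch: int32/int64 literals use the l/L suffixes, and the
   full bit width, including the top (sign) bit, is available. *)
let top_bit_64 : int64 = Int64.shift_left 1L 63   (* only bit 63 set = Int64.min_int *)
let top_bit_32 : int32 = Int32.shift_left 1l 31   (* only bit 31 set = Int32.min_int *)

let () =
  Printf.printf "%Ld %ld\n" top_bit_64 top_bit_32  (* -9223372036854775808 -2147483648 *)
```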
ravi-delia explained that an OCaml int is different from either Int32 or Int64 because an int sacrifices one of its bits to the OCaml runtime. Int32 and Int64 are treated completely differently and are library definitions bolted onto the OCaml runtime.
That is a runtime system not suitable for systems-level programming.
My C experience gave me a fundamental misunderstanding, because there an int is always derived from either a 32- or 64-bit int, depending on the architecture.
OCaml is architected differently. I imagine the purpose was to keep the programs mostly working the same across processor architecture sizes.
I imagine this fundamental difference between OCaml's native int and these more specific Ints is why there are open issues in the library that I'm sure the native int does not have.
Regardless, no one should be using OCaml for systems-level programming.
Thanks for helping me get to the heart of the issue.
The situation is that OCaml is giving you all the options:
(a) int has 31 bits on 32-bit architectures and 63 on 64-bit architectures (which speeds up some operations)
(b) the standard library also provides Int32 and Int64 modules, which support platform-independent operations on 32- and 64-bit signed integers.
In other words: int is different but you always have standard Int32 and Int64 in case you need them.
It seems, therefore, that suitability for systems-level programming should not be decided on this point (although the fact that it is a garbage-collected language can matter depending on the case; note that its garbage collector has proved to be one of the fastest in the comparisons and evaluations done by the Koka language team of developers).
Ok, running this by you one more time. There is a type called "int" in the language. This is a 63-bit signed integer on 64-bit machines, and a 31-bit integer on 32-bit machines. It is stored in 64 bits (or 32), but it's a 63-bit signed integer, because one of the bits is used in the runtime.
There is also a 64 bit integer, called "Int64". It has 64 bits, which is why I call it a 64-bit integer rather than a 63-bit integer. An "int" is a 63-bit integer, which is why I call it a 63-bit integer rather than a 64-bit integer.
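A quick sketch of how that looks on a 64-bit machine (expected values shown in the comments):

```ocaml
let () =
  Printf.printf "max_int       = %d\n" max_int;        (* 4611686018427387903 = 2^62 - 1 *)
  Printf.printf "Int64.max_int = %Ld\n" Int64.max_int  (* 9223372036854775807 = 2^63 - 1 *)
```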
So an int has nothing to do with an Int32 or Int64.
Thanks for your patient elucidation.
This means the semantics for Int32 and Int64 are COMPLETELY different than that of an int. My problem is that I come from the C world, where an int is simply derived from either a 32- or 64-bit integer, depending on the target architecture.
OCaml's runtime is not a system designed for systems-level programming.
Thanks again.
Now I know why the F# guys rewrote OCaml's fundamental int types from the get-go.
The reason the F# guys did things differently from OCaml is not systems-level programming; it's that F# is a language designed for the .NET ecosystem, which imposes specific type constraints. F# was not specifically designed for systems-level programming.
Again, the semantics of int are different, but the semantics of Int32 and Int64 in OCaml are the same/standard. So you have three types: int, Int32, and Int64, and it is a statically typed language.
I mean I guess you could say they have different semantics. They're just different types, int and Int64 aren't any more different from each other than Int64 and Int32. You can treat all of them exactly the same, just like how you have ints and longs and shorts in C and they all have the same interface.
Regardless, I don't think C's "probably 32 bit" non-guarantee is the make or break feature that makes it a systems language. If I care about the exact size of an integer in C I'm not going to use an int- I'm going to use explicit types from stdint. Rust makes that mandatory, and it's probably the right call. OCaml isn't really what I'd use for a systems language, but that's because it has no control over memory layout and is garbage collected. The fact that it offers a 63-bit integer doesn't really come into it.
> int and Int64 aren't any more different from each other than Int64 and Int32
They are, though. Int64 and Int32 only differ in bit length and are in formats native to the host microprocessor. int has one of its bits "reserved" for the OCaml runtime, but Int32 has no such overhead.
> The fact that it offers a 63-bit integer doesn't really come into it.
It does if you're interoperating with an OS's ABI, though, or writing a kernel driver.
But you're right: there are a host of other reasons that OCaml shouldn't even have been brought up in this thread ;-)
Peace be with you, friend. Thanks for so generously sharing your expertise.
> Performance notice: values of type int64 occupy more memory space than values of type int
I just couldn't even imagine that a 64-bit int would require MORE memory than an int that is one bit less (or 33 bits less if on a 32-bit architecture).
It really makes absolutely no sense discussing OCaml as a possible systems-level programming language.
bruh, it's just saying single scalar Int64 values are boxed. This is a totally normal thing that happens in garbage-collected languages. There's no semantic loss.
OCaml does this 63-bit hack to make integers fast in the statistically common case where people don't count to 2^64 with them. One bit is reserved as a tag that tells the GC whether it manages the lifetime of a value or not: in practice it's the low bit, so an int n is stored as the machine word 2n+1, while heap pointers are word-aligned and end in 0.
For interoperating with binary interfaces you can just say `open Int64` at the top of your file and get semantic compatibility. The largest industrial user of OCaml is a quant finance shop that binds all kinds of kernel-level drivers with it.
(and yes, 64-bit non-boxed array types exist as well if you're worried about the boxing overhead)
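A small sketch of both points, using Obj purely for illustration (not something to rely on in real code) and Bigarray for the unboxed storage:

```ocaml
(* int values are immediate tagged words; int64 values are boxed. *)
let () =
  Printf.printf "%b %b\n"
    (Obj.is_int (Obj.repr 42))    (* true: plain int, no allocation *)
    (Obj.is_int (Obj.repr 42L))   (* false: the int64 lives in a heap box *)

(* For bulk 64-bit data, a Bigarray stores the elements unboxed. *)
let buf = Bigarray.Array1.create Bigarray.int64 Bigarray.c_layout 1024
let () = buf.{0} <- 0x7FFF_FFFF_FFFF_FFFFL
```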
This is why we never should have invented the phonograph. People who want to listen to music can just buy a record, making it literally impossible for them to perform an activity humans ENJOY doing. Without it everyone would surely be making all their own music, and nothing valuable would be lost
The ability to record has led to the greatest expansion in musical artistry in human history.
You don't think peasants were listening to Bach, do you? Only the extraordinarily wealthy could afford to have music as anything like an everyday thing.
And with more affordable and easier-to-learn tools, the creation of music will be similarly made much more accessible?
DAWs and virtual instruments running on a regular laptop were one step; generative AI models will be another?
You're debating two different things, two different experiences.
Creation is a human activity, charged with emotions and efforts, which are their own rewards as much as the end-product, which is invested with this human (sometimes collective and not instant) effort and intention and creative loopbacks. Let's call that some kind of history (because the process did happen).
Generation short-circuits that entirely, as it happens at non-human speeds, and non-human scales. It's something _else_ entirely. You do get an end-product. It may be fun and useful for some; it sometimes is. However, you don't get the process, the collaboration and the inner transformation it comes with.
Adding: with two different end-products, the issue is then how they are perceived, received, appreciated and valued by those not "in the know" of how they were made. And that is at once an artistic, aesthetic, and economic problem. Generating soulless shit that isn't invested with human sentiment miseducates people and destroys taste.
I agree with your overall description of creation. But I do not agree that generative models are something else entirely. They are tools, and while their affordances do influence what people do with it, in the end the responsibility is on the creator. You can make "soulless shit" or "thoughtful commentary" or anything else you put your mind to, by using these tools in combination with all the existing ones.
Models that are oriented around one-shot, text-only direction are pretty limiting in creative flow. This will hopefully continue to improve.
To make what I consider a halfway decent song with the current easiest-to-use services (like Suno and Udio) takes a few hours in my experience.
To get there one has to work on the text and the song structure, find a decent style, and then do corrections on sections where the model goes off track.
To make something that is closer to "good", I would go and re-record all the lead vocals myself, and then mix this in a DAW.
The tools and knowledge for making music are already unbelievably accessible. Anyone with an internet connection and a decent computer can read about music theory, learn to use a DAW, and get some basic virtual instruments. The same goes for producing art, which doesn't even require anything digital.
This does not augment the music making process in any way, it simply replaces it with what might as well be a gacha game. There's no low-level experimentation, no knowledge acquisition, no growth, and you can't even truly say you made whatever comes out.
It's not a tool for music creators, it's a tool for people who want slop that's "good enough".
Sure, with several hundred hours to spare one can make some songs in a DAW. Now one can make something as good/bad in maybe a tenth of the time. Or, given the same time investment, one can possibly make something better!
The goal of AI automating labor should be to give us more leisure time to pursue hobbies, not to fill our limited leisure time with low quality substitutes for those hobbies.
Making an activity in which the primary limiting factors for most people are the time, knowledge, and effort required (as opposed to expensive tools) into an effortless slot machine pull is enfeebling to human creativity and agency. Who will spend the hours of making bad music to get to the point where they become good if they can just rely on something else to generate music that's "good enough"?
There's something to be said about all this that I rarely see brought up in relation to AI-generated images: people with specific skills play roles within groups, so when AI makes a hobby they dedicated so much time to more easily accessible, they lose social value, which might make them quit altogether.
The common response that "people should make art because they love it, not for attention" is a prescriptive statement that supposes there are more or less "pure" forms of performing an activity and also ignores that art is a form of communication.
"low quality substitute" and "effortless" are value judgements on your behalf. Many made similar judgements about DAWs and VSTs. And that is your right. But not everyone sees it in the same way - for some generative models are opening up a new world of possibilities.
I agree that the slot machine pull of current models is tedious and boring. I look forward to models/systems which better facilitate more creative control, directed exploration and iterative refinement.
Yes, there are a TON of free tools and endless instruction on using them. If you move your budget up to making one-time payments for things that cost less than one month using a subscription service, you get an astonishing breadth of new options. Beyond that, so many of the more expensive music making tools are one-time payments rather than subscription services. Buy Ableton once? You own it. You can get the latest version at a discount, but there's absolutely nothing stopping you from using the version you bought, in perpetuity.
Lots of common people did listen to Bach, because he wrote many works for church organ. Church attendance was almost universal, and even small churches had (small) pipe organs.
His work was not commonly performed in his lifetime, and I think you're rather proving my own point? Yes, they could perhaps occasionally listen to Bach, if the organist at their church was aware of him (most would not have been, not until hundreds of years later), had the music, were willing to perform it, and you happened to be in attendance when they did. That's a lot of chained ands.
There are like 6 core activities that bind humans together: shared creation of food, myth, and music; cohabitation, protection, child rearing.
We've done these things ourselves for hundreds of thousands of years. As we are increasingly convinced to buy them for convenience, we lose the very things that make us know our connectedness.
So ya, there are real problems caused by the convenience of technology.
People will still enjoy making music. Musicians will make music quite regardless of whether anyone is listening or whether there’s recordings or AI available.
The question wasn't "why do protons have +1 charge", it was "why do protons have +1 charge, *considering electrons have -1 charge". The fact that possible charges are restricted to a few values is a much more satisfying answer to the latter than the former
Not so much for takeoff! Most rocket designs better than chemical rockets trade off thrust for specific impulse. That's an improvement in orbit, since delta-v is delta-v. But imagine a 10kg rocket- it's receiving ~100N of gravity. If your engine doesn't put out 100N of thrust you'll just sit there on the launch pad. As you pick up speed you no longer have to deal with that (after all, LEO has basically the same gravity and doesn't have to burn against gravity at all) but when you're launching off something other than a point mass, some of your thrust has to go towards ensuring you don't hit the planet, or you will not into space today.
The practical designs we have for NTRs are solid core, which after long effort got up to a thrust to weight ratio of 7:1, meaning they could in principle carry up to 6 times their weight and accelerate up in Earth's gravity rather than down. Chemical rockets can get 70:1. No one ever had plans to use NTRs in lift platforms- instead they could serve as more efficient upper stage engines, for orbit-orbit transfer burns and the like. In principle there are engines which are technically NTR and offer much better performance, but no one's ever gotten a working prototype. Also you probably wouldn't want to launch with an open cycle rocket, since the open part describes how the radioactive fuel is ejected out the rear. Unfortunately, with the technology we have, we have to make tradeoffs between efficiency and thrust. For the lift stages chemical rockets are, for now, unrivaled.
(Unless of course your nuclear propulsion is of the more, shall we say, entertaining variety. Project Orion has its proponents...)
When discussing potential alien civilizations, one can’t discount the existence of civilizations which exist on substantially more radioactive planets.
If the background radiation of earth was 100x higher, would we care about an Orion launch? Or a small nuclear exchange…
The more fuel you have to pile onto the rocket, the less the weight of the engine matters.
Using the chart in the accepted answer, launching with chemical engines takes 50 thousand tons at 3x gravity and 3 million tons at 4x gravity.
Now consider a theoretical engine that has a 7:1 thrust to weight ratio at 1G but sips fuel. Take a 25 ton engine, strap 10 tons of fuel to it and 1 ton of payload. Watch it go to orbit on a single stage.
A real NTR doesn't save nearly as much fuel, but it can still be useful in certain ranges.
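For what it's worth, the arithmetic on that hypothetical engine checks out (assuming the 7:1 ratio applies to the engine's own 25 tons):

```latex
\text{thrust} \approx 7 \times 25\,\text{t} = 175\,\text{t-force},\qquad
\text{mass} = 25 + 10 + 1 = 36\,\text{t},\qquad
\text{TWR}_0 \approx 175/36 \approx 4.9
```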
I can't help but think that any species insane enough to use Orion drives in the first stage probably already found a way to blow itself up before it gets to that point.
And maybe I'm taking Terra Invicta too seriously but maybe they would wait until they figure out nuclear fusion and have more options.
It actually bottoms out pretty low, though. The apparent fractal dimension changes across scales, but it's high enough that ruler length really does make a huge difference in measured coastline length.
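The usual way to state this is Richardson's empirical scaling relation, where ε is the ruler length and D the apparent fractal dimension (D ≈ 1.25 is the figure often quoted for Britain's west coast):

```latex
L(\varepsilon) \approx F\,\varepsilon^{\,1-D}
\qquad\Rightarrow\qquad
\frac{L(\varepsilon/2)}{L(\varepsilon)} = 2^{\,D-1} \approx 2^{0.25} \approx 1.19
```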
I think what you're referring to is the false proof suggesting that pi = 4. That is not what squaring the circle is, and the shape does not in fact become smoother and smoother but rougher and rougher.
Sure, but you should only be using space heaters and not heat pumps (or non-electric sources of heat) if you're using them infrequently, at which point the cost of silicon over resistive wire is not worth it.