On a much smaller scale, I used to use our CI server to build and deploy our CI server, but if it went down it was a huge pain. Not really worth it for the fuzzy feeling; just make it buildable offline.
I think the self-hosted bit is just for syncing; as long as you have multiple devices, you're not likely to lose data even if you don't follow the 3-2-1 backup rule.
I hope Apple loses and is forced to change their logo into something like a bitten iPhone. The other company makes actual apples, and has for longer than Apple has existed, so they should have the right of way.
Another way of doing this is to assign a Hyper key to Caps Lock with Karabiner and then use that to set up hyper+X shortcuts to different apps or macros using Hammerspoon.
So e.g. I have hyper+a for Alacritty, hyper+b for Browser etc.
Yep, this is the way I use it too. I had QMK shortcuts but later moved to the Hammerspoon approach.
Right now I'm trying this setup: Andweeb's 'Ki Spoon' [1], for a more efficient workflow. It's still under development.
I do the same (though I use spacehammer, which is built on top of Hammerspoon). It's been the biggest game changer for my keyboard-driven productivity. I don't like how non-deterministic command-tab feels, since the order of the apps changes; with a hotkey for each app I know exactly what will show up when I hit the key. And unlike function keys, I don't have to move my hands from the home row, plus it works when I use an external keyboard (which is most of the time).
You can also make your Spacebar (with a tiny delay so it doesn't interfere with typing) an app-launcher key, and then you don't even need to move your poor pinky for such frequent actions:
space+r for browser
space+f for file manager, etc.
This is what I did in stumpwm, and I had a nice macro so I could define an app's command, what key to map it to, what desktop it lives on, and whether jumping to the window or pulling it to me was the default.
The battery won't spin and lose all its charge; it has a breaker mechanism that will stop it, similar to how a seatbelt works: it locks up if it's pulled too hard. This is to prevent damaging a circuit if you forget resistors.
Of course this is a simplified model of how electricity works, but it is still useful, and they document the differences from "real" electricity decently in the books.
In any case it's a super fun toy and I highly recommend getting it!
The ELT approach means you copy raw data from transactional sources, events, and external APIs and dump it into the first “layer” of your data warehouse.
Then you progressively clean it and model it into some format that is easily usable for dashboards or self-service analytics.
To do that you build a DAG of SQL scripts, and dbt makes that easy through templating, macros, and automatically generated docs.
You still have to execute the DAG somehow; here you either use their OSS version and schedule it with anything from GitHub Actions to Airflow, or you buy their Cloud offering.
My definition of low level: no tracing GC, values unboxed by default, and users still have the control to do low-level things (raw pointers, other unsafe operations) when needed, even if that's not the default.
This is a very deep and mature assessment. I have high expectations for the future of Ante. Higher than for Rust, in particular, provided it supports some analog of destructors.
The generally accepted definition of a low level language is a language that provides little or no abstraction from a computer's instruction set architecture. In actuality, C is a lower level functional programming language than Ante, because C functions are first-class citizens.
I like the lack of GC! What is the method for always incremental compilation? The benefits are obvious, but isn't it problematic to have two authoritative representations of the same source? It would be fantastic if you get that to work well!
Fwiw C is actually a huge abstraction over modern hardware.
It's a good representation of the PDP-11 and similar-era computers. It's also a good abstraction for modern microcontrollers.
C's view of the world is also a really poor fit for today's larger CPUs. Your computer has to jump through a ton of hoops to make itself seem C-like. C has no concept of vectorization, speculative execution, branch prediction, multiple cores, caches, MMUs, etc.
For basically any desktop/laptop/smartphone CPU, the C runtime is more like a little VM than an accurate model of the hardware underneath.
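To make that concrete (just a sketch, assuming x86 with SSE and a compiler that ships <immintrin.h>): if you want to sum floats four lanes at a time, nothing in the C language itself expresses that, you have to reach for vendor intrinsics.

    #include <immintrin.h>
    #include <stddef.h>

    /* Sum n floats (n assumed to be a multiple of 4 here) four lanes at a
       time. The __m128 type and _mm_* operations are vendor extensions;
       ISO C has no notion of any of this. */
    float sum4(const float *xs, size_t n)
    {
        __m128 acc = _mm_setzero_ps();
        for (size_t i = 0; i < n; i += 4)
            acc = _mm_add_ps(acc, _mm_loadu_ps(xs + i));

        float lanes[4];
        _mm_storeu_ps(lanes, acc);
        return lanes[0] + lanes[1] + lanes[2] + lanes[3];
    }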
How would you expose speculative execution & branch prediction? Cache behaviour is well understood so what's wrong with it? What would you do to expose MMUs to the programmer, and what would they do with it/how would it help?
[1] but aren't there libraries that expose it just fine in a portable way?
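(To partially answer my own footnote: the closest thing I know of isn't really a library but a compiler extension. Rough sketch, GCC/Clang only, still outside the C standard:)

    /* GCC/Clang generic vector extension: four packed floats. Element-wise
       arithmetic compiles to SSE/NEON/etc. where available, scalar code
       otherwise. */
    typedef float v4f __attribute__((vector_size(16)));

    v4f add4(v4f x, v4f y)
    {
        return x + y;
    }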
So what are the $%^&* semantics we're supposed to build into the language to avoid the need for speculation via OOO exec and branch prediction. Because as a guy interested in languages you might have something to teach me, assuming you have any idea what you're talking about.
Eh, isn't it a bit of a stretch to claim you can program functionally in C? You can mimic it, but without easy closures you can't follow _any_ functional programming patterns that are built on composing functions into new ones in expressions (e.g. binding the operation argument to a fold).
C is great, and it would be cool to see something that is similarly close to the instruction set but has functions as values.
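To make the fold point concrete (just a sketch; fold and add_scaled are made-up names): passing the function is the easy part, but "binding" an argument to it means inventing a context parameter and threading it by hand, because there's no closure to capture it.

    #include <stdio.h>

    /* C lets you pass functions around, but any captured state has to ride
       along explicitly in a context pointer. */
    typedef int (*step_fn)(int acc, int x, void *ctx);

    static int fold(const int *xs, int n, int init, step_fn f, void *ctx)
    {
        int acc = init;
        for (int i = 0; i < n; i++)
            acc = f(acc, xs[i], ctx);
        return acc;
    }

    /* With closures this would be a one-liner partially applying k. */
    static int add_scaled(int acc, int x, void *ctx)
    {
        return acc + *(const int *)ctx * x;
    }

    int main(void)
    {
        int xs[] = {1, 2, 3, 4};
        int k = 10;
        printf("%d\n", fold(xs, 4, 0, add_scaled, &k)); /* prints 100 */
        return 0;
    }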
A functional programming language is a programming language in which functions are first class citizens. C is such a language.
Of course, there are many programming patterns available in more widely accepted functional programming languages that C doesn't support. But whether a programming language is considered functional is not the same question as which patterns the language supports.
Your idea of what FP means is completely nonstandard.
For the record, there is not one accepted definition, but we can get close by saying that FP languages are those based on the lambda calculus as their semantic core. And the primary mechanism in the lambda calculus is variable capture (as done in closures).
C is based on the von Neumann model and has absolutely nothing to do with the lambda calculus. No reasonable PL expert considers it functional.
There's a lovely Perlisism for this: "A programming language is low level when its programs require attention to the irrelevant." Which is interesting in this context because it's clearly not a good fit for Ante, where it looks like many traditional features of "low-level" languages no longer require such attention! So maybe it's not Perlis-low-level; maybe it's "machine-oriented"?