"Keep in mind that the 5th Congress did not really need to struggle over the intentions of the drafters of the Constitutions in creating this Act as many of its members were the drafters of the Constitution."
"Clearly, the nation's founders serving in the 5th Congress, and there were many of them, believed that mandated health insurance coverage was permitted within the limits established by our Constitution."
This seems like a fallacy of composition, used to try to persuade the reader. By my rough count, just 6 of the original founders who signed the Constitution were still in Congress at the time, or just 18% of the signers[1]. There's no roll call vote that I can find, but signer Charles Pinckney had voiced general opposition and thought "it only reasonable and equitable that these persons pay for the benefit for which they were themselves to receive, and it would be neither just nor fair for other persons to pay it"[2].
"And when the Bill came to the desk of President John Adams for signature, I think it’s safe to assume that the man in that chair had a pretty good grasp on what the framers had in mind."
This just points to the same argument that's always being made between spirit-of-the-law and letter-of-the-law proponents: only ~4% of the 5th Congress were signers of the Constitution, and we don't even know how they voted on this. So ~96% of Congress were in essentially the same spirit-vs-letter dispute that we're in today.
[1] https://www.constitutionfacts.com/content/constitution/files... [2] https://www.congress.gov/annals-of-congress/page-headings/5t...
This is awesome, congratulations. I'm glad to see some text-to-sql models being created. Shameless plug: I also just realized you used NSText2SQL[1], which itself contains my text-to-sql dataset, sql-create-context[2], so I'm honored. I used sqlglot pretty heavily on it as well.
Do you think a 3B model might also be in the future, or something small enough that can be loaded up in Transformers.js?
I also think this is the route we're headed: a few 1-7B or 14B-param models that are very good at their tasks, stitched together with a model that's very good at delegating. Huggingface has Transformers Agents, which "provides a natural language API on top of transformers: we define a set of curated tools and design an agent to interpret natural language and to use these tools".
Some of the tools it already has are:
Document question answering: given a document (such as a PDF) in image format, answer a question on this document (Donut)
Text question answering: given a long text and a question, answer the question in the text (Flan-T5)
Unconditional image captioning: Caption the image! (BLIP)
Image question answering: given an image, answer a question on this image (VILT)
Image segmentation: given an image and a prompt, output the segmentation mask of that prompt (CLIPSeg)
Speech to text: given an audio recording of a person talking, transcribe the speech into text (Whisper)
Text to speech: convert text to speech (SpeechT5)
Zero-shot text classification: given a text and a list of labels, identify to which label the text corresponds the most (BART)
Text summarization: summarize a long text in one or a few sentences (BART)
Translation: translate the text into a given language (NLLB)
Text downloader: to download a text from a web URL
Text to image: generate an image according to a prompt, leveraging stable diffusion
Image transformation: modify an image given an initial image and a prompt, leveraging instruct pix2pix stable diffusion
Text to video: generate a small video according to a prompt, leveraging damo-vilab
It's written in a way that allows the addition of custom tools so you can add use cases or swap models in and out.
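To give a feel for it, here's a minimal sketch of instantiating an agent and registering a custom tool. This isn't code I've shipped; it assumes the transformers >= 4.29 Agents API, and the StarCoder endpoint, the word-counting tool, and the prompt are all placeholders.

```python
# Minimal sketch of Transformers Agents with a custom tool (illustrative only).
# Assumes transformers >= 4.29; the endpoint, tool, and prompt are placeholders.
from transformers import HfAgent, Tool

class WordCountTool(Tool):
    # Hypothetical custom tool: counts the words in a piece of text.
    name = "word_counter"
    description = "Counts the number of words in a piece of text and returns the count."
    inputs = ["text"]
    outputs = ["text"]

    def __call__(self, text: str) -> str:
        return str(len(text.split()))

# The agent turns a natural-language request into code that calls the available tools.
agent = HfAgent(
    "https://api-inference.huggingface.co/models/bigcode/starcoder",
    additional_tools=[WordCountTool()],
)

article = "Transformers Agents lets you chain curated tools with natural language."
result = agent.run("Summarize the text, then count the words in the summary.", text=article)
print(result)
```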
I like the analogy to a router and local Mixture of Experts; that's basically how I see things going, as well. (Also, agreed that Huggingface has really gone far in making it possible to build such systems across many models.)
There's also another related sense for which we want routing across models for efficiency reasons in the local setting, even for tasks for the same input modalities:
First, attempt prediction with small(er) models; if the constrained output isn't sufficiently high-probability (under a well-calibrated confidence estimate), route to progressively larger models. If the cascade is exhausted, kick it to a human for further adjudication/checking.
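As a rough sketch of that cascade (the stage names, predict callables, and thresholds below are just placeholders), the control flow is a loop over models of increasing size:

```python
# Rough sketch of the confidence cascade described above (illustrative only).
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Stage:
    name: str
    predict: Callable[[str], Tuple[str, float]]  # returns (label, calibrated confidence)
    threshold: float                             # minimum confidence to accept this stage's answer

def cascade(text: str, stages: List[Stage]) -> Tuple[str, str]:
    """Try progressively larger models; hand off to a human if none is confident enough."""
    for stage in stages:
        label, confidence = stage.predict(text)
        if confidence >= stage.threshold:
            return label, stage.name
    return "NEEDS_HUMAN_REVIEW", "human"

# Hypothetical usage with two classifiers of increasing size:
#   small = Stage("small-model", small_predict_fn, threshold=0.90)
#   large = Stage("large-model", large_predict_fn, threshold=0.80)
#   label, decided_by = cascade("some input text", [small, large])
```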
This shows up as zero for me but the badge site says I've used it twice. It's pretty easy to manually check since I haven't made too many comments. I suspect the badge site is fuzzy matching, since "luck" or some variation has appeared twice (now three times).
A reason I like it is that I have an "older" AMD GPU that's no longer supported by ROCm (sort of AMD's version of CUDA). That means running locally I'm either trying to figure out older ROCm builds to use my GPU and running into dependency issues, or using my CPU, which isn't that great either. But with WebGPU I'm able to run these models on my GPU, which has been much faster than using the .cpp builds.
It's also fairly easy to route a Flask server to these models with websockets, so with that I've been able to run Python, pass data to the model to run on the GPU, and pass the response back to the program. Again, there's probably a better way, but it's cool to have my own personal API for an LLM.
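Here's a stripped-down sketch of the kind of relay I mean, not my actual code: it assumes the flask-sock extension, a single browser tab connected at a time, and a simple queue handoff; the route names and message format are made up.

```python
# Illustrative relay: local Python POSTs a prompt, the browser tab running the
# WebGPU model pulls it over a websocket and sends the completion back.
import queue
from flask import Flask, request, jsonify
from flask_sock import Sock

app = Flask(__name__)
sock = Sock(app)

prompts = queue.Queue()    # prompts waiting for the in-browser model
responses = queue.Queue()  # completions coming back from the browser

@app.route("/generate", methods=["POST"])
def generate():
    # Local Python code POSTs a prompt here and blocks until the browser replies.
    prompts.put(request.json["prompt"])
    return jsonify({"response": responses.get()})

@sock.route("/llm")
def llm_bridge(ws):
    # The browser tab keeps this socket open: it takes each prompt, runs it on
    # the GPU via WebGPU, and sends the completion text back.
    while True:
        ws.send(prompts.get())
        responses.put(ws.receive())

if __name__ == "__main__":
    # The dev server is threaded by default, which is enough for this sketch.
    app.run(port=5000)
```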
I just finished augmenting some of the Spider and WikiSQL data on huggingface[1]. I initially intended to train a text-to-sql LLM that would take a natural language question and be provided the CREATE TABLE statements, to give the response some grounding. So instead of hallucinating column and table names from the question alone, I was hoping the CREATE TABLE statement would limit the choices. We'll see if it's useful or not, but funny enough I came across this article right after I finished the dataset.[2]
I'd definitely like to see how others are doing it.
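For comparison's sake, the grounding idea is just: put the CREATE TABLE statement into the prompt next to the question, so the model can only reference real tables and columns. The template below is illustrative, not the exact format used in the dataset.

```python
# Illustrative only: a simple prompt template for schema-grounded text-to-sql.
# The template wording and example schema are assumptions, not the dataset's format.
def build_prompt(question: str, create_table_sql: str) -> str:
    return (
        "Given the following schema, write a SQL query that answers the question.\n\n"
        f"Schema:\n{create_table_sql}\n\n"
        f"Question: {question}\n"
        "SQL:"
    )

schema = "CREATE TABLE head (age INTEGER, name TEXT, department TEXT)"
print(build_prompt("How many heads of departments are older than 56?", schema))
```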
I've got it running on Chrome v113 beta on Ubuntu with an older AMD RX 580. The feature flags don't seem to take for me in the Chrome GUI, but if you start Chrome from the terminal like this, it works.
"Clearly, the nation's founders serving in the 5th Congress, and there were many of them, believed that mandated health insurance coverage was permitted within the limits established by our Constitution."
This seems like a fallacy of composition, and done so to try and persuade the reader. By my rough count, just 6 of the original founders that signed the Constitution were still in Congress at this time, or just 18% of the signers[1]. There's no roll call vote that I can find, but signer Charles Pinckney had voiced general oppositions and thought "it only reasonable and equitable that these persons pay for the benefit for which they were themselves to receive, and it would be neither just nor fair for other persons to pay it"[2]
"And when the Bill came to the desk of President John Adams for signature, I think it’s safe to assume that the man in that chair had a pretty good grasp on what the framers had in mind."
This just points to the same argument that's always being made between Spirit vs Letter of the law proponents, ~4% of Congress during the 5th Congress were signers of the Constitution and we don't know how they even voted on this. So ~96% of Congress were basically in the Spirit vs Letter dispute that we're in today.
[1] https://www.constitutionfacts.com/content/constitution/files... [2] https://www.congress.gov/annals-of-congress/page-headings/5t...