Subversion works fine over WebDAV; it has for decades.
When you mount a directory through NFS, SMB, or SSH, files are also downloaded in full before programs access them. What do you mean?
Listing a directory or accessing file properties, like size, does not require a full download.
I am confused, what do you mean? What OS forces you to download the whole file over NFS or SMB before serving read()? Even SFTP supports reading and writing at an offset.
If I open a document over NFS with, say, LibreOffice, will I not download the whole file?
On second thought, I think you are looking at WebDAV as sysadmins, not as developers. WebDAV was designed for document authoring, and you cannot author a document, version it, merge other authors' changes, or track changes without fully controlling the resource. Conceptually, it is much like how git needs a local copy.
I can't imagine an editor editing a file while the file is being changed at any offset, at any time, by some unknown agent, without any kind of orchestration.
If you open a file with LibreOffice, it will read the whole thing regardless of whether the file is on NFS.
The parent comment was stating that if you use the open(2) system call on a WebDAV-mounted filesystem, the entire file will be downloaded locally before that system call completes, even though open(2) itself doesn't perform any read. This is not true for NFS, which has more granular access patterns using the READ operation (e.g., READ3) and file locking operations.
It may be the case that you're using an application that isn't LibreOffice on files that aren't as small as documents -- for example, if you wanted to watch a video via a remote filesystem. If that filesystem is WebDAV (davfs2), then before the first piece of metadata can be displayed the entire file is downloaded locally, whereas with NFS each 4 KiB (or whatever your block size is) chunk is fetched independently.
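The access pattern in question can be sketched from userspace. This is a rough illustration, not a claim about any specific mount: os.pread reads a byte range at an offset, which on NFS maps to a READ RPC for just that range, while davfs2 downloads the whole file before open() even returns. The temp file below merely stands in for a file on a network mount.

```python
import os
import tempfile

# Create a stand-in "remote" file: 8 KiB of padding, then a payload.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 8192 + b"payload")
    path = f.name

# Read 7 bytes starting at offset 8192 without touching the rest of
# the file. On NFS this is one partial READ; on davfs2 the open()
# above would already have pulled down the entire file.
fd = os.open(path, os.O_RDONLY)
chunk = os.pread(fd, 7, 8192)
os.close(fd)
os.remove(path)
print(chunk)  # b'payload'
```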
But many other clients won't. In particular, any video player will _not_ download the entire file before accessing it. And for images, many viewers start showing the image before the whole thing is downloaded. And to look inside zip files, you don't need the whole thing, just the index at the end. And for music, you stream the data...
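The zip case is easy to demonstrate: the table of contents lives in the "end of central directory" record at the tail of the archive, so any client that can read byte ranges (NFS, SMB, or HTTP Range requests) can list entries without fetching the file bodies. A small sketch, using an in-memory archive as a stand-in for a remote one:

```python
import io
import zipfile

# Build a zip with one large and one small entry.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("a.txt", "x" * 100_000)
    z.writestr("b.txt", "hello")
data = buf.getvalue()

# Pretend we can only fetch the last 1 KiB of the remote file.
tail = data[-1024:]
eocd = tail.rfind(b"PK\x05\x06")  # end-of-central-directory signature
# The EOCD record stores the total entry count at offset 10 (u16, LE).
count = int.from_bytes(tail[eocd + 10 : eocd + 12], "little")
print(count)  # 2
```

A real remote client would fetch that tail with a single ranged read, then follow the central-directory offset in the same record to list the entry names.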
Requiring that a file be "downloaded in full before programs access it" is a pretty bad degradation in a lot of cases. I've used SMB, NFS, and sshfs, and they all let you read any range of a file and start returning data immediately, without a full download.
That's the beauty of working with WebDAV, also captured vividly in the article above: any particular server/client combination feels no obligation to act the way the standards prescribe, or to make use of the facilities available.
I might be wrong, but when I last mounted WebDAV from Windows, it did the same dumb thing too.
This is the website of a shrimp farm in the interior of Spain. It has been operating for some years now. They do not taste like wild shrimp, but they are OK. https://norayseafood.es/en/
Are those Macrobrachium? The freshwater river prawn? I can't find anything on the site, but I doubt they're doing the world's largest saltwater aquarium...
Unless they grow multiple species, it's Pacific white shrimp [1], which seems to be a saltwater species. Also, the pictures do not look like Macrobrachium.
Wonder if that means they're just using stock photos of shrimp. The other guy wasn't lying when he said "interior"; it's about as far from the sea as you can get in Spain. I thought all the non-coastal fish-farming operations were doing freshwater species.
So true. I find this to be more proof that altruistic collaboration beats any other model, although users may not perceive it as such, or there is no interest in spreading these facts.
Altruistic? DARPA is a military agency, and ARPANET was a prototype network designed to survive a nuclear strike. I think the grandparent comment's point is that the innovation was government-funded and made openly available, neither of which depends in the slightest on it being altruistic.
> The CYCLADES network was the first to make the hosts responsible for the reliable delivery of data, rather than this being a centralized service of the network itself. Datagrams were exchanged on the network using transport protocols that do not guarantee reliable delivery, but only attempt best-effort [..] The experience with these concepts led to the design of key features of the Internet Protocol in the ARPANET project
Keeping with the theme of the thread, CYCLADES was destroyed because of greed:
> Data transmission was a state monopoly in France at the time, and IRIA needed a special dispensation to run the CYCLADES network. The PTT did not agree to funding by the government of a competitor to their Transpac network, and insisted that the permission and funding be rescinded. By 1981, Cyclades was forced to shut down.
> Rumors had persisted for years that the ARPANET had been built to protect national security in the face of a nuclear attack. It was a myth that had gone unchallenged long enough to become widely accepted as fact.
No, the Internet (inclusive of ARPANET, NSFNet, and so on) was not designed to survive a nuclear war. It's the worst kind of myth: One you can cite legitimate sources for, because it's been repeated long enough even semi-experts believe it.
The ARPANET was made to help researchers and to justify the cost of a mainframe computer:
> It's understandable how it could spread. Military communications during Nuclear War makes a more memorable story than designing a way to remote access what would become the first massively parallel computer, the ILLIAC IV. The funding and motivation for building ARPANET was partially to get this computer, once built, to be "online" in order to justify the cost of building it. This way more scientists could use the expensive machine.
That's a valiant attempt at myth-fighting, but it doesn't fully convince me. For example, one hop to Wikipedia gives this:
> Later, in the 1970s, ARPA did emphasize the goal of "command and control". According to Stephen J. Lukasik, who was deputy director (1967–1970) and Director of DARPA (1970–1975):
> "The goal was to exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making."
> That's a valiant attempt at myth-fighting, but it doesn't fully convince me. For example, one hop to Wikipedia gives this:
And in that same Wikipedia section there are 3-4 other people, including Herzfeld, the director who authorized the actual start of the project, who say otherwise.
Meanwhile, you cherry-pick 1-2 paragraphs at the end, while there are over a dozen that say the opposite. Note that Lukasik was later the director of DARPA, which had a completely different mandate than (no-D) ARPA.
> In many parts of Europe a couple of strikes would have taken place. Americans are numb or have no social conscience and muscle.
Americans are not numb. And we have both of those things, or at least most of us do. What we don't have are strong unions. Less than 10% of American workers are union members.
What's more, for as long as there have been unionization efforts in the US (150+ years), corporations, aided by the government, have suppressed those efforts, for a long time brutally and fatally, and such suppression (although generally not with clubs and guns these days) continues to this day, often with government support.
As such, even a "general strike" would have minimal effect, as many union members are forbidden by law to strike (in various jurisdictions, police, fire departments and others).
The vast majority of US workers are "at will" workers whose employment can be terminated at any time, for any (or no) reason. Workers can do so just the same as employers, so it's definitely fair!
More than half of Americans can't afford an unexpected $600 expense. As such, how many do you expect to essentially quit their jobs (possibly multiple jobs) to join a "general strike"?
Labor and unions don't work in the US the same way they do in, say, Iceland, Sweden, Denmark, and other countries with high levels of unionization [0].
As much as I'd love to see repeated "general strikes" in the US, protesting the anti-democratic (small 'd') assaults on the rule of law, freedom of expression and due process, given the state of organized labor in the US, that's just not going to happen.
As such, we organize where we can and mobilize widely. As the pain comes, more and more folks will begin to see the problem. I hope it won't come too late.
Why do you and other devs say Insomnia is unmaintained?
There was a release in September, issues have been resolved within the last month, and multiple pull requests have been handled (merged and rejected) recently as well.
Maybe you're referring to issues specific to one platform? Thanks in advance.