A "static site" implies HTML rather than a JavaScript app.
With respect to JavaScript apps (React, Angular, etc.):
It's not clear these days, because the major search engines don't explicitly say whether they execute JavaScript apps (or whether they only do so for high-ranking JS apps/sites). But 10 years ago pre-rendering was a must-have to be indexed.
One theory on pre-rendering is that it reduces cost for the crawlers, since they don't need to spend 1-3s of CPU time rendering your site. And by reducing their costs, it may increase your chances of being indexed or ranked higher.
My hunch is that long-term, pre-rendering is not necessary for getting indexed. But it is typically still necessary for URL unfurls (link previews) for various social media and chat apps.
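Unfurl bots generally fetch the raw HTML and read meta tags without executing JS, which is why a prerendered snapshot still matters for link previews. A minimal sketch of detecting them by user agent (the bot list here is illustrative, not exhaustive, and not any particular service's API):

```typescript
// Common link-preview crawlers identify themselves in the User-Agent header.
const UNFURL_BOTS = /facebookexternalhit|twitterbot|slackbot|discordbot|linkedinbot/i;

function wantsPrerenderedHtml(userAgent: string): boolean {
  // serve these a prerendered snapshot; regular browsers get the JS app
  return UNFURL_BOTS.test(userAgent);
}
```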
I really don't see how improving security and reducing privacy are related. What would that even look like: generating fingerprints of installed programs, or scanning memory and disks for malware, reporting the results to the OS provider/device manufacturer, and mapping them to a user account or an IP address?
For me, "improved security" has been more synonymous with degraded performance.
Hard to do, I think. Even plain old "Linux, but for phones" is pretty difficult at this point because:
--most drivers and hardware specifications are proprietary, and probably secret under NDA
--most bootloaders are cryptographically locked and controlled by vendor
--the necessary wifi and cellular modem hardware is the same story, and is also a patent minefield even at the level of foundational design, as is mobile graphics hardware
--the modems are subject to regulatory requirements that they be secured from modification by the user/owner of the device
Secure in this context means secure from the user and device owner, which can arguably be for good reason -- think of an ATM kiosk, for example.
So no "Tails for phones" yet, but people are trying. Check out postmarketOS, LineageOS, Replicant, and Sailfish. Last I tried, things were still kind of a science project, like 90s-style Linux.
Adding UBports to the list: I have it on my PinePhone and can place and receive calls, send and receive SMS, and use GPS. Some features, like MMS, don't yet work on the PinePhone, but it's on the threshold of being daily-driver worthy.
Users on Pine64.org have reported their devices can run 14 hours on idle, and in my own experience the battery lasts all day with moderate usage. Getting it on the phone is as simple as getting Raspbian onto a Raspberry Pi: you just flash the image onto an SD card.
I feel as though the improvements in security lend themselves to improvements in privacy, and vice versa, in a rather linear relationship.
That being said, privacy is primarily a matter of user choice, in that it is the optional apps and programs that compromise privacy, even when the breaches are less than voluntary, as seen in the recent clipboard-skimming scandal. Making an OS more secure by limiting unauthorized access to information like the clipboard makes it both more secure and more private.
Tenderloin aside, I mainly looked in the Russian Hill / Marina / Lower Pac Heights areas for private-bedroom situations with 2-3 additional roommates. I ended up going for a place in the Presidio. For the most part, the opportunities I was seeing were subleases where a tenant had moved out hurriedly, which seems to be VERY common right now.
FWIW, all the private room + 2-3 roommate situations were in the range of $1200-$1500, vs. my previous 1-bedroom apartment (no roommates) at ~$2800.
I worked in the TL for three years and went on a few dates with a girl who lived there. The people in actual apartments are fine. People in SROs, not so much, and people on the street can be a problem. The needles, urine, and feces are real. The attendant at the lot I parked at said he thought he'd be dead within six weeks of taking the job, but in my time there, I think he was only assaulted once (it was more jarring than injuring). One coworker got slapped by one of the crazies, another assaulted. Oddly enough, my car did fine. I've street-parked it around Turk and Jones and picked it up at ~midnight with all my windows intact. As bad as the TL is, it isn't Bayview. I'm always amazed at the walk from Powell to Civic Center, Geary to Market, or west from Union Square; how quickly it changes is incredible. You also see scared, lost tourists a lot because of that.
There are some interesting restaurants there, and you're actually close to a lot of things. If you're living there in a secure building with a secure garage and can mostly avoid the street, it's tolerable. I wouldn't make a habit of casually walking around the block or stumbling home drunk. If you're paying attention and walking with intent, you're generally fine.
Not weird - though it's really just the same as downtown SF: 2-5 days a week you will encounter some combination of a screaming, manic individual immediately in your walking path, someone injecting themselves in your immediate vicinity, and nearly (or actually) stepping in dog/human poop. It is by no means the end of the world, but over time it wears you down and desensitizes you to the point of completely ignoring all homeless activity.
After being here for ~2 years, I would not recommend the TL, because the cost savings and walking commute to downtown are not enough given the alternatives.
If you need to save money, you need to move further away to realize real savings; there are too many hidden costs to living in the city.
If you have money, you'll want to either live within zero commute distance (i.e., near the Embarcadero or in literal downtown), or live in the Marina / Presidio / anywhere else and have a 15-minute drive in.
I'm going to shill my own product, https://www.prerender.cloud, because it has (mostly) solved this since headless Chrome was finally released in fall 2016.
If you can be patient enough to cache any state you might have in a special global var that gets serialized during the "server-side render", then it works with any JavaScript framework that can rehydrate/reattach to SSR'd HTML.
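A minimal sketch of that "special global var" pattern (the name `__PRELOADED_STATE__` and the `AppState` shape are my own illustration, not the service's API):

```typescript
interface AppState { cartItems: number }

// During the prerender, the app serializes its state into an inline script,
// so the state ends up inside the captured HTML snapshot:
function captureStateScript(state: AppState): string {
  return `<script>globalThis.__PRELOADED_STATE__=${JSON.stringify(state)}</script>`;
}

// On the client, the framework reads that global instead of refetching,
// so it can rehydrate/reattach to the SSR'd DOM without a flash:
function rehydrate(fallback: AppState): AppState {
  const g = globalThis as { __PRELOADED_STATE__?: AppState };
  return g.__PRELOADED_STATE__ ?? fallback;
}
```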
Order of magnitude = 10. Binary order of magnitude = twofold = 2x.
Mixing terminology as it suits oneself is considered deceitful and/or manipulative. Making excuses for someone who does this and doesn't restate/recant is also bad form.
Maybe he was talking about the rate? 2T seems off the charts if you're talking about normal events; it looks like 2008 made it jump from 1T to 2T in 2 months. Jumping 2T in 1 week is a little off the charts.
Linear search approaches fall down when you have a lot of data and you only want to select a very small portion of it.
A linear-search approach can get you to about 1GB/s or so per core with Rust.
A medium-ish size startup probably generates around 200GB/day of logs if they aren't very tight about their log volume. If you only want to search the last 24 hours, that is maybe OK; you can search that in ~10-20 seconds on a single machine.
However, this quickly breaks down when a) your log volume is a multiple of this and/or b) you want to search more than just the last day.
In which case you need some sort of index.
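The back-of-envelope behind those numbers, assuming ~1 GB/s of linear scan per core and an embarrassingly parallel scan across cores (the core count is my own assumption):

```typescript
// Seconds to linearly scan `gigabytes` of logs across `cores` cores.
function scanSeconds(gigabytes: number, cores: number, gbPerSecPerCore = 1): number {
  return gigabytes / (cores * gbPerSecPerCore);
}

// 200 GB (one day) over 16 cores = 12.5 s, in the "~10-20 seconds" ballpark.
// A month at the same volume (6000 GB) = 375 s, which is where an index wins.
```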
There are different approaches to indexing logs. The most common is full-text search indexing using an engine like Lucene. Elasticsearch (from the ELK stack) and Solr explicitly use Lucene. Splunk uses its own indexing format, but I'm pretty sure it's in a similar vein. Papertrail uses Clickhouse, which probably means they are using some sort of data-skipping indices and lots of linear searching.
Of these approaches, Clickhouse is probably the best way to go. It combines fast linear search with distributed storage and data-skipping indices that reduce the amount of data you need to scan (especially if you filter with PREWHERE clauses).
So why not go with Clickhouse? Clickhouse requires a schema. You can do various things like flattening your nested structured data into KV pairs (not a problem if you are already using a flat system), with a single column for all keys and another column for all values. This works, but it doesn't get great compression, makes filtering mostly ineffective, and you now have to operate a distributed database that requires ZooKeeper for coordination.
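What "flatten your nested structured data into KV" might look like: each leaf of an event becomes a (dotted-key, value) pair, the shape you'd need for a two-column keys/values layout. A sketch, not Clickhouse's API:

```typescript
// Recursively flatten a nested log event into [key, value] string pairs.
function flattenToKV(obj: unknown, prefix = ""): [string, string][] {
  if (obj !== null && typeof obj === "object" && !Array.isArray(obj)) {
    return Object.entries(obj as Record<string, unknown>).flatMap(([k, v]) =>
      flattenToKV(v, prefix ? `${prefix}.${k}` : k)
    );
  }
  // leaf: stringify the value so every column cell is a string
  return [[prefix, JSON.stringify(obj)]];
}
```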
The reason I am choosing to build my own is that logs have unique indexing requirements. First and foremost, the storage system needs to be fully schemaless.
Secondly, you need to retain non-word characters. The standard Lucene tokenizers in Elastic strip important punctuation that you might want to match on when searching log data.
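Concretely, here is the difference between splitting on non-word characters (roughly what a standard word tokenizer does) and a whitespace split that keeps paths, arrows, and query strings intact (a toy comparison, not either engine's actual analyzer):

```typescript
const line = "GET /api/v1/items?id=42 -> 500";

// word-character tokenization loses the "/", "?", "=", and "->"
const wordTokens = line.split(/\W+/).filter(Boolean);
// ["GET","api","v1","items","id","42","500"]

// a whitespace split preserves the punctuation you might grep for
const logTokens = line.split(/\s+/);
// ["GET","/api/v1/items?id=42","->","500"]
```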
Field dimensionality can be very high, so you need a system that won't buckle under metadata overhead when there are huge numbers of unique fields; the same goes for cardinality.
TL;DR: For big users you must have indices so you aren't scanning 20TB of logs to search a month. Current indices suck for logs. I'm writing a custom index that is hella fast for regex.
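One known technique for making an index regex-fast (the approach Google Code Search used, not necessarily what the author built): index trigrams per line, extract required literal trigrams from the query, intersect posting lists to get candidate lines, and only run the real match on those. A sketch with a literal substring standing in for the regex:

```typescript
// All 3-character substrings of a string.
function trigrams(s: string): Set<string> {
  const out = new Set<string>();
  for (let i = 0; i + 3 <= s.length; i++) out.add(s.slice(i, i + 3));
  return out;
}

// Map each trigram to the line numbers containing it (posting lists).
function buildIndex(lines: string[]): Map<string, number[]> {
  const idx = new Map<string, number[]>();
  lines.forEach((line, n) => {
    for (const t of trigrams(line)) {
      if (!idx.has(t)) idx.set(t, []);
      idx.get(t)!.push(n);
    }
  });
  return idx;
}

function search(lines: string[], idx: Map<string, number[]>, literal: string): number[] {
  // intersect the posting lists of the query's trigrams to get candidates
  let cand: Set<number> | null = null;
  for (const t of trigrams(literal)) {
    const posting = new Set(idx.get(t) ?? []);
    cand = cand === null ? posting : new Set([...cand].filter(n => posting.has(n)));
  }
  // confirm candidates with the actual match (a real system runs the regex here)
  return [...(cand ?? new Set<number>())]
    .filter(n => lines[n].includes(literal))
    .sort((a, b) => a - b);
}
```

The win is that the expensive match only runs on lines whose trigrams could possibly satisfy the query, instead of on all 20TB.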
I considered that, but it's harder than it sounds. Clickhouse is very strongly coupled to the idea of a schema, and it is also very coupled to using indices only for data skipping.
If I were to make the changes I want to Clickhouse, i.e. schemaless with full per-segment indexes, then it wouldn't be Clickhouse anymore.
disclosure: I operate https://headless-render-api.com