Armchair comment. I would LOVE to be a grad student again and try to pair it with ultrasound speaker arrays, for medical applications. Essentially a super HIFU (High-Intensity Focused Ultrasound) with live feedback. https://en.wikipedia.org/wiki/Focused_ultrasound
I'm doing my PhD in in-air ultrasound with phased arrays, and from talking to the medical folks at the conferences and labs we work with, it's so much harder in solids/liquids. The frequencies are significantly higher, think 1-10 MHz instead of ~40 kHz, so any normal electronics are out the window.
Hey, I saw your message a while back in a thread about continuous glucose monitors and feeling tired and fatigued, etc.
Mind contacting me? I'd love to chat. My email is in my profile.
One problem is that the speed of sound is not constant (or even approximately constant) across the bandwidth you're interested in when the sound wave is traveling through solids and liquids.
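Not from the parent comment, just a minimal numerical sketch of why that matters for focusing: a phased array's per-element delay law is computed from an assumed sound speed, so a frequency-dependent speed gives each spectral component a slightly different delay law and smears the focus. The array geometry, sound speeds, and carrier frequency below are all made up.

```python
import numpy as np

# Hypothetical illustration: focusing delays for a linear phased array depend
# on the assumed speed of sound, so if c varies across the pulse bandwidth
# (dispersion), each frequency component focuses slightly differently.

pitch = 0.3e-3                                        # element spacing [m] (assumed)
n_elem = 128
x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch    # element positions [m]
focus = np.array([0.0, 30e-3])                        # focal point: on-axis, 30 mm deep (assumed)

def focusing_delays(c):
    """Time delays that align arrivals from every element at the focus."""
    dist = np.hypot(x - focus[0], focus[1])           # element-to-focus distances
    return (dist.max() - dist) / c                    # [s], non-negative

# Sound speed at the two ends of the band (a made-up ~1% dispersion).
d_lo = focusing_delays(1480.0)                        # low-frequency components
d_hi = focusing_delays(1495.0)                        # high-frequency components

# Delay mismatch, expressed in periods of an assumed 5 MHz carrier, shows how
# quickly a "single-c" delay law stops being valid across the band.
f0 = 5e6
err = np.abs(d_lo - d_hi).max()
print("max delay error: %.1f ns (%.2f periods at 5 MHz)" % (err * 1e9, err * f0))
```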
Howdy! Clay makers here. Can you share more? Did you try Clay v1 or v0.2?
What image size and embeddings, and from what instrument?
We did try to relate OSM tags to Clay embeddings, but it didn't scale well. We haven't given up, but we are reconsidering the approach ( https://github.com/Clay-foundation/earth-text ). I think SatClip plus OSM is a better approach, or LLM embeddings mapped to Clay embeddings...
Hey hey! We tried Clay v1 with an embedding size of 768, following your tutorials. We then split the NAIP SF imagery into chips and indexed them. Afterwards, we ran image-to-image similarity search like in your explorer.
We tried searching for bridges, beaches, tennis courts, etc. It worked, but not well: the top of the ranking was filled with unrelated objects. We also found that the similarity scores were bunched too tightly together (values between 0.91 and 0.92, differing only in the fourth decimal place, across ~200k tiles), so the encoder made very little distinction between objects.
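For context, here is a minimal sketch of the kind of check we ran; the file name, array shape, and query index are placeholders, not our actual pipeline:

```python
import numpy as np

# Rank chips by cosine similarity to a query chip and check how spread out
# the scores are. All names below are hypothetical stand-ins.

emb = np.load("naip_sf_clay_v1_embeddings.npy")    # assumed shape: (n_chips, 768)
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)

query_idx = 1234                                   # say, a chip containing a bridge
scores = emb @ emb[query_idx]                      # cosine similarity to every chip

top = np.argsort(-scores)[:20]                     # top-20 most similar chips
print("top-20 indices:", top)

# If the encoder separates objects well, this spread should be noticeably
# larger than the ~0.01 range we observed.
print("score range: %.4f to %.4f" % (scores.min(), scores.max()))
```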
I believe that Clay can be used with additional fine-tuning for classification and segmentation, but standalone embeddings are pretty poor.
Check this: https://github.com/wangzhecheng/SkyScript. It is a dataset of OSM tags paired with satellite images. A CLIP model fine-tuned on it gives good embeddings for text-to-image search as well as image-to-image search.
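For anyone who wants to try it, a rough sketch of CLIP-style text-to-image retrieval, using a generic OpenCLIP checkpoint; swapping in SkyScript's fine-tuned weights is left as an assumption, since I haven't verified its loading API, and the chip file names are placeholders:

```python
import torch
import open_clip
from PIL import Image

# Embed a few text queries and image chips with a CLIP-style model and rank
# chips by similarity to each query.

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k")
tokenizer = open_clip.get_tokenizer("ViT-B-32")
model.eval()

paths = ["chip_001.png", "chip_002.png"]            # placeholder chip files
images = torch.stack([preprocess(Image.open(p)) for p in paths])
text = tokenizer(["a tennis court", "a bridge over water"])

with torch.no_grad():
    img_feat = model.encode_image(images)
    txt_feat = model.encode_text(text)
    img_feat = img_feat / img_feat.norm(dim=-1, keepdim=True)
    txt_feat = txt_feat / txt_feat.norm(dim=-1, keepdim=True)

# Rows: text queries, columns: image chips; higher = better match.
print(txt_feat @ img_feat.T)
```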
Can we help you?
We're building the equivalent for land, as a non-profit. It's basically a geospatial Transformer MAE model (plus DINO, plus Matryoshka, plus ...), among the largest and most heavily trained (roughly 35 trillion pixels). Most importantly, it's fully open source with an open license. I'd love to help you replace land masks with land embeddings; they should significantly help downscale the local effects (e.g. forest versus city) that, afaik, most weather forecasts simplify with static land cover classes at best. https://github.com/Clay-foundation/model
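To make the "land embeddings instead of land cover classes" idea concrete, here's a toy, entirely synthetic sketch (random stand-in features, a made-up local effect, and scikit-learn ridge regression; none of it is a real forecast pipeline):

```python
import numpy as np
from sklearn.linear_model import Ridge

# Predict a local correction to a coarse forecast from either (a) a one-hot
# static land cover class or (b) a dense land embedding vector.

rng = np.random.default_rng(0)
n, emb_dim, n_classes = 5000, 64, 8

land_class = rng.integers(0, n_classes, size=n)        # static land cover labels
one_hot = np.eye(n_classes)[land_class]
embedding = rng.normal(size=(n, emb_dim))              # stand-in for land embeddings

# Toy "true" local effect depends on fine-grained structure the class misses.
local_effect = embedding[:, :5].sum(axis=1) + 0.3 * land_class

for name, X in [("one-hot land cover", one_hot), ("land embeddings", embedding)]:
    model = Ridge().fit(X[:4000], local_effect[:4000])
    r2 = model.score(X[4000:], local_effect[4000:])
    print(f"{name}: held-out R^2 = {r2:.2f}")
```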
You'll need to subscribe to Alexa Weather Plus, for only $9.99/month.
Now seriously: yes, hyperlocal short-term weather forecasting should be a commodity, maybe even a public utility?
I like AccuWeather's MinuteCast, which is a higher-resolution short-term forecast (+60 min) that is not just pulling the forecast from the weather station nearest to you.
Windy(.com) premium also has a great hybrid weather radar + forecast view, released recently, which I find has been very effective at predicting rain at a specific location on the map vs. "nearby". With smaller weather patterns it is entirely possible for it to rain a few blocks away but not at your location. An 11 km resolution weather forecast (as referenced above) will not be able to capture this nuance.
In case you're curious: computer scientists have been trying to simulate and predict the weather for over half a century, and it has led to some really awesome math/CS discoveries.
If you've ever heard of the Lorenz/butterfly effect or strange attractors: those chaotic systems were discovered because of a discrepancy between two runs of the same weather simulation. One run continued from the machine's full-precision state, while the other was restarted from printed intermediate results that had been rounded to fewer decimal places; that tiny rounding difference made the two simulations diverge hugely.
Lorenz was trying to simulate weather by subdividing the atmosphere into tons and tons of cubes. Really interesting reading/video watching tbh.
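Not part of the story above, just a small sketch of the effect it describes: integrate the classic Lorenz-63 system from a full-precision state and from the same state rounded to three decimal places (a stand-in for restarting from a printout), and watch the separation grow.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

full = np.array([1.001234, 1.005678, 1.009876])   # arbitrary "full-precision" state
rounded = np.round(full, 3)                        # the "printout" restart

for step in range(3001):
    if step % 500 == 0:
        sep = np.linalg.norm(full - rounded)
        print(f"t={step * 0.01:5.1f}  separation={sep:.6f}")
    full = lorenz_step(full)
    rounded = lorenz_step(rounded)
```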
The Planetary Computer from Microsoft has an explore tool, which includes Sentinel-1 at any time and location, rendered on-the-fly. Free open access. No account required.
I fondly remember how a small DC startup, subsubsubcontracted to do the front landing page, was able to ensure that people at least landed on the healthcare.gov page and not on a 500 error page.
All thanks to a single server (with one backup) using Jekyll, if I remember the story correctly.
As a civic technologist currently contracting, I feel like more Americans should know that procurement rules all but prevent federal offices from doing their own software development and that most of it is, as you wrote, subsubsubcontracted.
And that, generally, the people who've won the contract to build the website or application are not the same people who've won the contract to manage the databases, which is a totally different someone from those who've won the contract to run the DNS. Oh, and the DNS contractors and the database contractors definitely bid on the website contract but didn't win, so they want the website contractor to fail so they can get a second chance at bidding on the contract when you do.
Don't forget the database contractor and frontend contractor aren't allowed to directly talk to each other but have to make contact through their individual contracting officers.
And the leads from both of the other teams are on the change control board, eagerly waiting for your change to come before them so they can find the least-obvious-but-still-patently-obvious excuse to make your request look unprepared to the govvies who pay them in lieu of having any expertise of their own, so "new table" requests can be shot down with lowbrow questions like "But have we thought about the security implications?!?"
Pissed-off contracting officers, at the least; losing their trust will ensure you get micromanaged to the letter on everything in the hundreds of pages of contracts you signed. They can make your life hell, and can probably block the contract renewal at the least, or force an instant recompete at the worst.
Should all gov websites move to the recreation.gov model, where the contractor is incentivized disproportionately on a percentage basis? It seems that customers get shafted no matter what; the latter at least creates a better product.
What does it have to do with procurement rules? I was under the impression that it was just that the federal gov't won't pay salaries that are remotely competitive, so the only way they can get work done by competent people is by hiring a contractor who then pays market wages to a consultant.
Eh, I've worked with a whole lot of incredibly intelligent, talented people in the federal government. If there's a problem they have with wages, it's more that they can't retain the best people any more, but it wasn't so much of a problem 20 years ago before Silicon Valley salaries skyrocketed so quickly. The security of the civil service and a guaranteed pension used to make up for the slightly lower pay, but it doesn't make up for 1/3 the pay. Given it takes a literal act of Congress to change the pay bands and Congress won't even fund the existing budget they already passed, they simply can't adjust at the speed of industry.
But they probably will eventually add a compensating bonus for critical work the way they had to do with medical doctors. Physicians that work directly for DoD make an enormous bonus compared to others in the same pay grade but different career field because otherwise they wouldn't be able to hire anyone. They react slowly but eventually react.
What you're talking about, though, is a problem with procurement procedures. It's far easier to get that through Congress than any change in operational budget that involves increased pay for federal employees, even if the outcome is otherwise identical. And it frankly makes sense to a large extent. Even if they have to pay way more for a private workforce, they only have to justify it for a several-year project. Hire the same number of civil servants and you're committing to employing them for the next 30 years. They need to know they'll consistently have work for them to do. They can't just institute mass layoffs and hiring sprees on the other side the way industry can.
I mean, ok? 20 years ago is a long time. As you say, pay is now like ~1/3 that of industry, and the pension doesn't make up for it.
> What you're talking about, though, is a problem with procurement procedures. It's far easier to get through Congress than any change in operational budget that involves increased pay for federal employees
Wait, is raising the pay of federal employees considered "procurement"? If not, aren't you agreeing that procurement of software consulting services is an end around the main issue, which is that wages of federal employees are too low (a non-procurement issue)?
It is notoriously extraordinarily difficult to fire an underperforming federal employee. It is possible, but generally, you are committing to keeping this person around for quite some time.
The initial version of healthcare.gov, before the exchanges launched, was a small informational site built by Development Seed, a small DC-based web shop. It was built in what would today be called "static site generation" style and therefore could be hosted on small hardware. At the time it was considered pretty advanced tactics.
I think this may be what you are thinking of. To my knowledge that site was entirely replaced when the federal exchange launched, and there was never a serious challenge in serving the landing page, since it did not take in or process any personal data. It was only once you started trying to find a plan that everything ground to a halt.
Development Seed also developed another project you may have heard of: Mapbox. (To which they eventually pivoted the entire company.)
Down the river, right after DC, is the Naval Research Laboratory, across from Alexandria. I worked there, and some crazy folks from Alexandria would sometimes cross the river in canoes to the NRL pier... That got terminated after 9/11, so they had to bike or drive all the way up around DC and down through Anacostia Air Force Base. A much longer commute.
When an aunt worked there, in the 1950s, there was a boat from National Airport or thereabouts for NRL employees from Virginia. The boat wasn't very nice. It was an open boat, and in rainy weather, a sheet of plywood served to shelter the commuters. They were always glad when the boat was out of service and an Air Force boat replaced it.
You can use this link to see the appalling scope of this flood.
The layers you can toggle are in the bottom right: 1) latest radar, 2) last year's radar for reference, 3) nightlights as a proxy for population.
I selected radar because it's really good at detecting standing water on the ground (shown in blue).
The part that is confusing is the claim of highest resolution for (1) the full disk and (2) the outer atmosphere.

(1) "Full disk" is clear to understand: the higher the resolution, the work to also cover the full disk grows roughly with its square, and you have to be fast, since the Sun rotates differentially and evolves at high cadence.

(2) "Outer atmosphere" is also tricky, because only a few wavelengths see the outer atmosphere. The vast majority of the light comes from the "surface", the photosphere (hence the name). At the surface, the highest resolution is roughly 0.05 arcsec, or about 50 km/pixel. But to see the outer layers, you have to go to emission from elements like iron that only emit when highly ionized, at extremely high temperatures (those are the special characteristics of the Sun's outer atmosphere... yes, it's way hotter than the surface, just WAY less dense). Those emission lines are in the ultraviolet, around 17 nanometers, like the caption says. That's a wavelength roughly 50 times shorter. Angular resolution is proportional to wavelength over aperture (1.22 * wavelength / diameter), and for these instruments it comes out on the order of 1000 km/pixel (though linear resolution makes less sense when the atmosphere has such a 3D shape; it's better to say roughly 1 arcsec of resolution).
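A quick numerical sanity check on those figures; the 4 m visible-light aperture below is just an assumed example, not a specific telescope:

```python
import numpy as np

# Convert the angular resolutions quoted above into linear sizes on the Sun,
# as seen from ~1 AU, and evaluate the Rayleigh diffraction limit once.

AU_KM = 1.496e8                      # mean Earth-Sun distance [km]
ARCSEC_RAD = np.pi / (180 * 3600)    # one arcsecond in radians

def arcsec_to_km_on_sun(theta_arcsec):
    """Small-angle approximation: linear size = angle [rad] * distance."""
    return theta_arcsec * ARCSEC_RAD * AU_KM

for theta in (0.05, 1.0):
    print(f"{theta:>4} arcsec  ->  {arcsec_to_km_on_sun(theta):6.0f} km on the Sun")

# Rayleigh limit, theta = 1.22 * lambda / D, for a hypothetical 4 m aperture
# observing visible light at 500 nm (an assumption for illustration only).
theta = 1.22 * 500e-9 / 4.0 / ARCSEC_RAD
print(f"1.22*lambda/D at 500 nm, D = 4 m: {theta:.3f} arcsec")
```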
I might be too biased (I'm a solar physicist), but the explanation above makes the image way cooler and they should have included it: the most detailed image of the Sun's metal corona :D
Microsoft "AI for Earth" | 3 roles in GIS+ML+ Sustainability | ONSITE,REMOTE,VISA all ok
We are building out our commitment to the "Planetary Computer" [1]. We are looking for a principal architect (the most senior position), a senior datasets/ETL engineer, and an applications engineer [2]. Candidates for all three roles should sit at the intersection of GIS/cloud/OSS/ML/sustainability; asymmetric strengths across these skills are ok.