Hacker News | new | past | comments | ask | show | jobs | submit | wkyleg's comments

FREELANCER | USA/REMOTE | WORLDWIDE

ABOUT ME Full-stack engineer with 9+ years of experience shipping production software and leading technical teams. I specialize in turning complex ideas into real-world products, especially at the intersection of emerging technology and pragmatic business needs.

My background spans startups, established products, and research. I've contributed to DARPA and NASA-affiliated research and helped launch multiple B2B/B2C/B2G products. My technical work is grounded by a business foundation (Finance B.S.) and strong product intuition.

I’m currently looking for freelance or contract roles where I can:

- Build MVPs or iterate on high-growth products

- Lead architecture and implementation from 0→1

- Join agile teams pushing the frontier in AI, biotech, data, or web3

- Consult on strategy + technical execution for emerging tech businesses

RECENT FOCUS

- LLM applications (chatbots, agent workflows, embeddings, prompt engineering)

- Blockchain tooling (Ethereum, Foundry, Viem, building protocols)

- Custom data dashboards & visualizations (GraphQL, D3, SQL)

- Real-time systems, graph databases, and event-driven architectures, general "big data" stuff

- Security-conscious engineering: OSINT, encryption, privacy tools

KEY SKILLS React, Next.js, Vue, TypeScript, Python, Java, GraphQL, Node, SQL, Linux, Rust (learning), Solidity, Viem, Foundry, Cryptography, Prompt engineering

CONTACT GitHub: github.com/wkyleg Email: wkyleg_eth [at] pm [dot] me ENS: wkyleg CV & LinkedIn: available upon request


that should be titled "SEEKING WORK"


In my experience Pi hole is a very worthwhile investment. People who used my internet when I had one would remark how much faster it was. Everything in general seems faster, even things that you wouldn't think of. I typically use Brave for browsing which has good ad blocking capabilities, but this adds a whole additional layer.

The only reason I don't use one now is that I travel a lot more so it's irrelevant, and I have to work enough on tools with Google/Vercel/other analytics that it is just very inconvenient.

Regarding smart TVs, I have found that it's better to just use an Apple TV or Kodi box and never connect the TV itself to the internet. That said, I gave my TV away because I never used it, so this might not be up to date. A Pi-hole will block ads on smart TVs though.
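For context on how this blocking works: a Pi-hole is essentially a DNS server that answers blocked hostnames with an unroutable address, so the ad request silently fails before it ever leaves the house. A tiny sketch of that idea (the domain names and the upstream address below are made-up placeholders, not a real blocklist):

```javascript
// Minimal sketch of DNS sinkholing, the mechanism a Pi-hole uses.
// Domains here are invented examples, not an actual blocklist.
const blocklist = new Set([
  "ads.example.com",
  "telemetry.example.net",
]);

// A real resolver would forward unknown names to an upstream DNS
// server; here we just return a stand-in address for anything
// that isn't blocked.
function resolve(name) {
  if (blocklist.has(name.toLowerCase())) {
    return "0.0.0.0"; // unroutable: the ad/telemetry request goes nowhere
  }
  return "93.184.216.34"; // placeholder for a real upstream answer
}
```

This is also why it helps device-wide: every app and smart TV on the network that trusts the router's DNS gets filtered, not just one browser.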


I used to love pihole, but it seems like it's more trouble than it's worth now. Advertisers have wised up and will use the same subdomain for both content and ads. I've also had issues with normal website functionality being broken due to pihole which isn't fun for my wife. It seems mostly useful for blocking background traffic on smart devices, not so much for ads.


Wouldn’t a smart tv do something ... smarter than just using the default dns given to it by the network?

I’m not up to speed on this stuff but I thought pihole only blocked the simplest stuff from devices that play nice?


> Wouldn’t a smart tv do something ... smarter than just using the default dns given to it by the network?

It could certainly try... but usually you would block that in your firewall. Hardcoded DNS servers or fixed server IP addresses are tricky for the vendor: if they ever need to change them, they can't, because they'd have to update every device in the field (which they can't reach once it sits behind a firewall).

It could try to use things like Google's DNS server, but that is easily blocked in your router.

Not a lot that could be done except trusting your (internal) DNS server...


Why should the programmers of the TV's OS look for edge cases, and do you think the TV makers would give them budget for that? For 90+% of users the standard config of trusting the DHCP server works fine, and the Pi-hole users will probably not give them money anyway, and will be dedicated to defeating their workarounds...


I've been worried about companies that make software like this (applications with embedded telemetry or advertisements) starting to do their own DoH-style lookups.

I don't KNOW of any doing it but I can't imagine it'd be too hard for them to do.


I had an Apple TV connected to a TCL Roku TV and the TV was analyzing video frames from the AppleTV to popup ads suggesting to watch the same content on other streaming services.


Interesting that they don't know exactly how they are perceiving magnetism. I suppose it could be an internal brain mechanism, but if that were to be able to operate without direct outside sensory input, there might be all kinds of other potential latent abilities in animals.

More likely, this might be the same mechanism that exists in birds, which can effectively see the Earth's magnetic field with their eyes, although an internal magnetic detection system in animals would be much more interesting. See https://royalsocietypublishing.org/doi/10.1098/rsif.2019.029.... Birds essentially perceive the Earth's magnetic field through a chemical process in the receptors for red and green light in their eyes, and are believed to have special cells in their beak, brain stem, and possibly their vestibular system as well. It can be disrupted fairly easily by magnetism or by different wavelengths of light.

There are obvious other strange implications of this as well, although I'm not sure how much real evidence there is to support them. For instance, there are many concerns about the effects of EMF exposure, as well as that EMF can affect plant life. There is research on magnetic and electric brain stimulation as well. At the most out-there level, there is the research into remote viewing and things like telepathy. So, to whatever extent this actually exists, there are interesting implications for the phenomenological experience of other forms of animal life.

Regardless, it is interesting to see research that is actually showing new and intriguing things regarding different forms of perception (I guess this is arguably ESP, or at least a new sense) that are not complete crank nonsense. I've always felt like my sense of direction was a sort of 6th sense; I guess it really could be.


I learned C++ and enjoyed it but never went too deep. I always enjoyed C more. There are also so many S-tier codebases to read to learn C better, like the original Dune game engine or Unix utilities.

Having dabbled a bit with Rust recently, I can't see any strong reasons to use C++. The combination of a strong, functional-programming-inspired type system with performance seems unbeatable. Almost a perfect language.

I'm sure there must be some legacy reasons to use C++ though. Maybe game engines, embedded programming, some other kind of legacy tie-in?


I switch between C++ and Rust at work. Honestly, with modern C++20/23 a lot of the pain points are being fixed, mostly by just copying Rust. If you start a new C++ codebase it's possible to do it reasonably cleanly. But at this point I don't understand why you would make new software in C++.

Here are a bunch of C++ annoyances I can think of:

- Library/package management is nonstandard.

- Headers are code duplication.

- The standard library changes depending on implementation and is almost unreadable.

- Weird behavior in the standard, and people relying on undefined behavior without realizing it.

- Use-before-assignment issues.

- Subtle ownership issues and memory leaks due to bad refcounts.

- Needing to mark everything const instead of having it by default.

- Checking for exceptions in everything you call to ensure your code is noexcept.

- Unreadable errors when working in the standard library.

- Heavily OO code is basically tech debt.

- The LSP is not structural like in Rust, where the definition is found by checking the AST; navigation and codebase discovery is slow in C++ because of the poorer LSP.

Rust has first-class explicit no_std support, whereas in C++ you need compiler-specific flags to disable the standard library, and it's hard to make sure you did it right. So the embedded reasoning is silly to me.

Rust also has game engines like Bevy, but they are new. You could hook into Godot scripting with Rust if you want. Low-level audio is just as easy in Rust, and you can do it cross-platform with a single crate.

In general I think it's just legacy code and hesitancy to change.


Huge amount of legacy across many dimensions, not just apps written in it [1], like number of users, published and available knowledge / resources (books, courses, blogs, articles, videos, software libraries, etc.), high compatibility with another huge language (C), etc.

This is just general software-industry knowledge for those who have been in the field more than a few years. I am not even a proper beginner in C++, because I have never used it much, although I bought, read, and to some extent understood some classic C++ books, including ones by the language's creator (Bjarne Stroustrup [2]), Scott Meyers [3], and a few others. I did have many years of experience using C in production, though, including on a successful commercial product.

[1] https://www.stroustrup.com/applications.html

[2] https://www.stroustrup.com and https://en.m.wikipedia.org/wiki/Bjarne_Stroustrup

[3] https://en.m.wikipedia.org/wiki/Scott_Meyers


C++ makes sense when you need low latency but don't really care about correctness.

So I believe gamedev will stay on C++ the longest

Game crashes with segfaults for 0.001% of users? So what?


HFT perhaps?


I can see that.

On most benchmarks I've seen, Rust is comparable in general speed.

Maybe if you're at the level where you're essentially writing portable assembly and are okay with the lack of safety. You need to know exactly what is happening within the CPU, maybe on custom hardware.

I bet some defense applications would be in this category too, although for my own sense of self preservation I would prefer the Rust type system.


Rust would probably be a good fit for HFT, but as the field is so dominated by C++, it's hard for another language to make inroads. Java managed to some extent.

I would expect a lot of unsafe though.


Ecosystem effects are definitely important to C++'s dominance in HFT, but it's also a domain where a lot of the guarantees Rust offers just aren't all that relevant. From a security perspective, most code runs in sandboxes accessible only to a select few whitelisted IPs. True, you don't want a segfault while you're in the middle of sending an order to an exchange, but most of those are pretty easily smoked out in simulation testing.


Yes, if you are receiving malicious data from the exchange, getting p0wnd is the least of your concerns.


Something like AirTags would be a great use case for an open-source DePIN project. Ideally, zero-knowledge proofs would be used to preserve privacy.


The counterfactuals of his ideas are very interesting.

Chilean socialism didn't work (even ignoring the coup, it couldn't actually run the economy), but the reasons why it failed, and how in other forms it could have worked, bear consideration.

In short, it failed for the same reasons central planning tends to, considering modern understandings of complexity theory and ideas suggested in books such as Seeing Like A State. Just having a dashboard and greater access to information is still subject to the same forms of hubris as is general central planning, even if these shortcomings are better anticipated.

Yet, many of the innovations in this project bear similarity to how large enterprises CAN run well, such as with ERP and business analytics in the private sector, and modern intelligence and command-and-control systems in the military.

So in all, his ideas didn't completely work, but in the ways they did work, they were very early.


Winners write the history books. There's whether or not a system fails, and there's who it fails for.

Because we can all see a heck of a lot of fail in today's system, but those failing or being failed don't tend to get much of a platform to write about it.


Amazing. Younger people (and people new to any field) often think of interesting approaches without the blessing/curse of "best practices." The process for creating the game was a very good and very logical way to learn a new field.

I would recommend a couple of small things for the code. Variable names are usually ALL_CAPS if they never change (for instance, const PI = 3.142) and camelCaseForOtherVariables; snake_case_variables aren't really used in JavaScript, but aren't technically wrong. Also, it's usually good to put variables into nested data structures with hashmaps instead of comparing based on array index. This is in "the real world" though; in academic computer science, algorithms based on position in lists are more common.
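To make those two suggestions concrete, here's a small sketch (names like MAX_LIVES and enemies are invented for illustration, not taken from the game's actual code):

```javascript
// Never reassigned, so ALL_CAPS by convention.
const MAX_LIVES = 3;
// Ordinary variable, so camelCase.
let playerScore = 0;

// Instead of parallel arrays matched up by index...
//   const names = ["slime", "bat"]; const speeds = [1, 3];
// ...group related fields into one structure keyed by name:
const enemies = {
  slime: { speed: 1, hitPoints: 2 },
  bat:   { speed: 3, hitPoints: 1 },
};

// Looking things up by name makes the intent obvious and can't
// go stale if one array is reordered and the other isn't.
function damage(enemyName, amount) {
  enemies[enemyName].hitPoints -= amount;
  return enemies[enemyName].hitPoints;
}
```

The payoff is mostly in maintenance: adding a new field (say, a sprite name) means touching one object, not threading another parallel array through the whole codebase.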

If you want what we call a "code review," a good way would be to feed your source code into an LLM and have it give feedback and recommend improvements. Most people like Claude best for dealing with code nowadays.

I would also recommend putting your code on GitHub so that people can check it out.

Very impressive!


Appreciate the feedback on the code style & variables. I raised this previously but perhaps external feedback will be what he needs to get him motivated to clean it up :)


Agree 100%. Since dad helped, I was expecting to see sprite sheets, or some JS classes for some OOP, etc. I was pleasantly surprised to see how "simple" the approach was for such fun output with a decent amount of variety.


He doesn't understand what a sprite is, or really the motivation behind sprite sheets. To him they're just animated images he made in a tool. Internally he switches the "costumes" of characters (terminology he got from Scratch I believe).


That's so great. So many software engineers (myself included) tend to overcomplicate things.


This is how cool new ideas arrive each generation


Yes, in terms of market share.

But the key difference is that it's leagues better than other browser engines on quality. From the perspective of competition this isn't great, but the network effects are hard to ignore. Firefox and Safari (webkit) just tend not to work as well.

It's very different in terms of quality though. Internet Explorer was a terrible browser and often lagging in implementing standards. The better comparison would be Safari now, which often completely breaks many sites on mobile for me. It also doesn't eliminate a lot of newer CSS animations properly.

This is really very unfortunate because it's good to have competition in browser implementations. Everything is Chrome under the hood now except for Safari and Firefox.


Internet Explorer was an attempt to monopolize and control the early internet. They intentionally left standards unimplemented or just implemented their own insane version of them in order to trap people into the platform.

It's no wonder the product got them taken to court over antitrust violations.


Nobody really uses multithreading for Node. I mean, I'm sure some projects do (maybe pnpm?), but the threading support isn't great and the event loop performs well.


Almost everyone running Node in a machine with multiple cores is using multithreading.

Node is multithreaded by default. I believe the default setting is using 4 threads. Most of Node is written in C++.

The JS code written by end users is single threaded (most of it at least) but IO etc is all executed with libuv.

https://docs.libuv.org/en/v1.x/threadpool.html#threadpool


Agreed, I realize it would need to be effectively another language, or at least a very different implementation.

This isn't too far off from what new projects like Deno and Bun are doing, though, apart from also needing to spread the event loop implementation horizontally.


Can you point me to what you’re talking about in Deno? That’s really interesting.


Deno doesn't implement this differently; it's just an alternative server-side runtime. It has some better defaults compared to Node though.


Oh I thought you meant they were working on horizontal scaling. I think this is very far off from what they are doing. It’s still Javascript, and you can take for granted that it follows the Ecmascript spec even if its runtime is different.

