• 10 Posts
  • 480 Comments
Joined 2 years ago
Cake day: July 3rd, 2023

  • This is the only positive thing in the Trump administration so far. I was worried that they were going to de-fang the FTC and DOJ’s antitrust efforts and just start handing out slaps on the wrist again, but nope. It seems they’re going ahead with the Biden-era aggressiveness, which is awesome.

    Of course, they’ll never touch Musk’s companies, and it remains to be seen whether Microsoft/Google/etc. will be able to bribe Trump into a settlement. But at least it’s a glimmer of hope amid the otherwise relentless onslaught of depressing news.


  • When I was a kid, around 6 years old, I walked into my parents’ room and saw a little bottle of eye drops on the nightstand. I guess I was curious, because for some reason I decided to apply them to my eyes.

    The bottle was harder to squeeze than I expected, but when a tiny drop finally came out, I missed my eye and got it on the edge of my eyelid. It turns out it was superglue, not eye drops. Luckily, it was such a small amount that I was able to wash it off and pretend it never happened.

    I still don’t know why some superglue bottles look so similar to eye drop bottles.


  • “Real-time raytracing,” as advertised by hardware vendors and implemented in games today, is largely faked with AI denoising. Even the most powerful cards can’t fire anywhere near enough rays to fully raytrace a scene in real time, so instead they fire a very small number of rays and use denoising to clean up the noisy result. That’s why, if you look closely, reflections can look weird and blurry/smeary (especially on weaker cards): the majority of those pixels are predicted by machine learning, not actually sampled from the real scene data.

    Blender, Maya, and other film raytracers have always used some form of denoising (before machine-learning denoisers there were other algorithms), but in film it’s applied after casting thousands of rays per pixel. In a game today, scenes are rendered at around 1 ray per pixel, and with DLSS it’s effectively even less, since the internal render resolution is 2-4x smaller than the final image.

    As a technologist, I’ll readily admit these are cool applications of machine learning, but as a #gamer4lyfe, I hate how they look in actual games. Until GPUs can hit thousands (or maybe just hundreds) of rays per pixel in real time, I’ll continue to call it “fake AI bullshit” rather than “real-time raytracing.” (The sketch below puts rough numbers on that gap.)

    Also, here’s an informative video for anyone curious: https://youtu.be/6O2B9BZiZjQ
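
    For anyone who wants rough numbers on that gap: here’s a minimal toy sketch in Python (not real renderer or DLSS code; the “lighting” function and every number in it are made-up stand-ins) showing how Monte Carlo noise shrinks with rays per pixel, plus the raw ray-budget arithmetic.

      import math
      import random

      # Toy stand-in for tracing one ray: returns a noisy brightness sample.
      # A hypothetical pixel whose true brightness is 0.5; the spread models
      # the scene's lighting complexity. Not a real renderer.
      def sample_lighting() -> float:
          return random.random()

      # Monte Carlo pixel estimate: the average of N ray samples.
      def estimate_pixel(spp: int) -> float:
          return sum(sample_lighting() for _ in range(spp)) / spp

      random.seed(42)
      for spp in (1, 16, 1024):
          trials = [estimate_pixel(spp) for _ in range(2000)]
          mean = sum(trials) / len(trials)
          noise = math.sqrt(sum((t - mean) ** 2 for t in trials) / len(trials))
          # Noise shrinks like 1/sqrt(N), so 1024 spp is only ~32x cleaner
          # than 1 spp -- which is why even film renderers still denoise.
          print(f"{spp:>5} spp -> noise (stddev) ~ {noise:.4f}")

      # Back-of-the-envelope ray budget (assumed 4K @ 60 fps, numbers rounded):
      pixels = 3840 * 2160  # ~8.3M pixels per frame
      print(f"1 spp    @ 60 fps: ~{pixels * 60 / 1e6:.0f}M rays/s")
      print(f"1000 spp @ 60 fps: ~{pixels * 1000 * 60 / 1e9:.0f}B rays/s")

    The takeaway: going from game-level sampling to film-level sampling is roughly a thousandfold jump in ray budget, which is why current hardware leans so heavily on denoising instead.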